

July-August 2011


Crossing Boundaries: Postsecondary Readiness Assessment in Florida

No one, least of all educators, would dispute that scholarly collaboration by all players in the educational pipeline is key to the seamless, successful progression of students from pre-K to university study. In reality, mutually defining the learning expectations common to K-12 and higher education often devolves into finger-pointing and not-so-veiled criticism. Fueled by competition for scarce public resources, disparate challenges, and seemingly disconnected goals, public schools and the colleges and universities they feed students into frequently appear more adversarial than collegial.

These challenges notwithstanding, during the past two years, over 100 faculty from five school districts, two non-public postsecondary institutions, nine state universities, and 24 Florida state colleges worked together without special compensation on a project to improve college readiness.

Florida's path to a new, customized placement test is the result of a statewide movement toward a common definition of college and career readiness that pinpoints specifically what students need to know and the competencies they need to have to succeed in their first college-level English and math classes. The process, which initially relied upon faculty's subject-matter expertise, soon became a classic example of success nurtured by professional respect, dedication to a common goal, and ongoing communication between college faculty and their K-12 counterparts.

Section 1008.30 of the Florida statutes, first implemented in 1992, required the State Board of Education to develop and implement a common placement-testing program to determine the readiness of students to enter a degree program at any public college or university. But it wasn't until October 2010 that the Florida Department of Education's Division of Florida Colleges rolled out one of the first customized college placement tests developed from a blueprint created by a team of K-12 and college faculty.

The process that eventuated in that test began in April 2008, when, on the recommendation of the Go Higher Florida! task force and with the support of Commissioner Eric Smith, Florida joined Achieve's American Diploma Project network. In September 2008, as an initial step in aligning high school exit and college entry expectations, the Division of Florida Colleges organized a workshop for English/language arts and mathematics K-20 faculty from across the state.

The participants divided themselves into groups by subject area and reviewed the American Diploma Project (ADP) benchmarks, identifying from among them the competencies they deemed critical to college readiness in entry-level math and English and locating the gaps between academic preparation in the schools and postsecondary expectations. This cross-sector endeavor resulted in Florida's Postsecondary Readiness Competencies (PRCs), which were subsequently aligned with the K-12 Sunshine State Standards through a joint effort between the department and Achieve. The first steps to a less leaky pipeline were in place.

The Postsecondary Readiness Competencies then drove the specifications for a new statewide college placement test. Members of the original faculty pool that identified the PRCs reconvened in April 2009 to select or develop exemplar test items for each of the English/language arts and math competencies.

Building upon their previous experience, these faculty experts chose items that covered the breadth and depth of each competency, ensuring that the items chosen were representative of the knowledge that incoming college freshmen needed to possess in order to be successful in entry-level college courses without the need for remediation. The PRCs and the exemplar items were included in information provided to potential vendors, who were invited to submit sample test questions based upon the work of the faculty.

All submitted proposals were reviewed by three separate review committees: a content review team, a technical review team, and a negotiation team. Once again, faculty from the original working group that had identified the critical competencies and developed the exemplars were called upon to staff the content review team. They were charged with reviewing and rating the vendors' sample questions for item alignment and item quality.

Ultimately, when ratings from all three review teams were combined, it was clear that the best-aligned, highest-quality items proved the tipping point for the winning vendor. In January 2010, another step in the collaborative test development process was completed with the award of a test-development contract to McCann Associates, a seasoned “behind-the-scenes” assessment provider experienced in working with both the College Board and ACT.

But then, just as the creation of a new customized college placement test fully aligned to faculty-identified competencies seemed on the near horizon, the Florida Department of Education announced its support of and intention to adopt the newly released national Common Core Standards. [Editor's note: see Kati Haycock's article on the Common Core Standards in the July/August 2010 issue of Change.] Not to be deterred from its goal of a meaningfully aligned assessment, the Division of Florida Colleges' staff called the seasoned faculty teams once more to action.

This time the cross-sector faculty aligned the previously identified ADP benchmarks with the draft Common Core College and Career Readiness standards, as well as the Basic Skills Exit Test (the test administered to students at the end of the highest level of developmental education). The work resulted in a revised set of PRCs, which the vendor then used to begin actual test-item development.

In June 2010, Florida colleges administered over 10,000 pilot exams populated with the newly created test items. The data from the pilot was then used to build the Postsecondary Education Readiness Test (P.E.R.T.) item bank. The faculty team came to Tallahassee again in August 2010 to review the entire item bank, analyze each question for alignment with the competencies and overall quality, and make adjustments as needed.

Following the August review, feedback made it clear that there were not enough operational items approved by the faculty to launch a computer-adaptive test immediately. Underscoring the importance of the collaborative process, as vice chancellor I put a halt to the launch of a computer-adaptive test until the item bank was adequate. With the concurrence of the vendor, the Department authorized the launch of a linear test that included additional field-test items. Once the operational item pool is large enough to support it, the vendor will make the transition to a computer-adaptive test.

On October 25, 2010, the P.E.R.T. went live. Each college in the system is now in the process of transitioning to it as their primary placement assessment, and all interested faculty are being given the opportunity to take the test in a proctored setting during the initial rollout phase. The K-20 faculty who worked diligently with the Division of Florida Colleges over the past two years continue to serve as content experts. They are currently reviewing all new test items before they are released for use in the assessment, as well as working on a new bank of items for the soon-to-be launched diagnostic component of the P.E.R.T. The diagnostic will be used for students scoring below college-ready on the basic assessment in order to identify specific academic deficiencies. With this information, faculty members can customize instruction, and course curricula can be modularized for time and cost-efficiencies.

Some of the same faculty members are also working on using the PRCs as the basis for innovative redesign of the developmental education curriculum—but that's another story for another time.

The Postsecondary Readiness Competencies and Test Items: Examples

Postsecondary Readiness Competency: Writing

Demonstrate control of standard English through the use of grammar, punctuation, capitalization, and spelling.

DOE Exemplar Item

DIRECTIONS: Choose the sentence that is correctly punctuated.

  1. The students grades weren't posted, therefore, no one knew who made the highest score.

  2. The students' grades werent posted, therefore, no one knew who made the highest score.

  3. The students' grades weren't posted; therefore, no one knew who made the highest score.

  4. The students' grades weren't posted, therefore no one knew who made the highest score.

Postsecondary Readiness Competency: Reading

Use context to determine the meaning of unfamiliar words.

DOE Exemplar Item

DIRECTIONS: Read the following sentence and answer the question.

The warmth of the sun raised the water temperature enough to awaken the rainbow and cutthroat trout that slumbered, and the caddis flies were dancing their erratic dance, here and there, over the water.

What does the word erratic mean as used in the sentence above?

  1. aimless

  2. graceful

  3. leisurely

  4. swift

Postsecondary Readiness Competency: Math

Add, subtract, and multiply polynomials.

DOE Exemplar Item


(5x² - 6x - 3) - (2x² - 2x + 1)

  1. 3x² - 8x - 2

  2. 3x² - 4x - 4

  3. 3x² - 4x - 2

  4. 3x⁴ - 4x² - 4
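The arithmetic behind the math exemplar can be checked coefficient by coefficient. A minimal Python sketch (illustrative only, not part of the P.E.R.T. item bank or scoring):

```python
# Represent each polynomial by its coefficients, highest degree first:
# 5x^2 - 6x - 3  ->  [5, -6, -3]
# 2x^2 - 2x + 1  ->  [2, -2, 1]

def subtract_polynomials(p, q):
    """Subtract q from p, term by term (equal-length coefficient lists)."""
    return [a - b for a, b in zip(p, q)]

p = [5, -6, -3]   # 5x^2 - 6x - 3
q = [2, -2, 1]    # 2x^2 - 2x + 1

result = subtract_polynomials(p, q)
print(result)  # [3, -4, -4], i.e., 3x^2 - 4x - 4 (answer choice 2)
```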

Inasmuch as other states have asked about the process Florida used to develop a customized postsecondary placement assessment, John Hughes, associate vice chancellor for evaluation, and Julie Alexander, associate vice chancellor for learning initiatives—both with the Division of Florida Colleges—have developed the following step-by-step guide to assist others in understanding and possibly replicating our approach.

  1. Develop a core team of cross-sector state-level administrators to coordinate the activities.

  2. Identify a cross-sector team of faculty who are willing and able to participate in an ongoing effort for two or more years.

  3. Have the faculty identify competencies for courses from developmental education through the first level of college-credit courses. The American Diploma Project, the Common Core Standards, and any previously developed state standards may be used as reference points, but faculty should not be constrained by them. Allow faculty to develop competencies that match their own expectations of entering students.

  4. Have faculty rank the competencies in order of importance. For example, in mathematics, quadratic equations may be more important than rational expressions, although both skills are expected. This is necessary because with the limited number of items on a test, not every competency can be fully measured.

  5. The state-level core team should determine the number of items on the test and whether a linear or computer-adaptive test is preferable. A computer-adaptive test can be shorter but requires more total items in the test bank.

  6. Once the length of the test is determined, have the state-level core team select the number of items that will be used to measure each competency. For highly ranked competencies, a larger number of items are needed; the reverse is true for lower-ranked competencies. This will produce a test blueprint that specifies the number of items for each competency that will appear on the test.

  7. Develop the item pools based on the test blueprint. This can be done directly by the vendor or collaboratively by the vendor and the faculty team, although we have found that the latter approach is preferable and saves time in editing down the road.

  8. Allow the faculty an opportunity to review the quality of the test, align each item with a competency, and recommend revisions. Remove items that the faculty do not approve.

  9. Administer a pilot to students who are similar demographically and academically to those who would normally be tested. Analyze the pilot results, remove the items that did not perform well, and then construct the test.

  10. Select the criteria for determining the interim cut scores. If the goal is to match the current placement rates, the state-level core team can work with current course enrollment in developmental education. If the goal is different, the state-level core team will need to develop alternative criteria.

  11. Select interim cut scores based on the criteria and the expected distribution of the test.

  12. Launch the placement assessment.

  13. Assess course outcomes and reevaluate the interim scores.
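Steps 4 through 6 above—ranking the competencies and translating the rankings into a test blueprint—can be sketched in code. The competency names and the linear weighting scheme below are hypothetical illustrations, not Florida's actual blueprint:

```python
# Hypothetical sketch of test-blueprint allocation (steps 4-6).
# Assumption: rank 1 of n gets weight n, rank 2 gets n-1, ..., rank n gets 1.

def build_blueprint(ranked_competencies, total_items):
    """Allocate total_items across competencies in proportion to rank weight.

    ranked_competencies: list ordered most- to least-important.
    Returns a dict mapping competency -> number of test items.
    """
    n = len(ranked_competencies)
    weights = {c: n - i for i, c in enumerate(ranked_competencies)}
    total_weight = sum(weights.values())
    # Initial proportional allocation; every competency gets at least one item.
    blueprint = {c: max(1, round(total_items * w / total_weight))
                 for c, w in weights.items()}
    # Nudge counts until they sum exactly to total_items.
    diff = total_items - sum(blueprint.values())
    i = 0
    while diff != 0:
        c = ranked_competencies[i % n]
        step = 1 if diff > 0 else -1
        if blueprint[c] + step >= 1:
            blueprint[c] += step
            diff -= step
        i += 1
    return blueprint

# Hypothetical math competencies, most important first.
competencies = ["quadratic equations", "linear equations",
                "polynomials", "rational expressions"]
print(build_blueprint(competencies, 30))
# Higher-ranked competencies receive proportionally more of the 30 items.
```

A computer-adaptive test (step 5) would relax the fixed blueprint at delivery time, but the per-competency item counts still govern how large each pool in the item bank must be.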

In the development and implementation of the P.E.R.T., there have been benefits on multiple fronts. First, it is no secret that the various educational sectors speak different languages and even occasionally distrust each other's motives. But continuous, long-term engagement and re-engagement on an issue of mutual interest and benefit—i.e., college readiness—proved to be a real-time demonstration that it is possible for professionals with different foci to cooperate for the greater good.

In terms of tangible deliverables, Florida has a new assessment; in terms of buy-in and respect, and for the sake of future projects, we now have a cadre of individuals who have worked long hours together, heartily debated academic issues, shared family and pet stories, and consumed the ubiquitous boxed lunches together. Hopefully, the good vibes and joint feelings of accomplishment will be carried back to schools, colleges, and universities throughout the state, where, with luck, they'll have a positive impact on future secondary/postsecondary relationships and projects.

As for the P.E.R.T. itself, this cooperative initiative has given us a first—an assessment based upon a broadly vetted and aligned set of college-readiness competencies that should more accurately measure essential skills for postsecondary success.

The long road to the development and release of Florida's new postsecondary placement assessment has already at times left the travelers along it exhausted, impatient, and prey to angst—and we still have miles to go. But given the choice between taking the easiest route—buying an off-the-shelf assessment—or moving forward to provide the best that we could for our students, we chose the latter route. We ask our students to give their best, so it makes sense that we do no less to support their educational success.

Postscript: The developmental-education curriculum is now available to all high schools in Florida, so students have the opportunity to tackle and ensure mastery of necessary competencies prior to graduation. And since last summer, numerous professional-development workshops pairing college and high school math and language-arts faculty have been offered throughout the state. Textbooks, syllabi, and teaching techniques are being shared in this ongoing effort to maximize opportunities for students.

We don't purport to have all of the answers; in fact, we probably have not yet identified all of the questions about how best to prepare students for education and careers after high school. One thing we do know, though, is that it takes a focus on possibilities, scores of dedicated and flexible individuals, dogged persistence, and many boxed lunches to even begin to make meaningful change.

Judith Bilsky is the chief academic and student affairs officer for the 28 colleges in the Florida College System, which serves almost 900,000 students annually. Her career spans over 30 years as a faculty member and administrator at all levels of education, from pre-K through graduate school.


