Proficient (adj.): “Well advanced in an art, occupation, or branch of knowledge.”
Proficiency on Ohio state exams has long had little to do with being “well advanced.” In recent testing, 66 percent of Ohio students met state proficiency standards in eighth-grade reading, while just 39 percent were proficient in eighth-grade reading on the more stringent National Assessment of Educational Progress (NAEP). Other states with more rigorous standards—e.g., Colorado, Florida, and Massachusetts—report proficiency rates on state tests that are roughly in line with these national exams. In fact, Ohio’s proficiency standards are so relaxed that the state cautions that meeting this mark doesn’t indicate being on track for college and career success (reaching “accelerated”—a level above proficient—does).
Proficiency standards aren’t just a wonky topic. They have real-world consequences. When students are told they are “proficient”—though they fail to meet rigorous academic targets—they may be misled into believing that they’re on a solid pathway to college. Potential costs include the following: Misinformed students could begin coasting through their coursework when they should be pushing themselves to reach higher academic goals; they might begin planning for college admissions, only to feel regret when they can’t get in; and they might skip opportunities that could prepare them for rewarding careers that don’t require four-year degrees.
These are all risks associated with setting soft proficiency standards. But how often does this situation happen? An insightful analysis by the Ohio Department of Education (ODE), shared at last week’s State Board of Education meeting, indicates that it occurs all too often. Consider the following charts, drawn from the agency’s draft report, that depict the relationship between state end-of-course (EOC) and ACT exam scores. The black dots represent individual students’ scores on the corresponding subject-area tests. Note that these exams were taken by a large majority of students in the class of 2018, the cohort represented in the figures. The EOCs shown here are typically taken during pupils’ sophomore years and the ACT during their junior years.
Figure 1: The relationship between Ohio students’ EOC and ACT scores, class of 2018
Note: I modify the charts presented in ODE’s report to display the thresholds—the red lines—needed to achieve proficient on the EOCs and to reach remediation-free scores on the ACT (18 in English and 22 in math).
Three things jump out from these charts:
- First, we see a positive relationship between the exam scores. As indicated by the upward-sloping blue lines on both charts, students who perform well on state EOCs tend to perform well on the ACT. Though not a perfect, one-to-one correlation, the results remind us that achievement on state exams matters: scores on these tests are predictors of performance on college entrance exams.
- Second, in both subject areas, many high school students who do not reach remediation-free levels on the ACT are being deemed proficient. This situation is depicted in the bottom-right quadrant of the charts, where a heavy concentration of dots exists. While ODE’s report doesn’t provide exact numbers, we can see visually that a substantial number of students are deemed proficient on the high school EOCs—and likely satisfied with their achievement—who don’t reach ACT scores that predict college-level success.
- Third, as depicted in the upper-left quadrant, we see that some students fall short of EOC proficiency but achieve college-ready scores on the ACT. In some cases, it’s possible that disappointing state exam results were the wake-up call these students needed to achieve higher scores on the ACT. Thus, there may in fact be a benefit to this type of “misclassification,” in contrast to the substantial risks of over-identifying students as proficient. Moreover, there isn’t a negative impact on these students in terms of graduation, as Ohio recognizes remediation-free achievement on college entrance exams if students struggle on EOCs.
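The quadrant logic described above can be sketched in code. This is an illustrative sketch only: the ACT cut scores (18 in English, 22 in math) come from the report, but the EOC proficiency cutoff used here is a placeholder for illustration, not Ohio’s actual scaled-score threshold.

```python
# Classify a hypothetical (EOC, ACT) score pair into one of the four
# quadrants discussed above. ACT cut scores are from the report; the
# EOC cutoff below is a PLACEHOLDER, not Ohio's real proficiency score.

EOC_PROFICIENT = 700  # placeholder EOC proficiency cutoff (illustrative)
ACT_CUTS = {"english": 18, "math": 22}  # remediation-free ACT scores

def quadrant(eoc_score: int, act_score: int, subject: str) -> str:
    """Return which quadrant of the scatter plot a student falls in."""
    proficient = eoc_score >= EOC_PROFICIENT
    college_ready = act_score >= ACT_CUTS[subject]
    if proficient and college_ready:
        return "upper right: proficient and college-ready"
    if proficient:
        return "lower right: proficient but not college-ready"
    if college_ready:
        return "upper left: not proficient but college-ready"
    return "lower left: neither proficient nor college-ready"

# A student deemed proficient whose ACT math score falls below the cut:
print(quadrant(710, 19, "math"))
```

The lower-right case is the one the report flags as most common and most costly; the upper-left case is the more benign misclassification discussed in the third bullet.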
Ohio policymakers shouldn’t lead students into believing they are on the pathway to college success when they’re not. There are several options, some that I’ve discussed previously, that could resolve the dilemma. Policymakers could eliminate the accelerated category and make proficient the second-highest achievement level, thus aligning proficient more closely with college-ready benchmarks. Another option is to overhaul the classification system and start fresh. For instance, Ohio could adopt categories such as “approaching college-ready expectations” or “meeting college-ready expectations.”
Regardless of which approach is adopted, policymakers should make clear that these more stringent targets aren’t the high school graduation standard—that should be set somewhat lower than college-ready—and they could also stop using straight-up proficiency rates in the state’s school rating system (relying instead on other measures for accountability purposes).
Many, maybe most, Ohio high school students still aspire to attend college. They deserve the truth about whether they’re on pace to achieve their post-secondary goals. Unfortunately, when it comes to state exam results, the signals seem to be getting crossed, as too many students are being told they’re proficient—suggesting they’re “on track” for college or even “well advanced” in their studies—when they aren’t. In communicating state test results to parents and students, honesty remains the best policy.
A small portion of Ohio students likely took only the SAT exam, and a minority of students took the state Integrated Math II EOC instead of geometry.
 This would significantly reduce the number of students being told they are proficient but not college ready, but it would also deem more students as not proficient who meet college-ready ACT targets. In my view, the costs of the latter type of misclassification are lower than those associated with errors in the other direction.