Management expert Peter Drucker once defined leadership as “lifting a person's vision to higher sights.” Ohio has set its policy sights on loftier goals for all K-12 students in the form of more demanding expectations for what they should know and be able to do by the end of each grade en route to college and career readiness. That’s the plan, anyway.
These higher academic standards include the Common Core in math and English language arts along with new standards for science and social studies. (Together, these are known as Ohio’s New Learning Standards.) Aligning with these more rigorous expectations, the state has implemented new assessments designed to gauge whether students are meeting the academic milestones important to success after high school. In 2014-15, Ohio replaced its old state exams with the PARCC assessments, and in 2015-16, the state transitioned to exams developed jointly by the American Institutes for Research (AIR) and the Ohio Department of Education.
As the state marches toward higher standards and—one hopes—stronger pupil achievement and school performance, Ohioans are also seeing changes in the way the state reports student achievement and rates its approximately 600 districts and 3,500 public schools. Consider these developments:
As the standards grow more rigorous, pupil proficiency rates have declined. As recently as 2013-14, Ohio would regularly deem more than 80 percent of its students to be “proficient” in core subjects. But these statistics vastly overstated the number of pupils who were mastering math and English content and skills. For instance, the National Assessment of Educational Progress—the “nation’s report card”—indicates that just two in five Ohio students meet its stringent standards for proficiency. According to ACT, barely one in three Buckeye pupils reaches all of its college-ready benchmarks. The Ohio Department of Higher Education’s most recent statistics find that 32 percent of college-going freshmen require remediation in either math or English. But with the implementation of higher standards and new exams, the state now reports more honest proficiency statistics: in 2015-16, roughly 55 to 65 percent of students statewide met Ohio’s proficient standard, depending on the grade and subject. Although these rates still overstate the fraction of students meeting a college and career ready standard, parents and taxpayers are gaining a truer picture of how many young people meet a high achievement bar.
Higher achievement standards have also meant lower school ratings, particularly on the state’s performance index (PI). This key report card component is a measure of overall student achievement within a school and one that is closely related to proficiency rates (and, for better and worse, closely correlated with socio-economics). While lower performance index scores affect schools throughout Ohio, they create special challenges when examining the results of high-poverty urban schools. Under softer standards, a fair number of urban schools maintained a C or higher rating on this measure, but now almost all of them receive a D or F performance index rating. In 2015-16, a lamentable 94 percent of urban schools were assigned one of those low grades. (High-poverty schools also receive near-universal Ds and Fs on a couple of other proficiency-based measures.) Because PI ratings yield so little differentiation, policy makers, analysts, and the media need to use extra care lest they label virtually every urban school poor-performing. Student achievement is indeed low in high-poverty communities, and we all want to see stronger outcomes for disadvantaged children. But by concentrating on proficiency-based measures, we risk calling some schools failures when they are actually helping their students make up academic ground.
That’s where Ohio’s “value added” rating kicks in. This measure utilizes student-level data and statistical methods to capture the growth that students make (or don’t make) regardless of where they begin on the achievement spectrum. Because value added methods focus on pupil growth instead of point-in-time snapshots of proficiency, they can break the link between demographics and schools’ outcomes as measured strictly by achievement. On value added, urban schools can and do perform as well (or as poorly) as their counterparts from posh suburbs. In the present report, we show that 22 percent of Big Eight public schools earned an A or B on the state’s value added measure in 2015-16. Given the criticism of Buckeye charter schools, it is even more notable that a greater proportion of urban charters earned A or B value added ratings than did their Big Eight[1] district counterparts (29 to 19 percent). Although the evidence is based on just one year of results, one hopes that these results represent the onset of an era of higher charter performance after major reforms were enacted in 2015.
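To see why growth and proficiency can tell opposite stories about the same schools, consider a toy example. Ohio’s actual value-added model is far more statistically sophisticated than this; the schools, scores, and the 700-point proficiency cut below are invented purely for illustration, and a simple gain score stands in for the real growth calculation.

```python
# Toy illustration: proficiency snapshots vs. a simple growth measure.
# All scores and the proficiency cut are hypothetical; Ohio's real
# value-added model uses far more sophisticated statistics than raw gains.

PROFICIENT = 700  # hypothetical scale-score cut

# (prior-year score, current-year score) for each student
urban_school = [(610, 665), (630, 690), (650, 705), (600, 660)]
suburban_school = [(720, 722), (740, 741), (760, 758), (710, 715)]

def proficiency_rate(scores):
    """Share of students at or above the cut score this year."""
    return sum(cur >= PROFICIENT for _, cur in scores) / len(scores)

def mean_growth(scores):
    """Average score gain from last year to this year."""
    return sum(cur - prior for prior, cur in scores) / len(scores)

print(proficiency_rate(urban_school))     # 0.25 -- a D/F on a PI-style measure
print(proficiency_rate(suburban_school))  # 1.0  -- an A on a PI-style measure
print(mean_growth(urban_school))          # 57.5 points of average growth
print(mean_growth(suburban_school))       # 1.5 points of average growth
```

On the snapshot measure, the urban school looks like a failure and the suburban school a star; on growth, the pattern reverses. That is the distinction the value-added rating is designed to capture.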
While value added scores haven’t noticeably plummeted or inflated with the rising standards, we should point out some important developments in the measure itself. First, during Ohio’s testing transitions, the state has reported value added results based on one-year calculations rather than multi-year averages, as was done prior to 2014-15. Probably as a result, some schools’ ratings have swung significantly; for example, Dayton Public Schools received an F on value added in 2014-15 but an A in 2015-16. One year of value added results can’t perfectly capture school performance—we need to take into account a longer track record on this report card measure.
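A small sketch shows why multi-year averaging matters. The yearly “growth index” numbers below are invented for illustration only; the point is simply that rolling averages of a noisy series vary less than the single-year values, which is why one-year ratings can swing from F to A while a multi-year record is more stable.

```python
# Why multi-year averages matter: one-year value-added scores are noisy,
# so single-year ratings can swing sharply from year to year.
# The yearly scores below are hypothetical, for illustration only.
from statistics import mean, stdev

one_year = [-2.1, 1.8, -0.5, 2.4, -1.9, 1.1]  # invented yearly growth scores

# Three-year rolling averages smooth out year-to-year swings.
three_year = [mean(one_year[i:i + 3]) for i in range(len(one_year) - 2)]

print(stdev(one_year))    # spread of the single-year scores
print(stdev(three_year))  # noticeably smaller spread after averaging
```

The rolling averages always show less spread than the raw yearly values, which is the statistical case for judging schools on a multi-year track record.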
Second, Ohio’s value added system now includes high schools. Previous value added ratings were based solely on tests from grades four through eight (third grade assessments form the baseline). With the phase-out of the Ohio Graduation Tests (OGT) and the transition to high school end-of-course exams, Ohio has been able to expand value added to high schools. (The OGTs were not aligned to grade-level standards, prohibiting growth calculations; EOCs are aligned to the state’s new learning standards.) Starting in 2015-16, the state assigns value added ratings at the high school level (though it reported high school results in the year prior). In the absence of value added, analysts were limited to proficiency or graduation rates that can disadvantage high-poverty high schools. With the addition of value added, we gain a richer view of high school performance.
Shifting to higher learning standards, transitioning to new tests, and adopting more comprehensive school report cards have led to some frustration. To a certain degree, the feedback is understandable—it has been a challenging start in the long journey toward academic excellence. In the days ahead, Ohioans should absolutely continue to work together to make sure state standards and accountability policies are as rigorous, coherent, and fair as possible. At the same time, the state should ensure continuity in key policy areas so that we can gauge our progress moving forward.
At the end of the day, we should keep the big picture in mind: High standards, properly implemented, help form the foundation for greater student achievement. Several Ohio school leaders appear ready and willing to tackle these challenges. After the report card release, David Taylor, a leader at Dayton Early College Academy, told the Dayton Daily News, “We hope that people have the patience to understand that the goal posts moved…We’re asking a lot more of our kids and their families. That will require patience and a plan.” On the pages of the same newspaper, Scott Inskeep, superintendent of Kettering City Schools, said, “The AIR assessments were tough…We have to get tough, ourselves, and teach to the depth that is needed to assure student success on these tests.” Ohio has charted a more rugged course for its students and schools. If state and local leaders can maintain this course—setting sights on excellence—we should begin to see more young people fully prepared to face the challenges of tomorrow.
Download the full report here.
[1] The Big Eight cities are Akron, Canton, Cincinnati, Cleveland, Columbus, Dayton, Toledo, and Youngstown.