School turnaround policy for Ohio districts, including Youngstown and Lorain, has attracted tremendous attention in recent months. While that debate remains unsettled, another more positive turnaround story continues to unfold with the state’s public charter schools.
It’s no secret that Ohio charters have had a checkered track record since the first schools opened twenty years ago. Seeking to elevate quality, Ohio legislators enacted sweeping reforms that heighten accountability and ensure more responsible practices among schools and their sponsors (a.k.a. “authorizers”). Although these reforms are still relatively fresh, a rigorous study released in February by CREDO points to improvement among brick-and-mortar charters (their analysis ended with 2016–17 data).
Results from 2018–19 state report cards offer more encouraging signs that Ohio’s charter sector is righting itself in the wake of reform. It’s true that these school-level data cannot match the depth and rigor of CREDO, but they add supporting evidence about the sector’s recovery. (And stay tuned later this fall for a more in-depth look at results in our annual report-card review.)
Ohio’s charter schools are overwhelmingly clustered in the high-poverty urban districts known as the “Big Eight.” Of the state’s charters receiving conventional A–F ratings in 2018–19, 78 percent were located in these eight districts, with the remainder mostly situated in inner-ring suburbs and a few statewide e-schools. Because of their heavy concentration in the Big Eight, we at Fordham have long focused on the performance of charters in relation to district schools located in these cities. In our view, this is a reasonably fair comparison of schools serving children from similar backgrounds and one that explores whether charters offer superior public-school options relative to a student’s main alternative.
One of Ohio’s key performance indicators is its “value-added” measure that gauges the academic progress of students over time. Because the measure examines growth, rather than overall test scores, high-performing, high-poverty schools can and do earn top marks on value added. In contrast, schools serving predominantly low-income students tend to struggle on proficiency measures due in part to well-documented achievement gaps. Sadly, that pattern continued in 2018–19, as the vast majority of Big Eight district and charter schools were assigned D’s and F’s on the state’s measure of proficiency (88 and 92 percent, respectively).
On the value-added measure, on the other hand, a higher percentage of Big Eight charter schools have earned A’s or B’s than their district counterparts. In 2018–19, a full third of charters received such ratings in comparison to 26 percent of Big Eight district schools. Conversely, a smaller fraction of charters received F’s on this measure (44 versus 58 percent). In the two years prior, Big Eight charters also outperformed their district peers by roughly similar margins.
Figure 1: Overall value-added ratings for Big Eight charter and district schools
Sources: Ohio Department of Education, Download Data (2018–19), and for prior years, my analyses of school ratings. Note: In 2018–19, the number of Big Eight district schools was 411 and the number of Big Eight charter schools was 180. Both sectors had largely similar numbers of schools in the two prior years.
The figure above, however, doesn’t fully capture the improvement among Big Eight charter schools between 2017–18 and 2018–19. In fact, we see some slippage in the percentage of A-rated charters compared to the year prior, with an uptick of B’s. How can this be evidence of improvement? The shift is actually due to a small tweak in the state’s grading rules. Under state rules, schools’ value-added ratings are “demoted” based on low subgroup value-added ratings (e.g., if a school received an F for students with disabilities). In prior years, Ohio implemented this rule by decreasing the component rating that combines the overall and subgroup results. But starting in 2018–19, the state began demoting the overall value-added rating instead, leading a few otherwise A-rated schools to receive B’s on this measure (the rule only affects those receiving a “preliminary A”).
Figure 2, therefore, shows the distribution of overall value-added ratings had the state maintained the same grading rules for 2018–19. Big Eight charter schools still outperform their district counterparts. But unlike the figure above, an improvement at the top end of the distribution is now visible. If consistent rules had been applied, 24 percent of charters would’ve received an “A” compared to 20 percent in the year prior. Also notable is the jump in the fraction of Big Eight district schools receiving A’s—19 versus 13 percent in the year prior—perhaps consistent with the upticks in performance index scores in several Big Eight districts.
Figure 2: Overall value-added ratings for Big Eight charter and district schools (modified results for 2018–19)
Note: This chart displays unofficial value-added ratings for 2018–19 had the state applied the same subgroup “demotion” rule as in prior years.
* * *
Nearly everyone agrees that improving the educational outcomes of Ohio’s neediest children should be a top priority. But how to get there remains subject to great debate. Rigorous studies from cities across the nation continue to find that quality charter schools deliver strong academic gains for less advantaged students. Ohio has not traditionally been a hotbed of high-performing charters, but things are changing for the good—and state report card results show it. Great charter organizations in Cleveland, Columbus, and Dayton continue to expand and serve more students. Thanks to a recent increase in supplemental charter funding, further growth among quality charters is on the horizon.
Yet the numbers of low ratings, even on the more poverty-neutral value-added measure, across both Big Eight district and charter schools remind us of the long road ahead. No one can yet claim that all Ohio students have access to a world-class education. The good news, however, is that progress is underway. To maintain, or even accelerate, the momentum, policymakers need to keep challenging all schools—both district and charter—to help their students reach their full potential.
 The Big Eight are Akron, Canton, Cincinnati, Cleveland, Columbus, Dayton, Toledo, and Youngstown.
 Of note, this does not refer to the more significant changes to the value-added rating system enacted via House Bill 166—the state budget bill—which passed in July 2019. Due to the October 2019 effective date of the legislation, those changes will first be seen in September 2020 with the 2019–20 report cards.
When Governor DeWine signed the state budget bill into law in mid-July, it marked the end of a long debate over Ohio’s graduation requirements. The new set of standards—based on a framework offered by Fordham and other groups—will soon go into full effect.
It’s good news for Ohio families, businesses, and taxpayers that students are working toward a diploma that means something. But that hasn’t always been the case. The classes of 2018 and 2019 graduated under a set of weakened alternatives that relied on soft, non-academic measures such as work or community service experience and regular attendance during senior year. As a result, they are Ohio’s only graduating classes in twenty-five years to earn a diploma without demonstrating a baseline level of objective academic competence on state exams.
This spring, I wrote a piece that identified the twenty districts with the highest percentage of students using the weakened alternatives to graduate. All of them reported that at least a fourth of their 2018 class did so. In my analysis, I noted that many of these districts could register higher graduation rates and higher grades on their state report card as a result. Now that 2018–19 report cards have been released, it’s possible to determine whether they did, in fact, get such a boost.
Remember, the class of 2017—like every class before it for over a decade—was required to pass all five of the Ohio Graduation Tests in order to graduate. These were state exams that measured basic competency in reading, writing, math, science, and social studies. Students in the class of 2018, on the other hand, were able to graduate without passing state tests.
As you can see, a majority of the identified districts fared better on four-year graduation rate grades in 2018 compared to the year prior. Twelve of the twenty districts received a higher letter grade. Only one district—New Boston Local—received a lower grade, and even then it only dropped from an A to a B.
Several of these districts earned significantly higher grades than in years past. Maple Heights, which had a whopping 55 percent of students use the weakened alternatives to graduate in 2018, saw its four-year grade jump from a D to a B. Youngstown, meanwhile, had 48 percent of its graduating class use the weakened alternatives. It earned a C for its four-year grade, the first time the district has earned higher than an F since A–F ratings were first assigned.
Based on the data above, lowering graduation standards has artificially boosted the grades of several districts. More students in the class of 2018 did in fact graduate in these places, but it’s likely that the increases are largely due to adults lowering the bar. It’s also important to remember that increases in graduation grades affect a district’s overall rating, which in turn impacts eligibility for the state’s scholarship program, where charter schools are permitted to open, and which districts are placed under state intervention. Considering the consequences of failing to tell students and families the truth, sudden increases in graduation grades are not great news. Fortunately, lawmakers have already put a much more rigorous set of requirements in place. Let’s hope that this time, they stick.
Gallons of ink, some on this blog, have been spilled about what Ohio should do about academically troubled school districts. A bill put forward in the Senate Education Committee would overhaul the current turnaround model by replacing academic distress commissions (ADCs)—enacted via House Bill 70 in summer 2015—with a softer, more patient form of state intervention.
ADC critics repeatedly argue that they “don’t work,” cause disruption and demoralization, and have generally made bad situations worse. Unfortunately, no evaluation of the academic impact of ADC policy exists, so these claims often go unchallenged. Yet with the release of the 2018–19 school report cards, we can begin to see high-level trends that might indicate whether the most strident criticisms are merited, or whether we actually see signs of increased student achievement.
Intervention in Youngstown predates the HB 70 revisions to ADC law, as the district first came under the thumb of the state several years earlier. Indeed, the district’s lack of progress under Ohio’s old (and meek) turnaround framework prompted legislators to adopt a more muscular approach. In 2016, Krish Mohip took the helm as the district’s first CEO, a leadership position called for in HB 70. Though likely a transition year, 2016–17 is thus a logical baseline to begin tracking performance under Youngstown’s ADC.
The table below displays data compiled over the past five years on two key measures of performance on state exams: 1) value added, a measure of student growth over time that doesn’t strongly correlate with demographics, and 2) performance index, a composite measure of student proficiency across grades and subjects.
Table 1 shows that academic performance was bleak in Youngstown in the year prior to the ADC, and progress since then has been slow. The districtwide value-added rating hasn’t budged, nor do we observe higher percentages of students attending “A” rated schools on this measure. (Kudos to the lone school that has accomplished this feat since 2014–15—four years running.) Overall, Youngstown students still struggle mightily on state exams, and too few attend the high-performing schools needed to change these trajectories. That being said, the results registered after ADC implementation are not appreciably worse than in the year immediately prior.
Table 1: Key improvement indicators for Youngstown School District
* These are unofficial ratings based on one-year value-added index scores available via the “Gap Closing” data tables. (ODE began incorporating one-year value-added scores into the Gap Closing component in FY 2018.) Ohio’s official value-added ratings, used in the District Value-Added Rating and the Schools Rated “A” rows, are based on index scores averaged over three years. While multi-year scores improve the stability of ratings from year to year, single-year scores can provide useful information in turnaround situations. The statewide performance index score was 81.6 in 2015–16 and 84.7 in 2018–19. † Ohio used different state assessments in 2014–15 than in the more recent years; the statewide average PI score that year was 84.3.
Akin to Youngstown, Lorain was also under state oversight for low academic performance prior to the 2015 reforms. After the new ADC law passed, David Hardy was hired as the district CEO in 2017. His first year, 2017–18, was likely a transition year, but we can now track the performance of Lorain for two years after ADC implementation.
As might be expected in a district under state oversight, Table 2 shows that Lorain posted poor results before ADC intervention. In the two years immediately prior, the district received an “F” value-added rating and no students attended an “A” rated school according to this measure. Lorain’s overall value-added rating has remained depressed in the two most recent years, though improvements are visible when focusing on the growth data from only 2018–19. In both core subject areas, the district’s single-year value-added index scores were equivalent to “C” ratings. Moreover, one Lorain school, Steven Dohanos Elementary School, posted an “A” value-added rating last year, the only school to accomplish that feat during this period. The district’s performance index scores have also risen over the past two years, yet another promising sign of improvement. Last, though not displayed in Table 2, the district’s overall rating improved from an “F” to a “D” in 2018–19.
Table 2: Key improvement indicators for Lorain School District
* For more on these data points, see the notes under Table 1.
East Cleveland is the third district currently overseen by an ADC. Its commission was established halfway through the 2018–19 school year, with CEO Henry Pettiegrew hired in February 2019. Thus, we cannot yet track performance under an ADC model.
Districts on “ADC watch”
ADCs not only seek to turn around chronically low-performing districts, but they may also spur improvements among other struggling districts through the prospect of intervention. Under the HB 70 framework, districts earning three consecutive overall “F” ratings fall into state receivership via ADC. Last year, several districts were on “ADC watch” by virtue of a failing rating. Of these districts, just one, Dayton, had received two such ratings.
Anecdotal evidence indicates that at-risk districts took measures to avoid intervention by working to improve their ratings. The results shown in Table 3 suggest that the threat of intervention may have led to higher student achievement. In all but one of these districts, performance index gains exceeded the statewide average, which includes the vast majority of districts not in jeopardy of intervention. Ashtabula and Mansfield, for instance, registered strong improvements—+4.3 and +2.4 points respectively—as did big-city districts such as Columbus (+1.7) and Dayton (+1.6). While these results aren’t conclusive, they at least hint at possible positive effects of accountability pressures.
Table 3: Districts under “ADC watch” in 2017–18
Because they call for bold changes in district governance and leadership, ADCs have left many unhappy. Appeasing dissatisfied interest groups might be politically expedient, but legislators should also consider whether it’s fair to students to pull the rug out from beneath them before promising reforms can take hold. The early results from Youngstown and Lorain show that ADCs aren’t the academic catastrophe that the harshest critics claim. Nor are they an overnight miracle cure. In the end, we must remember that improving performance in long-dysfunctional districts takes patience and resolve, qualities that seem to be lacking in the debate around ADCs.
Update (10/9/19): The unofficial value-added ratings based on one-year math and ELA data from the Gap Closing component and reported in the tables above reflect just grades 4-8. Since the publication of this post, ODE has released one-year scores based on all grades-subjects with value-added data (including also high-school EOCs and science). Had data from all state exams been used to calculate one-year value-added scores, both Lorain and Youngstown would have received an equivalent of an "F."
- Aaron Churchill
 Cleveland school district also received an overall “F” rating in 2017–18 but was not under threat of an ADC, as it already has an alternative governance structure and is undertaking an improvement plan.
NOTE: The Thomas B. Fordham Institute occasionally publishes guest commentaries on its blogs. The views expressed by guest authors do not necessarily reflect those of Fordham.
This may sound counter-intuitive, but here it is: Technically speaking, Ohio school districts do not contribute toward Ohio teacher pension benefits.
"How is this possible?" you might ask. After all, Ohio school districts are contributing 14 percent of each teacher's salary into the pension fund.
But wait, where is that contribution going? If you pull up the latest actuarial valuation report from the State Teachers Retirement System of Ohio, you can find out. Table I-1 shows that the plan estimates the "normal cost" of the benefits are worth 10.91 percent of salary. That is, across all individuals who enter the plan, after accounting for their age or how long they might stay, the plan thinks the promised pension benefits are worth an average of 10.91 percent of each teacher's salary.
You might screw up your face at this point, especially if you know that every teacher is currently contributing 14 percent of their salary into the plan. Fourteen percent is more than 10.91 percent, so how can that be?
That's because Ohio STRS has accumulated unfunded liabilities of $24.8 billion. Every single STRS member is contributing 3.09 percent of their salary (14 percent minus 10.91 percent) to pay off that debt.
That's not all. In addition to the employee contributions, school districts are also paying 14 percent of each teacher's salary into STRS. That money is going into the plan, but none of it is going toward benefits. All of it is being used to pay down the unfunded liabilities.
In essence, Ohio has created a system where teachers, on average, are getting less out of their pension plan than they themselves put in. To be honest, it's hard to even call this a "retirement" system at all. The system is functioning like a debt accumulation tool and a tax on teachers, with retirement benefits on the side.
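As a rough illustration, the arithmetic above can be sketched in a few lines. The rates are the actuarial figures cited earlier; the variable names and the script itself are mine, a back-of-the-envelope sketch rather than anything from STRS:

```python
# Back-of-the-envelope split of Ohio STRS contributions,
# using the actuarial figures cited above (all in % of salary).

NORMAL_COST = 10.91    # what the plan says the benefits are worth
EMPLOYEE_RATE = 14.0   # each teacher's contribution
EMPLOYER_RATE = 14.0   # each district's contribution

# Portion of the teacher's own contribution that exceeds the value
# of the benefits, and thus goes to paying down unfunded liabilities.
employee_to_debt = EMPLOYEE_RATE - NORMAL_COST       # 3.09
employer_to_debt = EMPLOYER_RATE                     # all of it

total_paid_in = EMPLOYEE_RATE + EMPLOYER_RATE        # 28.0
total_to_debt = employee_to_debt + employer_to_debt  # 17.09

print(f"Benefits earned:        {NORMAL_COST:.2f}% of salary")
print(f"Paid in (both parties): {total_paid_in:.2f}% of salary")
print(f"Going to past debt:     {total_to_debt:.2f}% of salary")
```

In other words, of the 28 percent of salary flowing into the system, well over half services old debt rather than funding anyone's retirement.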
Again, this may be sort of hard to wrap your head around, but it's true. The figures above are all based on what the state's actuaries think the Ohio STRS plan will cost over time. Ohio is the only state in such a bad situation overall, but Illinois teachers hired as of 2011 are also paying more into the system, on average, than the state's pension plan thinks their benefits are worth. Other states may be in similar territory for new, less-generous benefit tiers, but they rarely report those data separately.
In contrast, Ohio also offers new teachers the option to join a defined contribution plan with a 9.53 percent employer match. For the vast majority of teachers, that's likely to be the better option.
This piece was originally published in a slightly different form on the Teacher Pensions Blog.