Bad schools rarely die. This was the conclusion of Fordham’s 2010 report Are Bad Schools Immortal?, which discovered that out of two thousand low-performing schools across ten states, only 10 percent actually closed over a five-year period. On reflection, the finding was not too surprising: Shuttering schools nearly always sets off a torrent of political backlash, as authorities in Chicago, Philadelphia, and other urban districts have learned in recent years. And the reasons are understandable: Schools are integral parts of communities. They’re built into families’ routines and expectations, and closing them inevitably causes pain, disruption, and sadness, even when it’s best for students.
However, we also recognize that closing schools is sometimes necessary. In the charter sector, in particular, closure is an essential part of the model: Schools are supposed to perform or lose their contracts. That’s the bargain. And in the district sector, experience has taught us that some schools have been so dysfunctional, for so long, that efforts to “turn them around” are virtually destined to fail.
That doesn’t mean it’s easy to put bad schools out of their misery. Part of the difficulty is political, but it’s also a genuine moral dilemma: Are we sure that kids will be better off after their schools close? What is the quality of the remaining schools in their neighborhoods? Most importantly, do students gain or lose ground academically when their schools close and they are obliged to enroll somewhere else?
We know from personal experience how important, even agonizing, these questions are. In our role as a charter school authorizer in Ohio, we have blinked on a few occasions—choosing to keep marginal schools open because we worried that the children attending them might be even worse off if they had to move elsewhere. Were we right to do so?
To date, policymakers and practitioners have had precious little research to anchor their thinking and inform their decision making. We could locate only three relevant studies, and their conclusions differed on whether closures affected students positively or negatively.
The high stakes associated with school closures, and the paucity of prior research, led us to explore this terrain ourselves. The result is Fordham’s new study School Closures and Student Achievement: An Analysis of Ohio’s Urban Districts and Charter Schools, which brings to bear fresh empirical evidence on this critical issue. As it turns out, our home state of Ohio is fertile ground. Its large urban districts, referred to as the “Big Eight,” have faced sharply declining enrollment due to both shrinking populations and an influx of charter schools. Confronting the loss of more than fifty thousand pupils in just eight years, these districts have been forced to close scores of schools.
During the same period, dozens of charter schools have also closed for a variety of reasons, including financial difficulties and academic underperformance. In fact, Ohio’s automatic closure law, which is based on academic results, required twenty-three charters to close during the period of study.
Our study examined the achievement trends of 22,722 students in grades 3–8 who attended one of the 198 urban schools in Ohio that shut their doors between 2006 and 2012. These closures disproportionately affected low-income, low-achieving, and black students. To our knowledge, this is the first study to investigate separately the academic impact of closing charter schools. The study was conducted by Dr. Stéphane Lavertu of the Ohio State University and Dr. Deven Carlson of the University of Oklahoma, who used state records to examine the impact of closure.
Their most important finding is that school closure has significant positive impacts on the achievement of displaced students. The following figure displays the cumulative learning-gain estimates of displaced students by the third year after their schools closed. Displaced students from district schools that closed in urban areas gained, on average, forty-nine extra days of learning in reading relative to the comparison group; in math, it was thirty-four days. In the charter sector, students displaced from a closed school also made substantial gains in math—forty-six additional days—but did not make statistically significant gains in reading.
Figure 1: Impact of closure on displaced students, measured as cumulative student learning gains by the third year after closure
The analysts then focused on charter and district students who landed in higher-quality schools after closure, and there they found even larger cumulative learning gains. (Quality was defined as a school’s contributions to student growth—its “value added,” in education parlance.) District students who landed in higher-quality schools gained the equivalent of sixty-nine extra days of learning in reading and sixty-three extra days of learning in math. When charter students moved to higher-quality schools, they gained an additional fifty-eight days of learning in reading and eighty-eight days of learning in math by the third year after their school closed.
We must register one caveat that tempers the positive findings on closures: When students displaced by closures enter their new schools, it is possible that they negatively impact the learning of students who had previously attended the school. Think of this as a possible “side effect” of the closure “treatment.” The study provides suggestive (but not conclusive) evidence that there might be minor side effects—the value-added scores of schools absorbing displaced students fall slightly. The net effect of closure remains an open empirical question.
These findings have two implications for policymakers. First, they should not shy away from closures as one way to improve urban education; they are a viable alternative to “turnarounds.” As Andy Smarick and others have argued, fixing a chronically low-performing school is often more wishful thinking than promising strategy. Although successful school turnarounds are not impossible, Smarick is correct when he writes, “Today’s fixation with fix-it efforts is misguided.” This study adds hard evidence that shutting down low-quality schools could better serve students’ interests than endless (and fruitless) efforts to improve them.
Second, policymakers have to grapple with the mechanism of closing schools—whether they ought to shutter schools via top-down decisions or the marketplace. Interestingly, save for Ohio’s automatic closure law that was applied to a handful of charters, state policy did not directly shutter the schools in this study. Rather, population loss and the proliferation of school choice forced districts to close unneeded schools, while most charters closed due to stagnant enrollment, financial difficulties, or a combination of both.
In other words, Ohio’s experience with urban school closures was primarily market-driven. Families voted with their feet, and weaker schools withered and eventually died. And it worked. Most students—though not all—landed in higher-quality schools and made gains after closure. Could Ohio have done even better for its students, had school authorities closed schools more aggressively and strategically? Perhaps.
Though fraught with controversy and political peril, shuttering bad schools might just be a saving grace for students who need the best education they can get.