When I get a call from a reporter on a Friday, it typically means that a government agency is trying to dump bad news. When I get a call from a reporter on the Friday before Thanksgiving week, I know that a government agency is trying to dump really bad news.
And so it is with the U.S. Department of Education’s quiet release of results from the first year of the massive School Improvement Grant (SIG) program. (See Alyson Klein’s Ed Week coverage.)
The headline is simple: The feds spent several BILLION dollars and got terribly disappointing results—but, tragically, the results are predictable to anyone familiar with the history of “turnarounds.”
Almost three years ago, in an article for Education Next called "The Turnaround Fallacy," I detailed how and why previous turnaround efforts failed so consistently and predicted that future efforts would turn out the same way. Chapter 4 of my new book, The Urban School System of the Future, extends that argument with even more evidence.
It’s not just me. Tom Loveless’s 2009 Brown Center Report showed the dramatic failure of turnaround efforts over 20 years, and David Stuit’s remarkable and devastating 2010 study powerfully reinforced these findings.
Now the Department, doing its job, is trying to paint the new data as a good-news story. But the data clearly belie that story. No amount of lipstick can change the facts.
The first thing to notice is how the results are explained. We are NOT told which, if any, results are statistically significant. We are NOT told what the average pre-intervention score was or what the average post-intervention score is.
Instead, the Department reports that a small fraction of schools made “double-digit” gains. This is the best framing the Department can come up with after weeks, and probably months, of data massaging.
The category of greatest success that the administration can cobble together is 25 percent of schools making "double-digit" gains in math and 15 percent making "double-digit" gains in reading.
On its face, this is discouraging given the BILLIONS that have been spent on this program over the last four years. But consider: a "double-digit" gain would include a school that progressed from 10 percent proficiency to 20 percent proficiency.
A turnaround that ain’t.
So unless there’s data showing otherwise, we have to believe that some number of schools in the already-small class of “greatest success” aren’t really successes at all. They are schools that went from really, really, really low-performing to really, really low-performing.
Next, the largest share of schools (40 percent in math and 49 percent in reading) saw "single-digit" gains. Again, this could include schools that went from 10 percent proficiency to 11 percent proficiency—yes, a single-digit gain, but light years from a turnaround.
So when we’re told “two-thirds” of participating schools “made gains,” be aware we’re talking about schools like these. Are they worth BILLIONS?
And that, of course, leaves the third of schools that actually went backward—schools that got millions of taxpayer dollars each and not only failed to dramatically improve but actually regressed.
Having studied decades’ worth of “turnaround” attempts, which—when considered in combination—can be simply summarized as “consistent failures,” I can say that this episode is eerily familiar…but even more discouraging.
As always, there were big promises on the front end. As always, the results fell far short of hopes and expectations. As always, those responsible are trying to spin the story to make the effort appear as positive as possible. And as always, there will be a solemn plea for more time and money to allow this effort to succeed.
This all follows a decades-old script.
But there are two subplots in this chapter of the long-running tragedy that are especially frustrating. The first relates to the surprisingly meager positive results.
Historically, schools subject to “turnaround” attempts are so low-performing that improvement efforts often see early gains. These schools are in such dire straits that initial quick-win efforts like instituting a school-wide curriculum or bringing a modicum of order to classrooms will bring about a bump in performance. The problem in the past has been sustaining and building on the gains made in year one.
I can’t recall a study of previous turnarounds that showed so many schools falling farther behind after interventions. This is unprecedented—in a bad way.
With SIG, given all of the money and attention provided to participating schools, I expected virtually all of them to make nontrivial gains early. I was waiting to see what happened in years two, three, and four.
But, according to the data provided, we're seeing minuscule early gains. If history is a guide, the next few years will be a struggle just to hold on to even this minor improvement.
The second concern is, of course, those schools that regressed. I can’t recall a study of previous turnarounds that showed so many schools falling farther behind after interventions. To my recollection, this is unprecedented—in a bad way.
So how in the world did this happen?
Without seeing more data, this is the best top-line explanation I can come up with: Unlike previous turnaround efforts, which generally engaged a wider array of struggling schools, SIG was targeted at the very lowest-performing schools. These are the nation's most dysfunctional schools, embedded in the nation's most dysfunctional districts.
So we sent BILLIONS of dollars into deeply troubled districts, which then funneled the money into deeply troubled schools. And according to this eye-opening CRPE study, the interventions were often of suspect seriousness and vigor. No wonder the SIG results are even more disappointing than those generated by decades of previous turnaround attempts.
Now we face a fork in the road.
We can do what we’ve done for decades. That would mean allowing this story to get buried or, despite the evidence, hoping that SIG results will improve if we only give the program more money and time. Then, in a decade or so, some other contrarian blogger can add SIG to the long list of failed turnaround efforts.
Or we can finally recognize that we’re dealing with a much bigger problem. We can accept that “turnaround” efforts are not a path to ensuring low-income urban kids get a great education; that dysfunctional schools are a function of dysfunctional districts; that we need to close these schools, open new schools, and allow great schools to replicate and expand.
In other words, we need a new approach to the ongoing failure of our city school systems—one that stops behaving as though the broken schools of yesterday need to be the schools of tomorrow, one that stops jamming scarce resources into dysfunctional systems that remain impervious to reform and improvement.
Said another way: The traditional urban school system is broken. It cannot be fixed. It must be replaced.