The dire findings on the performance of Ohio’s charter schools published by Stanford University’s Center for Research on Education Outcomes (CREDO) have provided the badly needed political impetus to reform the state’s charter school laws. Now, however, it appears that not only are these reforms at risk, but lawmakers are actually considering steps to weaken one of the few aspects of the existing accountability system that works.
With existing measures showing that charter schools are underperforming, some charter operators have apparently decided that it would be easier to change the yardstick used to assess them than to improve student achievement.
As the Columbus Dispatch reported recently, at least one charter school operator is pushing Ohio lawmakers to replace the state’s current “value-added” accountability framework with a “Similar Students Measure” (SSM), similar to metrics used in California. Doing so would be a gigantic step back in accountability and would make charter school student achievement look better than it really is.
Here is some background: The state of California developed such a measure back in the 1990s, before the statistical models used to estimate value added were widely available. The state and various other groups have continued to use this method because, unlike Ohio, California still has no way to link test scores to students over multiple years, which is necessary to estimate value-added models.
Think of SSM as a poor man’s value added (indeed, the California Charter Schools Association calls it a “proxy value-add”), but one with a huge drawback. SSM matches schools to each other based on observable student characteristics, such as the percentage of students receiving free or reduced-price lunch and the schools’ racial composition. However, it does not account for unobservable or hard-to-measure factors like parental support, student motivation, or natural scholastic ability.
This is a huge problem, as there is strong evidence that the kinds of students who choose to leave public schools for charters are systematically different from those who do not. Many of these differences are invisible even in student-level data, and some of them mark charter school students as higher-achieving to begin with, even before they leave their public schools. Fortunately, the value-added methodology accounts for these differences; the SSM does not.
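To see why this matters, consider a toy simulation (all the numbers here are made up for illustration; this is not the actual SSM formula or Ohio’s value-added model). Suppose an unobserved trait like motivation both nudges students toward charters and raises their test scores, while the charter school itself adds nothing. A comparison of score levels, which is roughly what a match on demographics produces when the unobservables differ, shows a large charter advantage; a comparison of year-over-year growth, the core idea behind value added, shows essentially none.

```python
import random

random.seed(0)

# Toy model: "motivation" is real but invisible to the analyst.
# It pushes students toward the charter AND raises their scores.
# The charter's true effect on learning is set to zero.
N = 10_000
students = []
for _ in range(N):
    motivation = random.gauss(0, 1)                      # unobservable
    charter = motivation + random.gauss(0, 1) > 0.5      # selection on motivation
    prior = 50 + 5 * motivation + random.gauss(0, 3)     # last year's score
    current = prior + 2 + random.gauss(0, 3)             # everyone grows by ~2
    students.append((charter, prior, current))

# SSM-style comparison: average current scores by sector, with no way
# to condition on the unobserved motivation difference.
ch = [c for sel, p, c in students if sel]
tr = [c for sel, p, c in students if not sel]
ssm_gap = sum(ch) / len(ch) - sum(tr) / len(tr)

# Value-added-style comparison: average GROWTH (current minus prior),
# which differences out the stable motivation advantage.
ch_g = [c - p for sel, p, c in students if sel]
tr_g = [c - p for sel, p, c in students if not sel]
va_gap = sum(ch_g) / len(ch_g) - sum(tr_g) / len(tr_g)

print(f"Level (SSM-style) charter gap:  {ssm_gap:+.2f}")  # large, spurious
print(f"Growth (value-added-style) gap: {va_gap:+.2f}")   # near zero
```

The intuition is that each student’s prior score already reflects their motivation, parental support, and ability, so comparing growth rather than levels strips those factors out. A demographic match cannot do this, because two schools with identical poverty and racial profiles can still enroll very different students on these unmeasured dimensions.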
To see the limits of SSM and the advantages of value-added, consider one example: UC San Diego’s Preuss School. The charter school specifically targets low-income minority students. Indeed, all of its pupils would be the first in their families to go to college. When compared to other schools serving similar student populations, Preuss would seem to be a huge over-performer. Nearly all of its graduates continue on to higher education, and Newsweek has named it the top transformative school in the country for three years in a row.
As researchers showed a few years ago, however, the reality seems to be far more nuanced. Because of the school’s popularity, it regularly attracts more applicants than it has space to admit, and must use a random lottery to decide which students get accepted. Thanks to this lottery, we have a natural “control group”—children who applied to attend Preuss but lost the lottery—against which to compare its achievement. And when researchers did this, they found few significant differences in achievement between those who attend the school and those who apply but are turned away.
How could this be? The answer seems to lie in the school’s application process: students must complete essays and solicit letters of recommendation, and their parents must agree to at least fifteen hours of volunteering at the school. These requirements weed out all but the most motivated students with plenty of support at home, precisely the kinds of students who are likely to do well academically in almost any school setting.
To be sure, the lottery research also finds real advantages to attending the Preuss School. Its students are more likely to complete the classes necessary for college admission and to enroll in higher education. But few of these benefits seem to be driven by improvements in actual student learning or classroom achievement. This important reality would be obscured by the SSM metric but would show up clearly with value added.
Adding the SSM metric to Ohio’s state report cards would represent an improvement for grades and subjects where value-added measures don’t exist, or for other outcome measures, such as graduation and attendance rates. But it would be wrong to use SSM to simply replace value added for accountability purposes.
Of course, it should be obvious why some charter operators would prefer that the SSM measure be used instead. But it is disappointing that policymakers are considering going along with this. It would be like abandoning high-speed, fiber-optic Internet to go back to the days of dial-up. Let’s not go backward.
Vladimir Kogan is an assistant professor of political science at the Ohio State University. He received his Ph.D. from the University of California, San Diego.