Robert Pondiscio, a vice president at the Core Knowledge Foundation and editor of its blog, posed an interesting question on Twitter this week:
I’ve seen bad schools with good test scores before. Any good schools with bad test scores?
It’s a timely and important question that gets to the heart of the emerging debate over whether standardized tests can fairly and accurately measure student learning, and whether accountability systems based on their results are too often mislabeling successful teachers and schools as “failures.”
Obviously, no accountability system is perfect, but we can all agree that one that gets it wrong as often as it gets it right is in need of serious reform. But is there any proof that this is happening?
Enter Kristina Rizga, a Berkeley-educated muckraking journalist who recently took the reins as the education reporter at Mother Jones after stints at Wiretap Magazine and AlterNet. In preparation for her new article, “Everything You’ve Heard About Failing Schools Is Wrong,” Rizga spent a year “embedded” in Mission High School in San Francisco. Her goal was to seek a “grassroots view of America’s latest run at school reform,” with an eye towards how we know “when schools are failing,” and whether “the close to $4.4 billion spent on testing since 2002…[is] getting results.” The culmination of her work at Mission High is the story of a “good” school that is being wrongly—potentially damagingly—labeled as “bad.”
The first-person account of her time at Mission High School is woven together with warnings from leaders like Diane Ravitch who lament that “accountability turned into a nightmare for American schools, producing graduates who were drilled regularly on the basic skills but were often ignorant about almost everything else.” Rizga also warns readers that
the push to improve scores has left behind traditional assessments that, research indicates, work better to gauge performance: classroom work and homework, teachers’ grades and quizzes, the opinions of students and parents about a school.
Of course, if Rizga’s goal is to show that a “good” school was being misjudged by a flawed accountability system, then Mission High might not have been the best example. On the one hand, Rizga is right that it did land on California’s list of “persistently failing schools” despite being, by all accounts, a safe, caring environment, full of hard-working and dedicated teachers.
On the other hand, Mission High’s recent success might better be understood as the hopeful story of how the teachers, leaders, and community have banded together to help a previously struggling school begin to chart a new course.
Indeed, in one of Rizga’s earlier articles, from December 2010, she noted that it was in 2009 when “things started to change in dramatic, visible ways. Dropout rates fell from 32 percent to 8 percent in one year. Test scores shot up. College acceptance rates grew.” In fact, in the 2009-2010 school year, Mission “made the largest gain for a San Francisco high school in the annual Academic Performance Index (API) report,” an index that measures school performance based on a combination of state standardized test scores.
Despite these promising improvements, Rizga was herself surprised by what she found at Mission:
Judging from what I’d read about ‘troubled’ schools, I’d expected noisy classrooms, hallway fights, and disgruntled staff. Instead I found a welcoming place that many students and staff called ‘family.’ After a few weeks of talking to students, I failed to find a single one who didn’t like the school, and most of the parents I met were happy too. Mission’s student and parent satisfaction surveys rank among the highest in San Francisco.
Of course, Rizga is right to shine a spotlight on the hard work educators serving in our nation’s toughest neighborhoods are doing to improve the opportunities for the students in their charge. But this hard work doesn’t mean that the idea of a “troubled” or even a “persistently low-performing” school is an illusion, as too many parents and students in many of our nation’s lowest-performing schools can attest.
What’s more, before we use stories like Mission’s to undermine an accountability system that, frankly, might have helped spur some of Mission High’s recent success, there are two things worth considering.
First, in California, the state takes into consideration achievement results from the previous three years to identify its list of “persistently failing schools.” Assuming the dramatic gains that began in 2009 continue, Mission will not languish long on that list. Perhaps it would be better if such a lag didn’t exist, but in order to ensure that student achievement gains are the result of lasting curricular, instructional, or leadership changes—or to ensure that achievement dips are due to something other than fleeting demographic, curricular, or leadership changes—states are wise to look at more than a single year of achievement data. Will Rizga rethink her conclusions if Mission High comes off the “persistently failing schools” list in 2013?
Second, despite the improvements Mission has seen, the low test scores that continue to plague the school say something significant that shouldn’t be overlooked.
In her article, Rizga focuses on the experience of one student in particular. Maria is an immigrant who came to the United States in middle school, unable to speak English, and who was all but written off by her teachers and her community. That is, until she came to Mission. At Mission High, Maria met a team of caring, engaged, and dedicated teachers who helped her with her work and opened a world of opportunity to her.
Yet, despite Maria’s hard work in the classroom, and despite the good grades she received from her Mission teachers, she still struggled mightily on the California state summative assessments. In tenth grade, despite earning As and Bs in modern world history, Maria scored at the lowest level on the state test.
Rizga hints at why she thinks Maria performed far worse on the state assessments than she did on her classwork and teacher-created assessments. She was in the classroom when Maria and her class took a practice exam in preparation for the state test. It had two parts: a multiple-choice test designed to mimic the state exam and an essay question created by her teacher. By the end of the first section:
Maria had spent too much time on the first five questions and now she had to rush. She translated another page and randomly bubbled in the rest.
When she switched to the written section of the test, her leg stopped bouncing. When the bell rang, Maria kept writing, and didn’t stop until Roth collected the pages from her.
Roth waited until the last student had left the room, and we looked over Maria’s test together. She got almost all the answers wrong on the practice multiple-choice section, the only one that would have counted for the state. On Roth’s essay question, she got an A+.
Unfortunately, Maria isn’t alone. On the end-of-year tenth grade history exam, just one in five of her fellow Latino students—19 percent—scored proficient or better. And, according to the state test results posted on Greatschools.net, the results for other subgroups—African Americans, special education students, and so on—are equally poor.
This enormous gap between student performance on classroom assessments and their results on statewide assessments is important, and it says something about the struggles that Mission High continues to face. Yes, the Mission faculty have made great strides in some critical areas—attendance, graduation rate, college acceptance, etc. But before we pronounce state tests worthless because they don’t take into consideration these other factors, it is worth exploring the expectations to which Maria and her classmates are being held in their classrooms and whether these tests are revealing real gaps in their preparation for the world that awaits them beyond their high school’s walls.
There is abundant research suggesting that student GPA and teacher feedback are often biased, largely because the expectations teachers have for students—or groups of students—differ. In one recent study, researchers found that teachers were less likely to provide critical feedback on student work if they thought the student writing the essay was Black or Latino. Indeed, GPA itself is often a misleading indicator, with some evidence that A-level work in high-poverty schools is substantively equivalent to B- or C-level work in low-poverty schools.
Of course, it’s exactly this kind of unconscious bias that contributes to the achievement gap, even in schools that might seem to be getting so much else right.
So, perhaps Rizga is right that much of what we’ve heard about failing schools is wrong. But that’s not because the idea of struggling or “failing” schools is an illusion; rather it’s because the struggles run deeper and are more vexing than they first seem.