A dozen long years ago, when people were just beginning to take serious stock of what good and not-so-good was emerging from 2002’s enactment of No Child Left Behind (NCLB), we at Fordham, in league with the Northwest Evaluation Association (NWEA), issued a 200-plus-page analysis of the “proficiency” standards that states had by then been required to set and test for. Titled The Proficiency Illusion, it reached a series of “sobering, indeed alarming” conclusions about where states were setting their proficiency bars in reading and math for purposes of “passing” their state assessments in the mid-2000s. As Mike Petrilli and I wrote in the foreword:
We see…that “proficiency” varies wildly from state to state, with “passing scores” ranging from the 6th percentile to the 77th. We show that, over the past few years, twice as many states have seen their tests become easier in at least two grades as have seen their tests become more difficult….And we learn that only a handful of states peg proficiency expectations consistently across the grades, with the vast majority setting thousands of little Susies up to fail by middle school by aiming precipitously low in elementary school.
Others undertook kindred studies around the same time and reached similar conclusions. Writing in Education Next, also in 2007, this time using National Assessment (rather than NWEA) data to benchmark and compare state standards, Paul Peterson and Frederick Hess found wide disparities in state proficiency expectations. They found three jurisdictions with “world class” cut scores, but went on to report that:
The remaining forty-seven states…had distinctly lower standards. Three states—Georgia, Oklahoma, and Tennessee—expected so little of students that they received the grade of F. The state of Georgia, for instance, declared 88 percent of 8th graders proficient in reading, even though just 26 percent scored at or above the proficiency level on the NAEP. According to our calculations, Georgia eighth-grade reading standards are 4.0 standard deviations below those in South Carolina, an extraordinarily large difference. Thus, while students in Georgia and South Carolina perform at similar levels on the NAEP, the casual observer would be misled by Georgia’s reporting that its students achieve proficiency at three times the rate that South Carolina’s students do.
Twelve states—Alabama, Alaska, Idaho, Illinois, Michigan, Mississippi, Nebraska, North Carolina, Texas, Utah, Virginia, and West Virginia—received Ds because they had pitched their expectations far below other states. Illinois set its proficiency bar for eighth-grade reading at a level that is 1.01 standard deviations below the national average. If you believe those who set the Illinois standards, 82 percent of its eighth graders are proficient in reading, even though the NAEP says only 30 percent are.
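The arithmetic behind figures like Illinois’s is worth making concrete. Here is a back-of-the-envelope sketch of my own (not Peterson and Hess’s actual calculation), assuming student scores are roughly normally distributed: a proficiency bar set z standard deviations below the mean is cleared by Φ(z) of students, which is how a very low cut score mechanically produces a very high “percent proficient.”

```python
from statistics import NormalDist

def share_above_cut(cut_sd_below_mean: float) -> float:
    """Fraction of a normal score distribution that falls above a cut
    score placed `cut_sd_below_mean` standard deviations below the mean."""
    return NormalDist().cdf(cut_sd_below_mean)

# A bar 1.01 standard deviations below the national average (the Illinois
# figure quoted above) is cleared by roughly 84 percent of a normally
# distributed cohort -- close to the 82 percent the state reported.
print(round(share_above_cut(1.01), 2))  # ~0.84
```

The point of the sketch is that the reported pass rate tells you almost nothing about achievement and almost everything about where the bar was placed.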
Also in 2007, the federal government’s National Center for Education Statistics (NCES), which is responsible for NAEP, came out with its own analysis of how state proficiency expectations compared with “proficiency” as defined by the National Assessment Governing Board. (This was based on state norms as of 2005.) Here, once again, we learned both of huge discrepancies from state to state and of a situation wherein the vast majority of states expected far less of their students by way of skills and knowledge in math and ELA than was deemed proficient on the National Assessment. This report’s prose was less colorful than the think tankers’, but it said essentially the same thing, with an important additional wrinkle that you will find in the last eleven words of this quote:
There is a strong negative correlation between the proportions of students meeting the states’ proficiency standards and the NAEP score equivalents to those standards, suggesting that the observed heterogeneity in states’ reported percents proficient can be largely attributed to differences in the stringency of their standards. There is, at best, a weak relationship between the NAEP score equivalents for the state proficiency standard and the states’ average scores on NAEP. Finally, most of the NAEP score equivalents fall below the cut-point corresponding to the NAEP Proficient standard, and many fall below the cut-point corresponding to the NAEP Basic standard.
All that is by way of context for the new report from NCES, which arrives twelve years later and again maps state proficiency standards onto the NAEP scales, this time using states’ assessment results—and NAEP results—from 2017. Better still, it also looks back at previous such mapping exercises to see what’s changed.
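The core of such a mapping exercise can be sketched simply. What follows is my own simplified illustration of the general equipercentile idea, not NCES’s exact procedure: if a state reports that p percent of its students are proficient, the NAEP equivalent of its cut score is the score that the top p percent of that state’s NAEP takers reach or exceed.

```python
def naep_equivalent(state_pct_proficient: float,
                    state_naep_scores: list[float]) -> float:
    """NAEP score equivalent of a state's proficiency cut: the score
    reached or exceeded by the state's reported percent-proficient
    share of its NAEP takers (a simplified equipercentile mapping)."""
    ranked = sorted(state_naep_scores, reverse=True)
    # Number of students in the top p percent of the NAEP distribution.
    k = round(len(ranked) * state_pct_proficient / 100)
    return ranked[max(k - 1, 0)]

# Toy example: 100 hypothetical NAEP scores from 200 to 299. A state
# claiming 30 percent proficient has a cut equivalent to the score its
# 30th-best taker earned.
scores = list(range(200, 300))
print(naep_equivalent(30, scores))  # 270
```

Once every state’s cut is expressed on the common NAEP scale this way, the cuts can be compared directly with one another and with the NAEP Basic and Proficient thresholds, which is exactly the comparison the report draws.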
The good news, as stated by veteran NCES associate commissioner Peggy Carr during a press briefing, is that “States that were identified as having lower standards increased their expectations for students over the previous decade.” She also noted, I think rightly, that the sunlight cast upon past state proficiency norms was causing weak performers to “second-guess” themselves. “Most of what we are seeing is the states at the bottom of our distribution of standards are saying, ‘Well, they should be a little more rigorous.’ That’s a function of seeing themselves in the context of other states.”
Success has many parents, and we at Fordham include ourselves among those who will take—and deserve—some credit for nudging (and perhaps shaming) a lot of states into expecting more of their students. Perhaps they’ve also been encouraged by ESSA devolving more responsibility upon them: it’s no longer a contest to prove to Uncle Sam that all one’s students are headed to proficiency by an arbitrary date or else the roof will fall, as was true under NCLB, but more like “You still need to report how your students are doing, but now it’s your problem to own and solve.”
But before we strain our shoulders patting ourselves—or the states or NCES or anybody else—on the back, let us recognize how limited is the actual “success” reported in this analysis.
Start by observing that Ms. Carr focused much of her commentary on how many states no longer had proficiency norms set at (or even below) what NAEP defines as basic rather than proficient. Bear in mind that the three NAEP achievement levels are “basic,” “proficient,” and “advanced,” with proficient defined by the Governing Board as:
Solid academic performance for each grade assessed. Students reaching this level have demonstrated competency over challenging subject matter, including subject-matter knowledge, application of such knowledge to real-world situations, and analytical skills appropriate to the subject matter. Thus, NAEP Proficient represents the goal for what all students should know.
Yet when we scrutinize the new NCES report we find, for example, that “In grade four reading, forty-seven of the fifty states included in the study had standards at or above the NAEP Basic level. Two states—Utah and Massachusetts—had standards at the NAEP Proficient level, while three states—Texas, Iowa, and Virginia—had standards below the NAEP Basic level.”
The picture in math is brighter. In eighth grade, for example, “all of the thirty-two states included in the study had standards at or above the NAEP Basic level. Seven states…had standards at the NAEP Proficient level.”
It’s worth noting that loftier state expectations in math might have something to do with the fact that American youngsters over the past couple of decades have made greater gains in math than in reading in the early grades. (Don’t even get me started on how little is known by anyone about high school expectations and how those compare from state to state.) Still, let’s keep the findings of this new report in perspective. Yes, it’s a fact that many states expect more of their students today than they did a dozen years earlier. That is indeed progress—and a fine thing, as far as it goes. Yet few states have matched (much less surpassed) NAEP’s proficient level in either math or reading at either grade four or eight. Those that have done so definitely deserve plaudits. But most states are being compared here with NAEP Basic—and basic just isn’t good enough!
One more thing. Setting the bar higher doesn’t mean that more kids are clearing it—or that the students in your state are actually learning more. It simply means you’re expecting more. Kansas, for example, now has the highest cut score in the land for eighth grade reading, when compared with NAEP achievement levels. But its eighth graders were reading no better in 2017 than in 2007—or 1998. This means that Kansas is at least being honest about how its students are doing—but how they’re doing (37 percent proficient or above in eighth grade reading in 2017) shouldn’t satisfy the parents and other taxpayers of the Sunflower State, especially since it seems not to be improving.
Progress on standards is a fine thing. But today they’re still low in the great majority of states—and so is student achievement. We’re now thirty-six years from 1983, but the nation is still at risk.