Those who pay attention to the “Nation’s Report Card” tend to take it for granted. In truth, most people heed it not at all. (I sometimes call it “the most important test you’ve never heard of.”) Because it’s a low-stakes operation that yields no data for individual students or schools and just a handful of big districts, the National Assessment of Educational Progress (NAEP) is easily ignored. And because it’s a federal program that’s been around for half a century, it’s sort of boring, even for NAEP-watchers. It seems to do the same thing over and over, and every couple of years it dutifully reports depressing results for states and nation. There may be a headline or two—more likely an article on page 7—and then it again recedes from view.
Well, brace yourself. Changes may be coming. I’d like to think they’ll result from the wise and insightful recommendations in my forthcoming book, but in fact the National Academies are taking the lead, at least temporarily, and NAEP’s own minders are nudging the future themselves.
This very day, March 24, 2022, the National Academies Press is releasing A Pragmatic Future for NAEP: Containing Costs and Updating Technologies. This eleven-part report, concluding with “A New Path for NAEP,” was commissioned by the Education Department’s Institute of Education Sciences (IES), which asked the Academies for “an expert panel to recommend innovations to improve the cost-effectiveness of NAEP while maintaining or improving its technical quality and the information it provides.” Headed by Karen J. Mitchell and containing a number of heavy hitters in the realm of testing and measurement, this eleven-member group took its charge seriously, and its twenty-one recommendations, if taken equally seriously, would result in big changes for NAEP, including considerable cost savings.
The Nation’s Report Card has grown awfully expensive. The panel pegs its total annual cost to taxpayers at $175 million, which may not sound like a lot in an era of trillion-dollar proposals (and deficits), but which works out to an estimated $438 per test-taker. NAEP, says the panel, is more expensive than PISA, far more expensive (per student) than state testing programs, and several times pricier (per test-taker) than high-stakes exams such as the SAT and GRE, though those tests are typically far more extensive.
The panel was frustrated by the difficulty of obtaining accurate data on the costs of NAEP’s many moving parts, and its first recommendation is that the two entities responsible for the Nation’s Report Card—the National Center for Education Statistics (NCES) and the National Assessment Governing Board (NAGB)—should “develop clear, consistent, and complete descriptions of current spending on the major components of NAEP,” and that these be used going forward “to inform major decisions about the program to ensure that their long-term budgetary impact is supportable.” Indeed, one panel recommendation, stemming from the seeming opacity of NAEP’s budget, is that a full-fledged audit be undertaken.
But containing costs is just the beginning. The panel would also change how achievement trends are monitored and reported; would integrate (and slightly lengthen) assessments so that a student might, for instance, take a combined test of history, civics, and geography (or reading and writing); would modernize how test items are structured and created and how tests are scored (including much heavier use of technology); would alter test framework development and test administration in major ways; and would develop a “next-generation technology platform” for the entire venture.
The report bristles with “should dos” for NCES and NAGB, and it’s unknowable whether they’ll have the appetite and horsepower to undertake all these assignments. A thoughtful article last week by NCES commissioner Peggy Carr and NAGB staff director Lesley Muldoon sketched some changes they’re already making in the assessment, and NAGB has also signaled its intent to convert NAEP to remote administration. Still, nothing now underway comes close to the overhaul framed by the Academies’ panel.
Yet wide-ranging and far-reaching as that group’s work was, it avoided some key issues facing NAEP. (See my book!) Trying to sidestep political hot potatoes, it did not, for example, address the fact that much of NAEP’s high cost is due to Congress’s post-NCLB mandate to test reading and math at the national and state levels every two years, never mind that this interval is too short to reveal major changes in achievement. Nor did it go near what I view as NAEP’s single greatest current failing, namely its lack of state-level achievement data at the end of high school. (The congressional mandate extends only to grades four and eight.) Others, including my Fordham colleague Mike Petrilli, think NAEP should commence in kindergarten.
There’s more. After reading the report, another veteran NAEP-watcher expressed disappointment “that the panelists didn’t address the organizational and process constraints on carrying out many of their recommendations. For example, how will the contracting process need to be modified if NCES is going to have more qualified providers? On the NAGB side, what should be done to rein in processes that are now so heavily laden with advisory panels, partnerships, consultations, and discussions that changing anything of substance is at minimum an eight-year task?”
The risk, as always, with reports like this is that they sit on a shelf and nobody does anything. In NAEP’s case, this risk is compounded by the complicated—often collegial but sometimes rivalrous—relationship between NCES and NAGB, as well as the fact that the NAEP budget is set far away from both, the fact that contracting is handled by a different unit of the Education Department, and the fact that Congress both micromanages NAEP and neglects it. (The main NAEP statute hasn’t been touched for decades.) The schisms and divisiveness that plague Capitol Hill have also begun to seep into NAGB itself, as was visible in last year’s fracas over the new NAEP reading framework, and may recur as the Board tackles the next science framework. Note, though, that culture wars over frameworks—and risks to the NAEP trendline—would ease if NAGB followed the panel’s recommendation to make its framework updates “smaller and more frequent.”
NAEP has become the country’s most important and respected gauge of student achievement, of changes over time in that achievement, and of gaps in that achievement. It’s an essential tool for pursuing both excellence and equity in American K–12 education. Its achievement levels are the closest the U.S. has ever come to national education standards. It could fairly be termed indispensable.
Yet it’s also costly, creaky, sluggish, and in many respects, archaic. The National Academies’ panel has gone a fair distance in pointing toward a nimbler, more efficient, and more productive assessment. I wish it had gone farther. But now the big question is how successfully and willingly NAEP’s minders will set about changing the stale bathwater without harming their cherished baby.