Recently my daughter asked me to describe my job. I had to think for a minute. Statistical interpreter? Results translator? None worked. I needed an elevator pitch, one my quasi-curious teen would understand easily, without too much side eye.
“Every two years, I tell people about how the nation’s school systems are educating America’s students,” I told her. She looked almost impressed, and I congratulated myself for omitting any mention of statistical significance or achievement levels.
Even better, I was able to say that another one of those every-two-years releases is right around the corner.
We’re preparing for the next release of results right now. Mark your calendars for early 2025. In the meantime, I’ll share what I’ll look for in the results and what I’ll avoid when I talk about them. Namely, misNAEPery: a clever term for how the results can be misused and misinterpreted. Kudos to Morgan Polikoff for explaining generic variants of misNAEPery. My purpose here is to share specific pitfalls to avoid with this upcoming release.
As I see it, NAEP is not a Magic 8-Ball for policy support or failure. However the NAEP Reading results land, they will not serve as a referendum on whether, for example, the science of reading is real or on whether it works. States’ scores in 2024 will invariably differ from 2022 and from 2019, maybe in small ways, maybe in big ways. Regardless, that cannot be taken as proof or disproof of the science of reading.
BPE (Before Pandemic Era). It’s easy to attribute any score decline since 2019 to Covid. Scores fall; blame it on the pandemic. But we saw score declines on NAEP prior to pandemic-related school disruptions. Grade 8 NAEP Reading scores began trending downward in 2015. Think beyond the pandemic.
Stop this ride, please. NAEP assessments from 2024 should not dictate or derail any policy implementation in 2025. Trends require more than one or two data points, and NAEP can’t be used to evaluate any given policy. Policy implementation takes time, persistence, consistent investment, and patience. A data point from 2024 should not signal a need to stop, drop, and run, nor a need to go all in on one specific intervention. See the science of reading issue above.
No to the nyah nyah. If one state’s average scores go up and another state’s go down, this does not justify “I told you so!” claims. A state’s or district’s average is shaped by myriad factors, and what rises now may fall in the future. Look to the scores to see who’s doing well. (Massachusetts inspired changes in education in the early 2000s; Mississippi most recently.) For sure, this is a smart way to identify and share promising practices. No gloating, just learning, and knowing that your experience may vary.
Proficient but not proficient. Remember: NAEP achievement levels indicate what percentage of students show “competency in challenging subject matter.” We call that NAEP Proficient (note the italics!) to distinguish it from what states call proficient on their state assessments. They’re not the same thing, despite using the same word. Our definition may seem vague, but we try to clarify that here. Different assessments, different purposes. Comparing proficiency cut-scores across states can be done, but only through the NAEP state mapping study.
Be real. NAEP scores will differ from previous years’ scores, or maybe they won’t. That’s the nature of assessment. But will we suddenly see that 50 percent of students are NAEP Proficient in math? Probably not. That’s not a spoiler; that’s just realistic. And if results show less than 50 percent at NAEP Proficient, does that mean the education system should be nixed? No. The results just outline the scope of the challenge and underscore the P in NAEP: not performance, but progress.
So far, I’ve focused on what not to do, which is a bummer. Let me pivot to the positive.
No shortcuts. Progress comes from hard work, not miracles, silver bullets, or panaceas. Scores show us broad patterns; we began thinking about issues with reading instruction because of that initial downturn in the average scores nearly ten years ago. NAEP doesn’t point to any easy solution. It tells the story.
Doing their part. It bears repeating: Progress isn’t easy. School administrators, staff, teachers, and students are working hard every day. They know that, and we appreciate that. Nothing in any assessment score refutes that. Period.
A little competition. I warned about not gloating if your state or district appears great on the next report card. But I didn’t say not to engage in some healthy competition. Maryland State Superintendent Carey Wright, formerly state chief in Mississippi, galvanized efforts to improve education in the Magnolia State by refusing to allow the “at least we’re not Mississippi” adage to stand anymore when NAEP scores came out. She appears determined to make progress from her new perch, too, and will no doubt be eyeing how Maryland’s neighbor states are faring.
NAEP as backbone. Very savvy researchers use NAEP data as the backbone for pragmatic tools to understand how states and districts are helping their students progress academically. See, for example, the Education Recovery Scorecard and the New York Times’s school district tool.
Do your part. Prioritize school attendance. Read to your kids at home, even if they’re tweens and teens. Who doesn’t love a good story read to them? Check your audiobook history for proof of that. Encourage your kids to read books at home, or wherever, whenever.
One more don’t: Don’t forget. NAEP is due out in early 2025. Stay tuned.