The Education Gadfly Weekly: The end of MCAS is the end of an era. Now let’s figure out what comes next.
With the number of states requiring students to pass exams in order to earn a diploma now down to the single digits, this feels like the end of an era. What should we do now? Let’s start by getting the gang back together—a bipartisan group of governors and state education chiefs—to work on a rational set of high school graduation requirements reflecting the multiple pathways to upward mobility and post-secondary success.
How to interpret—and not misinterpret—forthcoming NAEP results
Knowledge-rich curriculum and direct instruction depend upon each other
PISA is wrong about China
In which states do students spend the most and least time in school?
#948: School choice setbacks: Interpreting the referenda losses with Colleen Hroncich
Cheers and Jeers: December 5, 2024
What we're reading this week: December 5, 2024
The end of MCAS is the end of an era. Now let’s figure out what comes next.
As expected, in last month’s election, voters in Massachusetts supported a union-backed ballot initiative to kill off the Bay State’s longstanding graduation exam. At around the same time, New York state officials released a timeline for eliminating the even-longer-standing Regents exams as a requirement for earning a high school diploma. With the number of states requiring students to pass exams in order to graduate now down to the single digits, this feels like the end of an era.
In that spirit, I recently dusted off a (digital) copy of Ready or Not: Creating a High School Diploma That Counts, published by Achieve, The Education Trust, and the Thomas B. Fordham Institute exactly twenty years ago. How far we’ve come—or fallen—since then! At the time, standards-based reformers worried about the eroding value of the high school diploma and wanted to make sure that all students graduated “college and career ready.” (This was meant to supplement the No Child Left Behind Act, which brought much-needed accountability to America’s schools but left student accountability on the cutting room floor.)
At the heart of the “American Diploma Project” (ADP) strategy was a set of standards in English language arts and math that strongly influenced the Common Core, which came in 2010, along with the expectation that states would ensure that students met these standards by requiring them to pass exams to prove their mettle.
That was not to be. Soon after the release of the ADP recommendations, Bush Administration officials issued regulations requiring states to incorporate graduation rates, measured in a common way, into their No Child Left Behind accountability systems. The law of unintended consequences kicked in, with the focus shifting from “beefing up the value of the diploma” to “getting everyone across the finish line.” Our national graduation rate went from 79 percent in 2010–11 to 87 percent in 2021–22.
Sometimes that stemmed from positive practices, such as the adoption of early detection systems for students who were off-track. But just as often it resulted from school districts embracing dubious credit recovery programs and other schemes to inflate their graduation rates.
And speaking of inflation, grade inflation in high schools only grew worse during this period, too, before going bananas during the Covid era.
It’s probably never been easier to graduate from high school in America than it is today—the opposite of what the ADP folks imagined.
Any semblance of standards has collapsed at the higher education level, too. The ADP report worried greatly about high school graduates ending up in remedial education once they got to college. The solution that higher ed came up with: Just get rid of the “remedial” courses and place students in credit-bearing, so-called “corequisite” courses instead. That worked for some of the higher performing of the formerly remedial students but was a disaster for the lowest performing ones, who should never have been admitted to college in the first place.
Desperate for students, colleges and universities more recently started embracing “direct admissions” policies whereby students are welcomed to campus without even filling out traditional applications. The impulse makes sense—let’s remove some of the red tape that keeps qualified young people, especially first-generation college students, from applying to and matriculating to college. But alarmingly, most of the time students are admitted on the basis of grades (or their derivatives, like GPA or class rank), rather than objective measures like test scores. Which is a real problem in an age of grade inflation! Then there’s dual enrollment, which is also filling seats (especially in community colleges) and offering students college credit, but without any of the quality control mechanisms that make programs such as Advanced Placement and International Baccalaureate so rigorous.
Where to go from here
On one level, none of this is surprising. Holding the line on standards is tough work for elected officials, who have not exactly been in an eat-your-broccoli mood lately. It’s all sugary cereal, all the time, as my friend Rick Hess wrote recently. And we’ve created incentives that encourage leaders in K–12 and higher ed to follow the path of least resistance. Superintendents and high school principals crave the positive accountability ratings that come from keeping students on the rolls, even if their engagement in school (or lack thereof) makes them look an awful lot like what we used to call dropouts. And college administrators are desperate to put butts in seats.
It’s also true that the previous paradigm—encapsulated nicely by the ADP—came awfully close to embracing a “college for all” mindset. The argument was that all kids needed the same reading, writing, and math skills in order to succeed in college or career, but that was interpreted as arguing that all students should do more or less the same thing all the way through twelfth grade. And in reality, that has meant almost everyone doing college-prep (if often a watered-down kind), with a dash of CTE sometimes thrown in on the side.
Now the pendulum has swung and policymakers, policy wonks, and the public are closer to agreement that high school students should get to spend more of their time on real career and technical training if that’s what they want. Yet our graduation requirements—especially our course requirements—haven’t yet changed to make this shift easily doable. States such as Indiana are trying to address this but remain mired in old debates about “college or career readiness” versus “college and career readiness.”
So here’s a modest proposal: Let’s get the gang back together again—I’m thinking of a bipartisan group of governors and state education chiefs—and work on figuring out a rational set of high school “pathways” policies going forward, with a particular focus on graduation requirements, à la ADP. My own hope is that they would aim to:
- Ensure that all students master core academic knowledge and skills by the end of the tenth grade. Those who demonstrate such mastery (ideally via end-of-course exams) can then choose among a set of high-quality pathways (more on those below). Those who don’t would keep working on it—but wouldn’t be denied high school diplomas either.
- Allow juniors and seniors who have passed the exams to choose from several pathways, including preparing for selective colleges and universities; preparing for nonselective colleges and universities; preparing for technical programs at the higher education level; or preparing to enter the world of work immediately, including the military.
- Address rampant grade inflation in high schools and beyond.
- Ensure that all postsecondary options (including technical and community colleges) set and enforce admissions standards.
- Work toward closing the funding gap between career and technical education (at the high school and postsecondary levels) and traditional higher education.
This would look a lot like where Maryland’s Kirwan Commission landed, though whether the Old Line State actually succeeds in putting such a system in place remains to be determined.
This still might be too much broccoli eating for today’s political environment. Cynics will say that governors and state education chiefs won’t want to expend political capital trying to fix America’s broken high schools and devalued high school diplomas. So, I say to governors and chiefs, prove them wrong!
How to interpret—and not misinterpret—forthcoming NAEP results
Recently my daughter asked me to describe my job. I had to think for a minute. Statistical interpreter? Results translator? None worked. I needed an elevator pitch, one my quasi-curious teen would understand easily, without too much side eye.
“Every two years, I tell people about how the nation’s school systems are educating America’s students,” I told her. She looked almost impressed, and I congratulated myself for omitting any mention of statistical significance or achievement levels.
Even better, I was able to say that another one of those every-two-years releases is right around the corner.
We’re preparing for the next release of results right now. Mark your calendars for early 2025. In the meantime, I’ll share what I’ll look for in the results and what I’ll avoid when I talk about them. Namely, misNAEPery—a clever term for how the results can be misused and misinterpreted. Kudos to Morgan Polikoff for explaining generic variants of misNAEPery. My purpose here is to share specific pitfalls to avoid with this upcoming release.
Not a Magic 8-Ball. NAEP is not a Magic 8-Ball for policy success or failure. However the NAEP Reading results land, they will not serve as a referendum on whether, for example, the science of reading is real or whether it works. States’ scores in 2024 will invariably differ from 2022 and from 2019, maybe in small ways, maybe in big ways. Regardless, that cannot be taken as proof or disproof of the science of reading.
BPE (Before Pandemic Era). It’s easy to attribute any score decline since 2019 to Covid. Scores fall; blame it on the pandemic. But we saw score declines on NAEP prior to pandemic-related school disruptions. Grade 8 NAEP Reading scores began trending downward in 2015. Think beyond the pandemic.
Stop this ride, please. NAEP results from 2024 should not dictate or derail any policy implementation in 2025. Trends require more than one or two data points, and NAEP can’t be used to evaluate any given policy. Policy implementation takes time, persistence, consistent investment, and patience. A single data point from 2024 should not signal a need to stop, drop, and run, nor a need to bet everything on one specific intervention. See the science of reading issue above.
No to the nyah nyah. If one state’s average scores go up and another state’s go down, that does not justify “I told you so!” claims. A state’s or district’s average score derives from myriad factors, and what rises now may fall in the future. Do look to the scores to see who’s doing well. (Massachusetts inspired changes in education in the early 2000s; Mississippi most recently.) That is a smart way to identify and share promising practices. No gloating, just learning, and knowing that your experience may vary.
Proficient but not proficient. Remember: NAEP achievement levels indicate what percentage of students show “competency in challenging subject matter.” We call that NAEP Proficient (note the italics!) to distinguish it from what states call proficient on their state assessments. They’re not the same thing, despite using the same word. Our definition may seem vague, but we try to clarify it here. Different assessments, different purposes. Comparing proficiency cut-scores across states can be done, but only through NAEP’s state mapping study.
Be real. NAEP scores will differ from previous years’ scores, or maybe they won’t. That’s the nature of assessment. But will we suddenly see that 50 percent of students are NAEP Proficient in math? Probably not. That’s not a spoiler; that’s just realistic. And if results show less than 50 percent at NAEP Proficient, does that mean the education system should be nixed? No. The results just outline the scope of the challenge and underscore the P in NAEP: not performance, but progress.
So far, I’ve focused on what not to do, which is a bummer. Let me pivot to the positive.
No shortcuts. Progress comes from hard work, not miracles, silver bullets, or panaceas. Scores show us broad patterns; we began thinking about issues with reading instruction because of that initial downturn in the average scores nearly ten years ago. NAEP doesn’t point to any easy solution. It tells the story.
Doing their part. It bears repeating: Progress isn’t easy. School administrators, staff, teachers, and students are working hard every day. They know that, and we appreciate that. Nothing in any assessment score refutes that. Period.
A little competition. I warned about not gloating if your state or district appears great on the next report card. But I didn’t say not to engage in some healthy competition. Maryland State Superintendent Carey Wright, formerly state chief in Mississippi, galvanized efforts to improve education in the Magnolia State by refusing to allow the “at least we’re not Mississippi” adage to stand anymore when NAEP scores came out. She appears determined to make progress from her new perch, too, and will no doubt be eyeing how Maryland’s neighbor states are faring.
NAEP as backbone. Very savvy researchers use NAEP data as the backbone for pragmatic tools to understand how states and districts are helping their students progress academically. See, for example, the Education Recovery Scorecard and the New York Times’s school district tool.
Do your part. Prioritize school attendance. Read to your kids at home—even if they’re tweens and teens. Who doesn’t love a good story read to them? Check your audiobook history for proof of that. Encourage your kids to read books at home, or wherever, whenever.
One more don’t: Don’t forget. NAEP is due out in early 2025. Stay tuned.
Knowledge-rich curriculum and direct instruction depend upon each other
Some days, when you’re rocking a seven-day-old infant to sleep at three in the morning, the only thing to do is pick away at instruction manuals in a bleary-eyed daze. So it was that I found myself reading two excellent (if seemingly unrelated) books, Zach Groshell’s Just Tell Them and E.D. Hirsch’s Ratchet Effect over the past few weeks. While tackling discrete elements of education—the former instruction and the latter curriculum—their topics depend upon each other, like the two chemicals of an epoxy resin that remain inert until mixed.
To Fordhamites, Hirsch needs no introduction. He’s a veritable patron saint of education reform, whose 1987 book Cultural Literacy advanced a paradigm-shifting argument: Factual knowledge, not transferable skills, determines academic competence. As such, a core curriculum is perhaps the most important lever for school improvement. Since then, he’s written a number of best-selling books that advance and build upon that fundamental argument.
His most recent offering may not be his strongest, but that’s akin to lamenting Stanley Kubrick’s Spartacus as not his strongest film or that Lou Gehrig only hit a triple. It still outshines most education books.
While much of the book retreads familiar ground about the importance of shared common knowledge both to individuals (allowing them to read and think) and to society more generally (binding us together through shared cultural touchpoints), this book’s unique contribution comes in the title: The Ratchet Effect.
A concept from developmental psychology, the ratchet effect takes its name from ratchet straps, which cannot slip backward once advanced, as an analogy for human progress. As a species, we discover new knowledge and processes, both simple and complex, ranging from how to build simple tools up to and beyond nuclear fission, all of which we can pass along to future generations. As such, year by year, century by century, humanity advances, whereas other species never do, stuck with the same tools, never exceeding sticks for poking and stones for smashing.
This development and transmission of a cumulative cultural knowledge is unique to humanity. We ratchet forward, but monkeys, for example, slip back and must start from the beginning with every generation. Regarding education, this concept implies that schools are then responsible for handing along this body of learned knowledge, like an inherited stack of books.
Reading across Hirsch’s corpus, his various defenses of well-planned curricula remind me of a passage from a theologian discussing the multifarious benefits of civilization:
Thus, if one asked an ordinary intelligent man, on the spur of the moment, “Why do you prefer civilization to savagery?” he would look wildly round at object after object, and would only be able to answer vaguely, “Why, there is that bookcase...and the coals in the coal-scuttle...and pianos...and policemen.” The whole case for civilization is that the case for it is complex. It has done so many things.
As for preferring centralized curriculum to teacher preference or student direction, one looks around and can only answer “well, there’s the literacy and common culture…and patriotism…and coherent educational plans.” Hirsch’s latest emphasis on human advancement is but one more proof in defense of knowledge-rich curriculum. It does so many things.
Interspersed in his discussions of curriculum, Hirsch makes reference to the instructional practices that teachers ought to use to transmit this cultural inheritance—well-structured classrooms, clear explanations, and assessments, for example—but the argument for curriculum doesn’t point to one form of instruction necessarily. Once we’ve identified a body of knowledge, what’s the best method for passing along this information to students? Should they all read it independently in books? Discover it in contrived scenarios? Learn it through projects? Download it onto their brains through a neural implant?
Such questions bring me to my second book, Just Tell Them. Its author, Zach Groshell, an instructional coach, argues that explicit instruction—featuring examples, thorough explanations, modeling, and other structured, teacher-led techniques—is the most effective means of transmitting our collective knowledge from teacher to students.
Perhaps understandably, other books have spent hefty word counts defending direct instruction over and above project-based, student-directed, or inquiry learning without delving into how it’s actually done well. Such polemics are worthwhile, because so many faddish instructional books prattle on about glitzy techniques that teach students nothing. But when it comes to application, teachers have Doug Lemov’s Teach Like a Champion and little else.
Groshell fills that dearth. He does open his book with a brief case for direct instruction but spends the majority of his word count discussing how to do it well. For example, explanations should remain concise and avoid vague terms. Teachers must include both examples and non-examples. While images are beneficial, text overlays alongside diagrams or pictures can overload a student’s working memory. Students learn best when teachers toggle between explanation and practice problems instead of an information dump and extended work time.
At a concise ninety-eight pages, Groshell follows his own advice. It’s a useful primer for parents and policymakers interested in research-based instruction, novice teachers struggling through their first years, or instructional coaches who need a text to lean on.
Returning to my central contention: Knowledge-rich curriculum and direct instruction depend upon each other. A curriculum cannot teach itself. Unfortunately, too many teachers either learn ineffective theories of instruction or dither away their time debating esoteric theories in teacher prep. Conversely, teachers need actual content to fill their instruction. Without a thoughtfully planned, sequenced curriculum, students may encounter dinosaurs in three successive grades without ever learning about cellular biology; meanwhile, teachers waste countless hours crafting handmade curricula and searching for online materials instead of planning effective instructional delivery, adjusting pacing, contacting parents, providing feedback, and other high-yield activities.
Often, we overcomplicate education reform, whiling away hours and ink debating funding formulas, the psychometric validity of this or that assessment, what precisely the accountability measures for failing schools ought to be, and other wonkish questions. Such debates have some value but little connection to the classroom.
But these two books together remind us to focus on two simple levers to improve schools: curriculum and instruction. That is what schools are about after all. Get those right and much else will fall into place as a result.
PISA is wrong about China
AEI’s foremost and very distinguished demographer, my long-ago colleague Nick Eberstadt, joined by several colleagues, has released a devastating analysis/critique of the much-cited OECD assertion that China’s K–12 education performance—based on PISA scores—is better than that of any other country with the possible exception of Singapore.
Although OECD reports, if you read closely, generally make clear that PISA has been administered to students in just a few regions of China[1], not the whole country, that doesn’t stop them from depicting the results as “China’s” or from making statements like this one, taken from an exceptionally bullish 2020 report on Chinese education that OECD prepared jointly with Chinese authorities:
In all four cycles of PISA, Chinese students from these jurisdictions have outperformed the majority of students from other education systems. Even though the participating Chinese jurisdictions do not represent China as a whole, they are still considerably larger than many OECD countries: Beijing, Shanghai, Jiangsu and Zhejiang together are home to over 183 million people, which is more than the combined population of France and Germany.
Note the disingenuousness of that formulation, as if the size of the selected subunits within China that were sampled and tested yielded data as valid as from other nations that happen to be smaller but where the entire population was sampled and tested.
As for how many Chinese fifteen-year-olds actually got tested and where they came from, the AEI paper drily says this:
[T]he PISA sample for China that year involved just 361 schools and twelve thousand students—about thirty-three students per school selected. We do not know the process by which those schools were selected, much less the identities of the schools themselves, or the protocols observed in conducting (and preparing for) the tests. Suffice it to say there would appear to be plenty of scope in this mysterious process for non-random results.
Here’s a bit more boasting from the OECD report itself:
Students in Beijing, Shanghai, Jiangsu and Zhejiang (B-S-J-Z) outperformed their peers in other high-performing countries in all three PISA domains (mathematics, science and reading) by a large margin.... The excellence of student cognitive outcomes in B-S-J-Z (China) can be attributed to teacher and school characteristics to a large degree.
Which leads to displays such as this—from the same report—which imply that it’s the entire country unless you look at the small-font note at the bottom:
When these results get picked up by others, such as the World Population Review, the fine print vanishes and—voila—we seem to be looking at results for China as a whole.
Eberstadt and colleagues undertook a heroic reanalysis that didn’t just take PISA results from China at face value and assume that they’re representative of China. In a project initially carried out for the Defense Department, they engaged in a series of adjustments based on other information about educational performance of students in different parts of China, about student aptitudes (insofar as those can be gleaned from sparse data), and from “qualitative” information about schools, schooling, and demographics (including family structure) in various parts of China, and much more.
Here’s the blunt conclusion of a very sophisticated paper:
We use both quantitative and qualitative evidence to survey the field and emphatically reject the PISA-based assessment that places China first in the world. Instead, we come away with the working hypothesis that the knowledge capital of China’s K–12 population is on par with that of Turkey.
OK, it’s not a conclusion. It’s a “working hypothesis,” and in their almost-sixty pages the authors pay considerable attention to alternative explanations, contrary semi-examples (e.g., the educational performance of Vietnam), and ways in which different assumptions and interpretations—including Eric Hanushek’s careful analyses—yield different results that could be right. But they give what are, to me, reasonably persuasive justifications for their own “hypothesis.”
That said, China, simply because of its mammoth scale, does produce very large numbers both of children with basic skills and of high-fliers. The percentages may not be impressive, but the absolute numbers are large and (if the sampling can be trusted within the regions where it’s done) the percentages are sometimes impressive, too. Among those tested in math in the four Chinese regions in 2018, for instance, some 44 percent scored at PISA’s highest levels (5 and 6), compared with, say, 8 percent in the U.S. and 17 percent in Switzerland. Even if the Chinese data come from a narrow swath of that country’s population, and even if the data are suspect, we can glimpse America’s major competitor in the modern world producing large numbers of individuals with high levels of skill.
Why is this sort of analysis so frustrating as well as worrisome? As the AEI paper explains, and as pretty much every serious independent analyst of Chinese education has noted, the closed, secretive, and thoroughly censored nature of the country itself means that data are scarce, incomplete, and often untrustworthy. So scholars work with what they can get, what they deem reliable or plausible, what they can infer, and what they can bring to bear from other sources.
That’s what Eberstadt and company have done in a wide-ranging examination (that, among other things, sheds much worrisome light on India, a far more open society that has an immense number of children lacking basic skills, even as it has an immense number of college graduates lacking decent jobs). It’s worth attention, in part just to understand the basis for mistrusting what emerges from OECD when the topic is China.
SOURCE: Nicholas Eberstadt, Patrick Norrick, Radek Sabatka, and Peter Van Ness, “Knowledge and Skills in China’s K–12 Population: An Inquiry into ‘Knowledge Capital’ in the PRC,” American Enterprise Institute, Foreign & Defense Policy Working Paper 2024-07, November 2024.
[1] Even within the four regions—all relatively prosperous, relatively well educated, major metropolitan areas in eastern China—not all fifteen-year-olds were in the samples tested by PISA. Participants had to be legitimate residents of those regions and to attend schools chosen to take part in PISA. See this important analysis by Tom Loveless.
In which states do students spend the most and least time in school?
The amount of time students spend in school is a popular lever of change pulled by education policymakers of all types. Longer days or shorter days; four-day weeks; starting school later in the morning; longer years or shorter years. Every permutation has been on the table at one time or another in recent years, with varying success. A study released this spring by Matt Kraft and Sarah Novicoff, and published as an Education Next article last month, takes a look at how much time kids actually spend in school in the U.S. these days. They compare these findings internationally and within U.S. states, while synthesizing what the best studies have to say about time use in education.
First, they identify and summarize findings from seventy-four U.S.-based studies that use credible designs—such as randomized controlled trials, difference-in-differences, and regression discontinuity—and that include effects on test scores, their primary measure. They look at studies pertaining to extending total time in school and then compare the effects of adding that time to the school day versus the school year. They find that total instructional time has moderate to large effects on test scores, with more robust benefits accruing when it is part of a larger strategy to improve instructional quality, such as letting go of underperforming teachers or increasing school-level expenditures. In terms of extending the school year, adding ten or more extra days yields a small positive increase in math and English language arts (ELA) scores. As for lengthening the school day, they see small to medium positive effects on student achievement. The overall patterns are consistent but show diminishing returns to additional time tacked onto the day, with larger increases in time showing larger overall effects in education systems that have fewer total hours to begin with.
Next Kraft and Novicoff dig into international comparisons, using data collected in 2021 by the Organization for Economic Cooperation and Development (OECD), which captures instructional time in elementary schools across OECD member countries. Elementary schools in the U.S. are, on average, providing 1,022 hours of instructional time across 180 days in the school year, equating to 5.7 hours per day. This ranks us near the top of the distribution of instructional time: eighth among thirty-seven countries. The U.S. achieves this ranking through the combination of a relatively long school day (ranked eighth) but a short school year (tied for twenty-fourth). Fifteen countries have school calendars that are at least two weeks longer than the average 180 days in the U.S., including Spain, Greece, and France.
Digging deeper into the American context, as of February 2023, sixteen states mandate both the length of the school year and the number of total hours, while ten states give districts the freedom to meet a minimum number of either days or total hours. Eleven require only a minimum number of days, and thirteen only a minimum number of total hours. This results in vastly different minimum school time requirements based on where kids live.
Among the thirty-seven states that identify a minimum number of days, the majority (twenty-eight) set the bar at 180 days, but it ranges from a low of 160 in Colorado to a high of 186 in Kansas. Thirty-nine states specify a minimum number of hours per year, with high school hours ranging from 720 in Arizona to 1,260 in Texas. High school students in Alabama, Florida, and Connecticut are only required to have 900 hours of school per year, while their peers in Maryland are required to have 1,170. Students attending schools at the 90th percentile of the number of hours per day are in school more than an hour longer than those at the 10th percentile (7.50 versus 6.33 hours). Schools at the 90th percentile for the number of days per year are in session two weeks more than schools at the 10th percentile. Cumulatively, the total number of school hours per year differs by almost 200 hours between schools at the 90th and 10th percentiles—a gap that equates to approximately five and a half weeks of schooling, or more than two full school years over the course of a K–12 career (yikes!).
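The cumulative arithmetic above is easy to check. Here is a minimal back-of-the-envelope sketch in Python: the roughly 200-hour annual gap, the 7.5-hour day, and the 1,022-hour average year come from the figures reported above, while the five-day week and thirteen-year K–12 span are this sketch’s own assumptions.

```python
# Back-of-the-envelope check of the 90th-vs-10th-percentile schooling gap.
# Reported figures: ~200-hour annual gap, 7.5-hour day, 1,022-hour U.S. year.
# Assumptions: five school days per week, thirteen grades (K through 12).

ANNUAL_GAP_HOURS = 200        # 90th vs. 10th percentile, hours per year
HOURS_PER_DAY = 7.5           # a 90th-percentile school day
AVG_HOURS_PER_YEAR = 1022     # U.S. elementary average (OECD, 2021)
K12_YEARS = 13                # kindergarten through grade 12

# Convert the annual gap into weeks of a long school day.
weeks_per_year = ANNUAL_GAP_HOURS / (HOURS_PER_DAY * 5)

# Accumulate the gap over a full K-12 career, in average school years.
career_gap_hours = ANNUAL_GAP_HOURS * K12_YEARS
career_gap_years = career_gap_hours / AVG_HOURS_PER_YEAR

print(f"~{weeks_per_year:.1f} weeks of schooling per year")
print(f"~{career_gap_years:.1f} school years over a K-12 career")
```

Under these assumptions the gap works out to roughly five and a third weeks per year and about two and a half school years over a K–12 career, in line with the figures the study reports.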
Finally, Kraft and Novicoff discuss the marginal returns of raising minimum time in school requirements. Of course, more time in school costs more money and the returns to student achievement eventually taper off. So policymakers should look to increase time in the places where it is most lacking.
And it should go without saying, but I’ll say it anyway: More time ineffectively used is a waste of time. And no one needs that.
SOURCE: Matthew Kraft and Sarah Novicoff, “Time in School: A Conceptual Framework, Synthesis of the Causal Research, and Empirical Exploration,” Annenberg Institute at Brown University (February 2024).
#948: School choice setbacks: Interpreting the referenda losses with Colleen Hroncich
On this week’s Education Gadfly Show podcast, Colleen Hroncich, a policy analyst with the Cato Institute’s Center for Educational Freedom, joins Mike and David to discuss why pro–school choice ballot measures failed in Kentucky, Nebraska, and Colorado—and what it means for the future. Then, on the Research Minute, Adam shares a study examining 100 years of data on elite private and public colleges, revealing persistent gaps in socioeconomic diversity despite changes in racial and geographic representation.
Recommended content:
- Colleen Hroncich and Neal McCluskey, “Referendum Losses Are No Mandate against School Choice,” Real Clear Education (November 8, 2024).
- Juan Perez Jr., “Republicans’ big idea for remaking public education hits voter resistance,” Politico (November 27, 2024).
- Michael McShane, “Op-ed: Despite blows, school choice swept the ballot this election,” Chalkboard News (November 14, 2024).
- Ran Abramitzky, Jennifer K. Kowalski, Santiago Pérez, and Joseph Price, “The G.I. Bill, Standardized Testing, and Socioeconomic Origins of the U.S. Educational Elite Over a Century,” NBER (2024).
Feedback Welcome: Have ideas for improving our podcast? Send them to Stephanie Distler at [email protected].
Cheers and Jeers: December 5, 2024
Cheers
- A survey found that parents are becoming more skeptical of grades as markers of whether their kids are on track, a reasonable shift given rampant grade inflation. —The 74
- A Michigan program exposes participating high schoolers to one of more than a dozen trades, “from graphic design to criminal justice, robotics to cooking.” —NPR
- New Orleans Military & Maritime Academy—a charter school for eighth through twelfth graders—emphasizes “character development and the virtues of citizenship, respect for oneself and others, and patriotism.” —City Journal
- Public charter schools in Colorado are helping to desegregate the state’s school system. —Kevin Hesla, Colorado Sun
- New York City is expanding its FutureReadyNYC career education program to include thirty-six new schools and two new professions. —Chalkbeat New York
Jeers
- The U.S. saw a decline in its ranking on a 2023 international math exam, reflecting schools’ struggle to recover from pandemic-related disruptions. —Wall Street Journal
- A Wisconsin law that limited teachers union influence has been declared unconstitutional by a state appellate court. —Education Week
- Public schools in Newton, Massachusetts, saw disastrous results when the district eliminated tracked classes in a misguided attempt to promote educational equity. —Boston Globe
What we're reading this week: December 5, 2024
- New findings challenge the assumption that obtaining a master’s degree will lead to higher financial returns down the line. —The Economist
- Linda McMahon’s trajectory on education issues reflects the GOP’s shift from federal accountability and intervention to a more culture-war-driven approach in education policy. —The 74
- Excessive government mandates and union control have undermined teacher and principal autonomy, leading to toxic school cultures that prevent successful reform. To improve schools, we must give educators back their agency in the classroom. —Philip K. Howard, Hoover Institution