Poland has been the economic tiger of Europe in recent decades and one of the fastest growing economies in the world over that time. In 1990, when I taught high school in a rural Polish town located in Silesia between Poznan and Wroclaw, Poland’s GDP was less than Ukraine’s. As noted recently by the journalist Anna Gromada, at the fall of the Berlin Wall in late 1989 “the 13:1 per capita gap between Poland and soon-to-be united Germany was twice that between the U.S. and Mexico.”
Between 1990 and 2018, however, Poland’s GDP increased almost eightfold. The Business Leader reported in late May that “the GDP per capita has already surpassed those of Portugal and is on track to beat the Italian (late-2020s) and even French levels (early-2030s).” And the Centre for Economics and Business Research predicts that Poland’s Gross National Income per capita will surpass that of the United Kingdom by 2030.
Many factors lie behind Poland’s impressive economic growth and its emergence as a “global power.” These include its embrace of economic freedom and innovation, robust infrastructure investment, access to European Union markets and investments, really hard-working citizens, and education. The power of education is most interesting to me, going back to when I taught there and saw the Polish commitment to it first-hand.
The author’s Polish teacher credentials from 1990–91.
Mateusz Urban, Senior Economist at Oxford Economics, argues that Poland’s “focus on education and skills development has helped them cultivate a talented workforce, which is a crucial asset in today’s competitive global landscape and I think crucial to their rise as a European powerhouse.”
Over the last thirty-four years, I have visited Poland many times with my Polish wife and our two daughters. In May, we spent three weeks traveling across the country and saw first-hand the ongoing power of education to improve individual lives and a nation’s overall well-being. My wife’s family is large. She has seventeen nieces and nephews, and we met almost all of them and their families. They are the beneficiaries of Poland’s economic growth and educational opportunities.
Most of the next generation speak some English, and a few are completely fluent. They work hard across a variety of professions, including teaching (English!), banking, science, engineering, and trucking. They own homes and travel across Europe. One nephew speaks multiple languages and is completing his bachelor’s degree with honors in Wales. What’s remarkable is that my wife, one of seven children, educated during the Soviet period, is the only one of her siblings who attended college and speaks English.
The story of my wife’s family is very much the modern tale of Poland. Journalist Anna Gromada captured this when she observed, “Since 1989 my family has gone from farm laborers to high achievers.” We saw this transformation when we visited Krakow, chartered in 1257, whose historic center is a UNESCO World Heritage Site.
Yes, Krakow swarms with tourists who spend freely. But, according to the Warsaw Business Journal, “Krakow’s primary assets are its people, namely its well-educated workforce and professionals, which is largely what attracts businesses and investors. Multinational corporations choose Krakow not because it is cheaper, but because of the value of what it offers.”
But education in Poland is not only about creating a dynamic workforce for a world-class economy. Even more important, education is about helping the next generation understand what it means to be a Polish citizen and the responsibilities that come with it. During our travels, we visited historical sites in Gdansk, Warsaw, and Krakow (as well as lesser-known towns like Hel on the Baltic Sea and Zakopane in the Tatra Mountains).
At all these places, we saw schoolchildren in their yellow hats or blue t-shirts visiting historical sites with their teachers and local historians and experts. We listened in on conversations between adults and students in places like the Warsaw Uprising Museum, the Museum of Warsaw, and the Jagiellonian University (founded in 1364 and attended by the likes of Nicolaus Copernicus, Pope John Paul II, and Nobel Prize-winning poet Wislawa Szymborska).
Polish students in Krakow.
Poland has had to fight for its existence over the centuries. As it grows into a dynamic twenty-first-century power, it lies on the free world’s outer edge. It has a 142-mile border with autocratic and hostile Russia in the north, and to the east, it borders Russia’s ally Belarus and war-torn Ukraine. Much of the Western support for Ukraine flows through Poland. Not surprisingly, according to the Warsaw Business Journal, “83 percent of Poles believe that the war in Ukraine poses a threat to Poland’s security.”
The Polish education system is responding to this threat. The BBC reported on June 3 that, “In a school just outside Warsaw, children have been learning survival skills. It’s part of a new program that’s sending soldiers from the Territorial Defense to teach emergency drills in classrooms across the country.”
In 1990, as I struggled to teach English to my Polish students, we would periodically hear the Russian jets based nearby fly over our school. They’d break the sound barrier to rattle our windows and classrooms and maybe our nerves. But we all knew the Russians were in retreat. Fast forward thirty-four years and Poland is a powerful country that is a beacon for what Ukraine wants—freedom, economic opportunity, and better lives for their children. It is also what Russia despises and fears. Education matters, and Poland proves it.
As excitement grows around tutoring as a strategy to combat learning loss, advocates have rightly been encouraged by the growing body of evidence demonstrating the efficacy of tutoring interventions. To date, however, little research has examined the impact of fully virtual tutoring on very young students. Hardly a technicality, this distinction matters because younger children are less likely to have the technical and self-regulation skills upon which virtual learning depends. Now, a new study by researchers from Stanford, Vanderbilt, and UnboundED analyzes the benefits of virtual tutoring specifically for early elementary students.
The authors conducted a randomized controlled trial with 2,085 K–2 students at twelve Texas schools within the same charter network. Students in the sample were randomly assigned to 1:1 tutoring, 2:1 tutoring, or a control group. The tutoring provider, OnYourMark Education (OYM), is a science-of-reading-based virtual tutoring program and a partner of the unnamed charter network. Students in the 1:1 and 2:1 groups participated in in-school, virtual tutoring for twenty-minute periods, four days a week, from September 2022 until May 2023. For their main measure, the researchers compared students’ beginning-of-the-year performance on a widely used exam, the Dynamic Indicators of Basic Early Literacy Skills (DIBELS), to their end-of-the-year performance on the same assessment. The analysis controlled for demographic factors like gender, race and ethnicity, and economic disadvantage, and the authors also broke down their findings to understand OYM’s effects on students with differing baseline performance and in different grade levels.
Overall, the results show that OYM produced statistically significant reading gains for participants. On average, the students who received the OYM treatment improved their scores by 0.05–0.08 standard deviations. Gains were slightly larger for those in the 1:1 group, a finding in line with other research on 1:1 tutoring.
The effect sizes varied by subgroup. Perhaps most notably, students with the poorest baseline scores saw the largest gains (0.11 standard deviations), and this was especially true for the lowest-skill students in the 1:1 group (0.15 standard deviations). Given that this finding was statistically significant, 1:1 virtual tutoring could be a worthwhile intervention for the young readers struggling the most. In the results disaggregated by grade level, first graders saw the greatest reading growth, followed by kindergarteners, followed by second graders. This result probably tells us more about the alignment between the tutoring program and the assessment than it does about student reading skills: OYM focuses on foundational reading skills, whereas by second grade, most assessments place a greater focus on comprehension.
Readers will want to interpret all these findings cautiously. First, the student sample was not entirely random (although the assignment of the groups within the sample was random). Prior to the creation of the sample, staff in the twelve schools each selected ten students who most needed tutoring support; these students then participated in OYM that year but were excluded from the study. This limitation suggests that the study findings may actually have been conservative, as the lowest-skill students tended to gain the most from tutoring, and the students hand-selected by staff for guaranteed participation were likely among the lowest-skilled.
A more serious threat to the study’s internal validity surrounds the participation of students with disabilities and multilingual learners. After students were assigned to the three study groups, numerous students from these groups withdrew from the OYM intervention due to a scheduling conflict with their federally mandated services. As a result, there was moderately high attrition, and the patterns of attrition were not random. To account for this issue, the researchers ran additional calculations for a “preferred sample,” which excluded all 731 multilingual learners and students with disabilities—a large proportion of the study’s sample of 2,085. Still, both samples are somewhat problematic, as the “full” sample suffers from disproportionate attrition, and the smaller sample cannot speak to effects on students with disabilities and multilingual learners. This is especially unfortunate, as the study’s implications would be particularly important for these subgroups, which experience substantial and enduring achievement gaps compared to their peers.
Yet the findings remain encouraging, suggesting that many young readers can benefit from virtual tutoring, a more affordable and often more logistically feasible intervention than in-person tutoring.
Forty-five percent of U.S. public schools report feeling understaffed, 70 percent report that too few candidates are applying to teaching vacancies, and 86 percent report challenges hiring teachers in the 2023–24 school year. In recent years, several states have attempted to address these problems by issuing emergency teaching licenses, expanding alternative certification programs, and pursuing other strategic staffing solutions. A new working paper by Mary E. Laski, a Harvard researcher, analyzes one such solution—a pilot program in which principals hand-select experienced staff members to fill classroom vacancies.
The study focuses on a three-year alternative performance-based licensure (PBL) pilot program implemented in eight Mississippi school districts beginning in 2019. The program allowed principals to use their professional judgment to identify experienced staff members—such as teaching assistants, instructional aides, or para-educators—to fill vacant teaching positions in their schools. Though staff members still had to hold a bachelor’s degree to be promoted, they were not required to pass the state licensing exam before taking on the lead teaching role. In order to participate in the PBL pilot, principals were required both to randomly assign students to different classrooms within each grade level and to identify a comparison teacher at the school for each PBL candidate they promoted. Across the participating districts, 126 educators were promoted to regular teaching positions during the three-year period. These teachers were almost all Black, with a median of seven years of experience at their schools.
The researchers used detailed administrative data provided by the Mississippi Department of Education on both teacher and student performance, including demographic information, teacher observation scores and retention rates, student course schedules, assessment scores, and attendance records. A regression analysis was conducted to compare the observation scores and retention rates of PBL teachers to those of the principal-selected comparison teachers, as well as those of an additional group of teachers working on emergency or provisional licenses in similar schools (i.e., those who would otherwise have filled the vacancies, in the absence of the PBL pilot program). The analysis also compared the effects of PBL teachers on student outcomes with both groups of comparison teachers. All models controlled for previous-year test scores, previous-year absences, and student demographics.
The study found few statistically significant differences between PBL and non-PBL groups across different measures, indicating that PBL teachers performed, on average, at the same level as principal-selected comparison teachers and teachers working on emergency licenses in comparable schools. When differences were statistically significant, they favored PBL teachers. One such difference was that students of PBL teachers were absent about 12 percent less often than students of principal-selected comparison teachers. In addition, PBL teachers’ students performed about 0.2 standard deviations higher than comparable peers on annual state math assessments, and about the same as comparable peers on ELA assessments. PBL teachers also scored about 0.1 points higher than teachers with emergency licenses in comparable schools across most teacher observation standards, and were more likely to continue teaching in subsequent years than comparison teachers.
Naturally, there are some caveats. For example, the author acknowledges that some of the confidence intervals in the analysis are large, making it difficult to rule out the possibility that PBL teachers may be slightly less effective than comparison teachers; however, the magnitude of any such difference in effectiveness would be small. In addition, though initial classroom assignments were random, principals could alter student assignments after randomization if necessary, which means the analysis may not have compared truly random groups of students across PBL and non-PBL classrooms.
Nonetheless, the study’s findings suggest, overall, that PBL programs are a promising approach to filling teaching vacancies without compromising the quality of the education that students receive. The study also indicates that PBL teachers may be, on average, more diverse than comparable groups of teachers, meaning that such programs could help diversify the teacher workforce.
Additionally, the results have important implications for the licensure process for individuals already working within the school system in other capacities. If PBL teachers—who did not pass licensing exams—perform at the same level as comparable teachers who did, then the professional judgment of school leaders may be an acceptable substitute for licensing exams that would otherwise prevent schools from promoting experienced education professionals into vacancies they are struggling to fill with traditionally licensed candidates.
After years of deconstructing their discipline structures, many No Excuses schools are rediscovering the need for strict behavioral codes. —RealClearInvestigations
A school nurse meets parents in the parking lot to determine whether their kids are healthy enough to attend school—curbing absenteeism. —NPR
Louisiana is piloting a new standardized reading test that prioritizes content knowledge and specifies which books and content it will include ahead of time. —Chad Aldeman, The 74
Street Data, a book that is skeptical of using data to analyze schools, reflects a broader distrust of evidence-based decision making. —Steve Rees, Education Next
Analysts continue to suggest that NCLB-era reforms failed despite clear evidence that they boosted student scores. —Jack Jennings, Education Week
A proposed policy in Denver would prohibit the superintendent from using low enrollment or poor test scores as conditions for school closures or consolidations. —Chalkbeat
Education reform is challenging because it’s unclear what “better schools” or “challenge the teachers unions” even means. —Matthew Yglesias, Slow Boring
Young people must be told that high school does not determine their destiny, that whether they were failures or high achievers needn’t shape who they are as adults. —Megan Stack, New York Times
The Tories’ focus on basic literacy and numeracy is paying dividends in UK elections. —Chris Bryant, Bloomberg
Last year, state officials published some troubling data related to Ohio’s teacher workforce. They revealed that fewer young people are entering the profession, teacher attrition rates have risen, and worrying shortages exist in specific grades and subject areas.
Policymakers have since taken some positive steps in response. But those efforts focus primarily on bringing new teachers into the profession. That’s only one part of the equation. Let’s reexamine two previously pitched ideas that could help retain the teachers we already have.
1. Allow districts to pay teachers more flexibly by eliminating mandatory salary schedules based on seniority and credits earned.
Low salaries are often cited as a reason why students steer clear of the teaching profession and why some teachers choose to leave the classroom. Given the importance of teachers to student success, state and local leaders should be looking for ways to boost teacher pay regardless of the retention implications. But absent a sudden (and permanent) influx of cash, it will be difficult to raise salaries across the board. That’s where flexible salary schedules come in.
Under current law, districts must adopt teacher salary schedules based on years of service and training. These systems have the benefit of simplicity, as teachers can predict with some certainty when and how they will receive a raise. But there are also drawbacks. Mandatory schedules typically prescribe low starting salaries to early-career teachers, which can drive young talent—especially teachers with considerable student loan debt—out of the profession. Rigid schedules thwart districts’ ability to reward high performers based on effectiveness or increased responsibility, which could prod our best and brightest into other professions. Salary schedules also make it difficult to bolster specific high-need areas, like special education. The number of students with special needs is rising, and special education is a potential shortage area in Ohio. Offering special education teachers additional pay could help with recruitment and retention, but current law and collective bargaining agreements make that difficult.
Repealing salary-schedule requirements would empower districts to determine how best to pay their instructional teams. State leaders should not make flexible pay mandatory. Schools should be able to stick with traditional rigid schedules if they prefer. But making room for greater flexibility would allow leaders to strategically allocate funds in a way that better serves students and helps recruit and retain staff.
2. Create a refundable tax credit for teachers working in high-poverty schools.
High-poverty schools need effective teachers. But issues such as lower salaries, inadequate support from administration, school culture and student discipline issues, and high turnover rates hinder schools’ ability to recruit and retain talent. Addressing all these problems would require a host of structural reforms, but the state could alleviate concerns about low salaries by creating a tax credit for teachers who work in high-poverty schools. Such a program would allow state leaders to put more money directly into teachers’ pockets without getting tangled up in salary negotiations with local unions and school boards. It would also provide a modest financial incentive for teachers to choose to work in high-need schools.
***
To strengthen Ohio’s teacher workforce, state leaders need to invest in a variety of initiatives. There are plenty of ideas worth considering, but making sure that teachers have more money in their pockets should be at the top of the list.
Editor’s note: This was first published by Forbes.
With graduation season in full swing, many students and their families are deservedly celebrating a major milestone. Successfully completing high school or college is a great accomplishment. But when “Pomp and Circumstance” fades into the background and the graduation parties are over, how do we know whether education has actually provided students with what they need for their futures?
The true marker of success in education isn’t the graduation cap; it’s what happens after the tassel is turned.
Traditionally, our measures of educational outcomes have been pretty basic and not directly related to how graduates are using their education to improve their lives and their livelihood beyond school. Success has largely been measured by the number of students who graduate from high school or a postsecondary program, along with test scores designed to assess their knowledge and skills at a given point in time. These things are undoubtedly important, but graduation rates and test scores alone do not tell the full story of whether education is equipping students with the knowledge and skills necessary to build well-paying careers and actively contribute to their communities.
It’s time to think differently about how we measure success in our education systems—both K–12 and higher education. If the ultimate goal of education is to prepare students for successful futures after they leave our schools, we should prioritize those long-term outcomes.
A new report from Education Strategy Group and American Student Assistance examines how all fifty states are approaching the complexities of measuring long-term success in both K–12 and higher education. Though many states are making good faith efforts to capture data to better understand how education impacts students in the next phase of their lives, the report finds that too few are currently attaching meaningful incentives to reward schools and colleges for improving students’ postsecondary and workforce success.
More specifically, at the K–12 level, while many states now include college and career readiness metrics in their performance goals for high schools, just eight states extend their models to include measures of how high school graduates ultimately fare in college and the workforce. In higher education, only six states use measures of how students do in the workforce after leaving postsecondary institutions to inform funding decisions.
If we want to ensure that K–12 graduates do well in college and careers, and that college graduates, in turn, do well in the labor market, leaders at all levels of the education system need better, timelier and more detailed information about those outcomes. That information also needs to be attached to real incentives—including serving as part of funding and accountability models—to drive change. Leaders can only make changes to improve outcomes if they first have an understanding of what those outcomes are; otherwise, they are flying blind.
Part of the challenge in doing this well lies in the historical siloes that exist between K–12 education, higher education and the workforce. Each sector has developed its own measures and data systems over time, making it difficult to connect them and track a student’s progress and outcomes over time and across sectors.
Fortunately, a few states are leading the way in demonstrating what it looks like to build the infrastructure needed to understand long-term outcomes. Kentucky’s KYSTATS database sets a gold standard for data systems by collecting and integrating education and workforce data to offer policymakers and the public a more complete picture of how the systems connect to one another. This resource is a one-stop shop for understanding how the education to workforce continuum in Kentucky—from K–12 to postsecondary to employment—is serving Kentuckians. As an early leader in this work, Kentucky has offered a blueprint for other states to emulate in building their own systems to understand long-term outcomes.
With better systems in place, more states can and should hold themselves, their schools and their colleges accountable for the outcomes that matter the most for students’ long-term economic well-being. Though there is a long way to go, a select few states are stepping up in meaningful ways.
Vermont is the only state to include employment outcomes in its federal K–12 school accountability system. The state’s Post-Secondary Outcomes indicator measures the percentage of graduates who enroll in college or trade school, enlist in the military, or work full time. By attaching this measure to formal accountability, Vermont plans to hold K–12 schools partially responsible for setting students up for long-term success.
On the financial side, Texas is leading the way by providing incentive funding to both K–12 districts and higher education institutions tied to the long-term measures that matter most. For K–12 districts, the state funds a College, Career, and Military Readiness Outcomes Bonus, through which districts earn additional funding when they increase the number of high school students who enroll in higher education or complete an industry-recognized credential. Similarly, the state’s new outcomes-based funding model for community colleges provides funding based on the number of learners who earn credentials of value, rather than the traditional approach of funding colleges based on the number of students enrolled in classes.
These changes in performance goals and funding levers are making a difference in schools. The financial bonuses for Texas high schools have driven higher participation rates in college and career readiness offerings, and although it’s early, the new funding formula for community colleges is shifting the emphasis toward credential attainment and readiness for careers.
Long-term outcome measures aren’t only important for driving the performance of education institutions. This information needs to be shared with students and their families so that they can make informed choices about which options to pursue.
Some of the states with the best information are taking steps to use it to empower consumers, particularly when it comes to making informed choices about higher education.
Kentucky’s Students’ Right to Know dashboard, which is powered by KYSTATS, is a student-facing tool that allows users to see job projections, salary information, and where programs are offered for different majors. Even more impressively, Kentucky and its neighbors in Ohio, Indiana and Tennessee are addressing the challenge of gathering wage data for college graduates who move out of state by linking data in what they are calling a Multi-State Postsecondary Report.
Colorado, too, is helping students and institutions to understand the potential return on investment in higher education. Students can use the earnings outcomes dashboard to make informed decisions based on projected earnings by institution and major. The state also produces an annual return on investment report and is working to develop a “minimum value threshold” to ensure that institutions are only offering programs that will pay off for students.
While data and measurement might not be the flashiest topics in the education debate, they are among the most foundational. Centering long-term outcomes as core drivers of our educational priorities will help expand economic mobility for all. We need to think differently about success—and do a better job of measuring it—to improve outcomes for every student.
News stories featured in Gadfly Bites may require a paid subscription to read in full. Just sayin’.
Fordham-sponsored charter school Columbus Collegiate Academy – Main is featured heavily in this national piece on the No Excuses model of school operation. There’s a lot here about the origins, concerns, and evolution of the movement, as well as asking whether the idea is ripe for a comeback…in those schools whose adult leaders are able to adapt accordingly. (*Spoiler alert* - they mean charter schools.) At CCA, though, it seems like the “movement” is more about a dedicated and compassionate group of teachers and staff members doing what’s right for their students (“We outscored [the Columbus district schools] because we work hard and stuck with our mission of high expectations even though it is not super popular,” says a wise CCA network leader.), including loosening the rigid rules a little when that promises to help move kids forward more than strict adherence will. Nice. (Real Clear Investigations, 6/10/24)
In the piece above, I was struck by the words of a CCA middle schooler who talked about the difference between her new charter school’s culture and that of her previous middle school. She was bullied at the district school, she said, because she worked hard in class and wanted to be successful, and the teachers and staff there were not helpful in stopping the harassment aimed at her for this work ethic. Having a school where the adults are just as invested in her success as she is makes all the difference for her. I have heard these sentiments before and assumed them to be endemic in schools whose students are not performing well: The kids are fully capable of high achievement, but the culture put in place and reinforced by adults does not support them in pursuing their greatness. I am glad to see from this piece that this is not fully the case. Meet Docille Micomyiza, a graduate of Northland High School. His educational path was rough: he arrived in the United States as a refugee from Rwanda without knowing the language and had difficulty fitting in. Fast forward to today, and he is fluent in English, a high school graduate, a community college student, and a rental property owner earning some money and learning the ropes at the same time. Fantastic! It is a little unclear from this profile just how much his Columbus City Schools high school really helped him (although it is given a lot of credit), because the young man was extremely self-motivated, had some family support, and credits much of his success so far to his religious faith. But at least the district doesn’t appear to have hindered him much. He also took advantage of a lot of district-adjacent outside opportunities, including College Credit Plus (free early college courses), the Columbus Promise program (free tuition at Columbus State Community College for Columbus district students, and only district students—no charter, STEM, or private school grads need apply), and Nationwide Insurance’s internship program (a paid part-time gig in the financial services sector, worked around his Columbus State schedule and available to only a small subset of district high schoolers). So, again: Kudos to a young man leveraging every opportunity he finds, but imagine how much farther this bright star—and how many others just like him—could already be if the entire education system were working in sync with him. There really is no excuse for it not to be. (Columbus Dispatch, 6/11/24)
Let’s start this clip with a reminder: Ohio’s voucher grouchers are really, really mad that the near-universal expansion of EdChoice has, in its first year, attracted mainly families whose children were already in private schools. (This makes sense to me, given the tricky logistics of moving kids from school to school, which I know from experience, but it’s not like the grouchers wouldn’t have found a reason to be mad even if the situation played out differently, right?) To them, I will say: Just wait, gang. Things are only getting started. To wit: Granville Christian Academy is on the grow, buying up a building near its campus to house the new and larger group of K–2 students enrolling for this fall thanks to the expanded availability of vouchers. (Yeah, they actually said that! Out loud and on the record.) (Newark Advocate, 6/11/24)
For more than twenty-five years, Ohio’s public charter schools have served as an educational option for families and students. One of the most routinely debated questions is whether charters provide a superior education when compared to the district alternative. Just prior to the pandemic, Fordham research showed that students attending brick-and-mortar charters in Ohio made significantly greater academic progress than their peers attending nearby district schools.
Conducted by Fordham’s Senior Research Fellow, Dr. Stéphane Lavertu, this research brief provides an updated analysis of brick-and-mortar charter school performance in the years after the pandemic (2021–22 and 2022–23). He finds that, while their advantage has slightly diminished, charters continue to outperform districts in the post-pandemic years.
For more details, download the full report (which includes technical appendixes), or read it below.
Foreword
By Aaron Churchill
Over the past two decades, researchers have spent countless hours studying the impacts of public charter schools—independently run, tuition-free schools of choice that serve some 3.7 million U.S. students today. Just prior to the pandemic, studies from Ohio and nationally indicated that charters on average delivered superior academic outcomes compared to traditional districts. And the very finest charters in Ohio and around the nation were driving learning gains that gave disadvantaged students the edge needed to succeed in college and career.
The pandemic scrambled most everything about K–12 education. But did it upend what we know about charter school performance? The present study, conducted by Fordham’s Senior Research Fellow Dr. Stéphane Lavertu, examines the post-pandemic performance of Ohio’s brick-and-mortar charter schools, which enrolled 81,000 students—mostly from urban communities—during the 2022–23 school year.
Dr. Lavertu’s analysis of state value-added data indicates that the charter school advantage has persisted in the wake of the pandemic. In 2022–23, the average brick-and-mortar charter school delivered the annual equivalent of roughly 13 extra days of learning in English language arts and 9 extra days in math. While the size of the ELA advantage (though not math) has diminished since the pandemic, the average Ohio charter school still outperforms nearby district schools.
There, of course, remains tremendous work ahead to help all students—district and charter alike—achieve at high levels after pandemic-related disruptions. But one thing’s for sure: Supporting—and investing in—high-quality public charter schools remains a strong, evidence-based approach that Ohio should continue to embrace.
Aaron Churchill
Ohio Research Director
Thomas B. Fordham Institute
Introduction
Ohio’s brick-and-mortar charter schools were going strong prior to the pandemic. Rigorous studies demonstrated that student test scores in English language arts (ELA) and mathematics improved significantly when students attended charter schools instead of nearby district schools. Disciplinary incidents and absenteeism also declined significantly. Such results were not mere correlations. These studies used research designs that plausibly unearthed charter schools’ causal impacts on student outcomes. Thus, just prior to the pandemic in the spring of 2020, there was little doubt that Ohio’s brick-and-mortar charter schools were, on average, high-quality alternatives to district schools, particularly for lower-income families residing in our state’s large urban districts.
The pandemic hit the typical school hard, but it hit charters even harder. Charter operators noted that tight labor markets and substantially lower funding (about $5,000 less per pupil than nearby district schools at the time) made it difficult to recruit and retain teachers during these turbulent years. Additionally, the economically disadvantaged students charters primarily serve were disproportionately derailed by the pandemic, in part because of school closures and transportation issues. Research shows that urban district and charter schools serving lower-income students experienced much steeper declines in test scores than the average Ohio public school.
With the pandemic in the rearview mirror, do Ohio’s brick-and-mortar charter schools remain a quality educational option? To answer that question, we need to determine whether the average Ohio charter school student is still learning more during the school year than they would in a nearby district school. Fortunately, Ohio’s publicly available school “growth” measures can help us make such a determination.[1]
This research brief examines how the average achievement growth of Ohio’s charter students compares to that of the average district student. The results reveal that, in terms of student achievement growth, Ohio’s charter schools remain a better educational option for the average charter student. During the 2022–23 school year, the average charter student experienced achievement gains in ELA and math that were approximately 0.02 of a standard deviation greater than students in nearby district schools. That is comparable to the average achievement impact of 0.03 standard deviations associated with increasing school budgets by $1,000 per pupil for four years; but, in the case of Ohio’s brick-and-mortar charters, those gains are realized every one or two years and at no additional cost. Indeed, during the years of this study, charters realized those gains while spending significantly less per pupil than district schools.[2] The estimated charter advantage is roughly equivalent to charter students participating in an additional 13 days of learning in ELA and an additional 9 days of learning in math every school year.
It is important to keep in mind that charter school performance is not what it was in 2019, and the average charter school student experienced significant learning loss during the pandemic. However, the results of this analysis imply that the pandemic-induced achievement gap between higher- and lower-income students would have been even worse without charter schools.
Estimating the impact of Ohio’s charter schools on student achievement
To estimate the causal impact of charter schools, we need to compare their pupils’ learning to the learning of identical students who did not attend charter schools. In other words, we need to make sure that the only relevant difference between charter and district students is the school they attended. It seems like an impossible task, as students who attend charter schools can be quite different from students who remain in district schools. Statewide, charter students are disproportionately economically disadvantaged, and a basic comparison of their test scores to the statewide average doesn’t tell us much about the actual effectiveness of their schools. Comparing students attending charter and non-charter schools in the same district (as opposed to statewide) helps in making apples-to-apples comparisons of their outcomes, as these students are more alike. But significant problems remain. For example, parents who are aware of charter schools and navigate the process of enrolling their children are likely quite different than those who do not. They likely confer knowledge, skills, and other behavioral attributes to their children that lead to higher student achievement, which one might falsely attribute to charter schools when conducting a simple comparison of charter and non-charter students within a district.
Fortunately, research has shown that focusing on achievement gains—what some refer to as student “value added” or “academic growth”—can yield estimates of school effectiveness that are minimally biased.[3] Value-added estimates hold constant a student’s achievement level as of a prior school year, effectively controlling for all differences between students that contributed to different educational outcomes up to that point in their lives. For example, an annual value-added estimate for grade 4 holds constant students’ grade 3 achievement level, which captures what happened in their life that led to their grade 3 achievement, from how well-nourished they were while in the womb, to how much stress they experienced growing up, to the quality of the educational experiences they had through third grade. Research indicates that so long as one focuses on students residing in a similar geographic area, comparing value-added achievement gains between charter and district students should get us close to the impact estimates we would get from an experiment that randomly assigned students to charter and traditional public schools (the “gold standard” for estimating causal impacts).[4]
Ohio’s school-level “growth” measure captures the average test-score gains of schools’ students relative to the average Ohio student. Consequently, one can compare the effectiveness of charter and traditional public schools operating in the same district by comparing these school-level estimates, which are publicly available on the Ohio Department of Education and Workforce (ODEW) website. The primary drawback is that these estimates are scaled differently than those in the academic literature, which makes it difficult to benchmark effect sizes. The appendix describes how I adjusted “gain” estimates (from 2018–19) and “growth” estimates (from 2021–22 and 2022–23) to get the proper scale and render pre- and post-pandemic estimates comparable.[5] It also describes how I used these rescaled estimates to create ELA, math, and science “composite” value-added scores that are comparable to those on Ohio’s school report cards.
The analysis below compares student achievement gains between brick-and-mortar charter and traditional public schools operating in the same district.[6] Note that these estimates are weighted by the number of tested students to account for measurement error. That enables us to speak in terms of the average charter student (as opposed to the average charter school) and approximates the estimates one would get using a student-level dataset. The appendix provides a full accounting of the methods and results that underlie the figures.
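To make this comparison concrete, here is a minimal sketch, in Python, of how such a within-district, enrollment-weighted comparison of school-level growth estimates could be set up. The data frame, column names, and values below are hypothetical placeholders, not the report’s actual data or code, which rely on Ohio’s published value-added measures and the methods detailed in the appendix.

```python
# Minimal sketch (not the report's actual code) of a within-district,
# enrollment-weighted comparison of school-level growth estimates.
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical school-level records: a district identifier, a charter flag,
# a rescaled growth estimate (in student-level standard deviations), and the
# number of tested students used as the weight.
schools = pd.DataFrame({
    "district": ["A", "A", "A", "B", "B", "B"],
    "charter":  [1, 0, 0, 1, 1, 0],
    "growth":   [0.04, 0.01, 0.02, 0.03, -0.01, 0.00],
    "n_tested": [180, 420, 390, 150, 120, 510],
})

# District fixed effects restrict the comparison to charter and district
# schools operating in the same district; weighting by the number of tested
# students approximates a student-level analysis and downweights noisier
# estimates from small schools.
fit = smf.wls("growth ~ charter + C(district)",
              data=schools,
              weights=schools["n_tested"]).fit()

# The coefficient on `charter` is the estimated difference in achievement
# growth between the average charter student and the average district student.
print(fit.params["charter"])
```

In this sketch, the coefficient on the charter indicator plays the role of the charter-versus-district differences plotted in the figures that follow.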
Charter performance during the 2021–22 and 2022–23 school years
Takeaway 1: Brick-and-mortar charter schools continue to yield greater achievement gains than nearby district schools, but their advantage in English language arts is smaller than in 2018–19.
Figure 1 (below) reports the estimated difference in achievement growth between students in charter schools and students attending traditional public schools in the same district. Specifically, it presents “composite” estimates for English language arts (combining results from Ohio state tests in grades 4–8 and the high school ELA 1 and ELA 2 exams), mathematics (combining state tests in grades 4–8 and high school Algebra I, geometry, Integrated Mathematics I, and Integrated Mathematics II), and science (combining state tests in grades 5 and 8 and the high school biology exam).[7] The figure disaggregates estimated effects by subject so that pre- and post-pandemic estimates are comparable. This disaggregation compromises statistical precision and, thus, yields some estimates that do not quite reach conventional levels of significance. But the comparisons are nonetheless informative.
The figure indicates that, compared to their counterparts in district schools, students attending charter schools in 2022–23 had test score gains that were 0.021 of a standard deviation greater on ELA exams. Thus, after a notable dip in 2021–22, a charter advantage reemerged in 2022–23. However, that advantage is less than half the size it was in 2018–19—a statistically significant drop. The results also suggest that after an initial dip in 2021–22, charter schools regained their advantage in math of approximately 0.02 standard deviations in 2022–23.[8] The math estimate narrowly fails to attain conventional levels of statistical significance (hence the empty bar), but the results nonetheless suggest that the charter advantage in math is back to pre-pandemic levels.[9] Finally, charter schools performed comparably to nearby district schools on science exams, both before and after the pandemic.
To validate these results, I re-estimated the models using the composite value-added measures publicly available on Ohio’s school report cards (labeled “effect sizes” on the report cards). These additional analyses confirm the value-added estimates in Figure 1 and reveal that pooling all tested subjects across both post-pandemic years (2021–22 and 2022–23) yields a statistically significant charter school advantage of approximately 0.017 standard deviations.[10]
Figure 1. Differences in achievement gains between charter and district students
Translating these estimated charter school effects into the more intuitive “days’ worth of learning” metric is not straightforward, in part because of the inclusion of high school exams that are not administered in consecutive years. Nevertheless, with this caveat in mind, the estimates for 2022–23 are the equivalent of charter students participating in an additional 13 days of learning in ELA (as compared to 28 days as of 2018–19) and 9 additional days of learning in math (as compared to 10 days as of 2018–19).[11] These are rough approximations, but they provide some intuition about the magnitude of the post-pandemic charter school achievement advantage in ELA and math.
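For readers who want to see the arithmetic behind this conversion, here is a small illustrative sketch based on the method described in endnote [11]. It uses the report’s approximate 2022–23 effect sizes and the Hill et al. benchmarks cited in that endnote; it is an approximation, not the author’s exact calculation.

```python
# Rough days-of-learning conversion (see endnote [11]).
# Typical annual gains from Hill et al.: 0.286 SD in reading, 0.369 SD in math.
INSTRUCTIONAL_DAYS = 180  # typical instructional days in a school year

def days_of_learning(effect_sd: float, typical_annual_gain_sd: float) -> float:
    """Convert an effect size (in student-level SDs) into extra days of learning."""
    return effect_sd / typical_annual_gain_sd * INSTRUCTIONAL_DAYS

print(days_of_learning(0.021, 0.286))  # ELA: roughly 13 extra days
print(days_of_learning(0.020, 0.369))  # math: roughly 9-10 extra days (the report cites ~9)
```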
Takeaway 2: Students in brick-and-mortar charter schools experienced large gains on high school exams (relative to students in nearby district schools), which helped sustain the charter advantage since the pandemic.
The analysis below disaggregates the estimates in Figure 1 by grade band and subject. Figure 2 (below) focuses on student achievement in grades 4–8 and, to maximize statistical power, pools the value-added estimates from the 2021–22 and 2022–23 school years.[12] These estimates capture annual learning gains. For example, during the 2018–19 school year, the average charter school student gained an extra 0.033 of a standard deviation per year in ELA compared to their peers in district schools. The results in Figure 2 indicate that charter schools’ pre-pandemic advantage for students in grades 4–8 was wiped out in 2021–22 and 2022–23. In other words, considering all post-pandemic years together, charter school students in grades 4–8 learned no more (but no less) than their counterparts in district schools.
Figure 2. Differences in achievement gains between charter and district students (grades 4–8)
Figure 3 (below) presents the results for tests that students typically take in high school, including some for which we have no value-added estimates for the 2018–19 school year (biology, government, and history).[13] It indicates that the charter school advantage for the 2021–22 and 2022–23 school years (indicated in Figure 1) is driven predominantly by student achievement on high school exams. Interpreting the effect sizes for high school exams is less straightforward than interpreting the annual achievement gains presented in Figure 2, as the value-added estimates for high school often capture multiple years of learning as opposed to annual achievement gains.[14] The difficulties with comparing the estimates in Figure 2 and Figure 3 do not detract from the primary take-home message, however: The 2018–19 charter advantage in grades 4–8 shifted to an advantage in high school during the 2021–22 and 2022–23 school years. This shift underlies the ostensibly stable charter school impact estimates for mathematics that appear in Figure 1.
Figure 3. Differences in achievement gains between charter and district students (high school)
It is important to reiterate that both charter and district students lost significant ground during the pandemic, and that these analyses compare achievement gains between students in charter and district schools. Thus, the large “increases” in charter school value-added on high school exams are due in part to charter students losing less ground than their counterparts in district schools, as opposed to an absolute improvement in the achievement of charter school students. Similarly, the estimates for grades 4–8 indicate that the annual achievement gains of charter students became identical to those of district students, which is a “decline” in the sense that charters lost the (substantial) advantage in achievement gains their students enjoyed as of 2018–19.
Takeaway 3: Brick-and-mortar charter schools’ achievement advantage in grades 4–8 appears to be on the rebound.
The analysis above pools value-added estimates from the 2021–22 and 2022–23 school years to enhance statistical power. But were there changes in the impact of charters between 2021–22 and 2022–23? Although Ohio’s value-added estimates are generally too imprecise to estimate impacts annually for particular grade bands and subjects, examining the differences in effects between the two post-pandemic school years suggests some interesting patterns. Figure 4 indicates that, from 2021–22 to 2022–23, charter schools’ value-added (relative to district schools) increased by 0.025 of a standard deviation on ELA exams (from -0.008 to 0.017) and 0.022 of a standard deviation on math exams (from -0.020 to 0.002) in grades 4–8. These results contrast with a relative decline from the exceptionally high 2021–22 value-added for high school grades that Figure 5 presents.[15]
It is worth emphasizing that one should not make too much of the estimates in Figures 4 and 5 (below), as they are too imprecise to draw conclusions with statistical confidence (hence the empty bars). For example, the estimate of 0.017 for ELA in grades 4–8 narrowly fails to attain statistical significance; we cannot rule out a substantively significant effect.[16] A charter student with an extra 0.017 of a standard deviation in annual ELA achievement in grades 4–8 would have accumulated an achievement advantage of 0.085 of a standard deviation (5 years x 0.017) by the end of eighth grade. That equates to approximately an additional 49 days’ worth of learning after five years. So, these by-year analyses do not enable us to rule out substantively important charter school effects. Figures 4 and 5 illustrate trends, as opposed to making a conclusive statement about the true impact of charter schools in 2022–23.
Figure 4. Differences in achievement gains between charter and district students (grades 4–8)
Figure 5. Differences in achievement gains between charter and district students (high school)
Summary and implications
The analysis indicates that the average student in Ohio’s brick-and-mortar charter schools continues to learn more than they would in nearby district schools. Indeed, the achievement gains that charters generate in English language arts and mathematics would likely be expensive to realize if, instead of charter schools, we sought to achieve them by increasing funding of district schools. Make no mistake: Students in charter and urban district schools experienced larger achievement declines during the pandemic than students in the average Ohio public school. Charter school students indeed fell behind during this period. However, the results of this analysis imply that the pandemic-induced achievement gap between higher- and lower-income students would have been even worse in the absence of charter schools.
Setting aside pandemic-related learning loss, the results above also serve as a reminder that there is room for improvement in Ohio’s charter sector. Ohio’s charter schools do not provide quite the same achievement advantage over district schools that they did prior to the pandemic.[17] Ohio’s charter schools also remain less effective than those in states such as Massachusetts, New York, and Rhode Island. As of 2019, charter students in those states posted annual achievement gains equal to approximately 41, 75, and 90 extra days’ worth of learning each year, respectively. As I note above, however, as of 2022–23, the charter school advantage in Ohio equals approximately 13 additional days of learning in ELA (as compared to 28 days as of 2018–19) and 9 additional days of learning in math (as compared to 10 days as of 2018–19).
It may be that what has hindered Ohio charter schools is their limited funding compared to nearby district schools, which, among other things, has made it particularly difficult to recruit and retain teachers. Ohio’s substantial funding increases for high-quality charter schools in 2023–24 reduce the funding disparities significantly for these schools. Given what we know about the causal impact of funding increases when the money is spent well, these changes may enable significant improvements in the performance of Ohio’s charter schools.
About the author and acknowledgments
Stéphane Lavertu is a Senior Research Fellow at the Thomas B. Fordham Institute and Professor in the John Glenn College of Public Affairs at The Ohio State University. Any opinions or recommendations are his and do not necessarily represent policy positions or views of the Thomas B. Fordham Institute, the John Glenn College of Public Affairs, or The Ohio State University. He wishes to thank Vlad Kogan for his thoughtful critique and suggestions, as well as Chad Aldis, Aaron Churchill, Chester E. Finn, Jr., and Mike Petrilli for their careful reading and helpful feedback on all aspects of the brief. The ultimate product is entirely his responsibility, and any limitations may very well be due to his failure to address feedback.
Endnotes
[1] The analysis focuses on “site-based” brick-and-mortar charter schools serving “general” or “special” education students. It excludes virtual schools and brick-and-mortar schools focused on dropout prevention and recovery, as there are no district schools comparable to these other types of charter schools.
[2] Ohio significantly increased charter school funding for the 2023–24 school year.
[3] This validation is based on studies that follow students who attended the same elementary schools and, upon reaching the terminal grade of that school, transitioned to charter and traditional-public middle schools. As I discuss below, such a “difference in differences” analysis—which must focus on middle schools—yields 2016-2019 results that are similar in magnitude to those presented in the analysis below.
[4] Within-district comparisons are particularly important in the wake of the pandemic, as recovery from learning loss varies significantly across districts. Indeed, as Table B5 and Table B6 in the appendix reveal, making statewide comparisons suggests that the value-added of charter schools has improved significantly since 2019.
[5] Reassuringly, the estimated effects of attending charter schools (as opposed to nearby traditional public schools) for 2018–19 are similar to those from my 2020 Fordham Institute report, which employs student-level data and, for middle schools, a rigorous “difference in differences” research design that yields plausibly causal estimates of charter school impacts. This correspondence speaks to the validity of SAS’s value-added calculations, which employ all prior years of available student test scores to generate a value-added score.
[6] The analytic sample includes “site-based” brick-and-mortar charter schools serving “general” or “special” education students. The analysis excludes virtual schools and brick-and-mortar schools focused on dropout prevention and recovery.
[7] Although composite estimates include value-added estimates for Integrated Mathematics I, Integrated Mathematics II, and ELA1, there are very few observations for these exams and they have relatively little bearing on the outcomes. The 2019 estimates capture the value-added estimates for schools in operation during the 2018–19 school year, averaging those schools’ value-added estimates across 2016–17, 2017–18, and 2018–19. This is the “three-year” value-added estimate that Ohio makes publicly available.
[8] The composite value-added measures are scaled the same way for pre-pandemic years (2018–19) and post-pandemic years (2021–22 and 2022–23), which enables straightforward pre- and post-pandemic comparisons. Figure 1 reveals a decline of 0.023 of a standard deviation in ELA achievement growth (over a 50 percent drop), but there is no decline in charter schools’ effectiveness (relative to nearby district schools) in mathematics or science.
[9] The estimate reaches statistical significance at the p=0.05 level for a one-tailed test but not a two-tailed test, which is the stricter threshold used in this report. However, given that the purpose of this analysis is to determine whether charter schools continue to outperform nearby district schools, a one-tailed test is arguably the appropriate threshold to apply.
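To see why an estimate can clear a one-tailed threshold but not a two-tailed one, note that for a normal test statistic the one-tailed p-value is half the two-tailed p-value. The snippet below is a minimal illustration using a hypothetical test statistic; it is not drawn from the report’s analysis.

```python
# Minimal illustration (hypothetical z value, not from the report) of how a
# one-tailed test can reach p < 0.05 when the two-tailed test does not.
from scipy.stats import norm

z = 1.8  # hypothetical test statistic
p_two_tailed = 2 * (1 - norm.cdf(abs(z)))  # ~0.072: not significant at 0.05
p_one_tailed = 1 - norm.cdf(z)             # ~0.036: significant at 0.05
print(round(p_two_tailed, 3), round(p_one_tailed, 3))
```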
[10] These results appear in Table B4 of Appendix B. Note that the effect sizes for ELA and math are approximately twice as large as those reported in Figure 1, as the “effect sizes” on Ohio’s school report cards are scaled by the distribution of test-score growth, as opposed to the distribution of test scores. Scaling these estimates by the student achievement distribution reveals an overall charter school advantage of approximately 0.017 student-level standard deviations across all subjects combined, based on Ohio’s overall “composite” value-added measure.
[11] Hill et al. (2007) find that students typically experience annual achievement gains of 0.286 of a standard deviation in reading and 0.369 of a standard deviation in math in grades 4–10. Dividing the estimates in Figure 1 by these typical growth rates yields the equivalent fraction of a school year, which one can multiply by 180 (the typical number of instructional days in a school year) to get days’ worth of additional learning per year.
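As a quick illustration of that conversion: the helper function and subject labels below are mine, and the 0.033 ELA figure is the 2018–19 annual estimate cited in endnote 14, not the full set of Figure 1 results.

```python
# Sketch of the days-of-learning conversion described in endnote 11.
# Typical annual gains are from Hill et al. (2007); the 0.033 ELA estimate
# is the 2018-19 annual figure cited in endnote 14.
TYPICAL_ANNUAL_GAIN = {"reading": 0.286, "math": 0.369}
INSTRUCTIONAL_DAYS = 180  # typical number of instructional days in a school year

def days_of_learning(effect_size: float, subject: str) -> float:
    """Convert an effect size (in student-level SDs) into additional days of learning."""
    return effect_size / TYPICAL_ANNUAL_GAIN[subject] * INSTRUCTIONAL_DAYS

print(round(days_of_learning(0.033, "reading"), 1))  # about 21 additional days of learning
```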
[12] Unlike the two-year composite estimates on Ohio’s school report cards—which pool these years such that 2022–23 gets twice as much weight as 2021–22—the analyses in Figure 2 and Figure 3 weight the 2021–22 and 2022–23 estimates equally. Putting more weight on 2022–23 increases estimated charter school effectiveness relative to district schools.
[13] In the interest of space, the figure omits results for exams that few students took (ELA 1, Integrated Mathematics I, and Integrated Mathematics II) but that are accounted for in the composite estimates.
[14] For example, students who took the ELA2 high school exam in grade 10 during the 2018–19 school year posted scores that were 0.09 of a standard deviation greater than we would have predicted based on their performance as of grade 8. In this case, the (statistically insignificant) impact estimate of 0.09 for ELA2 roughly translates to a value-added estimate of 0.045 standard deviations per year (covering learning in grades 9 and 10). This is comparable to the annual estimated ELA effect for grades 4–8 in 2018–19 of 0.033. In other words, the results in Figures 2 and 3 indicate that, as of the 2018–19 school year, the achievement gains in ELA relative to district students were similarly large in grades 4–8 and high school, even though Figure 3 makes the high school estimates appear much larger.
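The arithmetic in that note can be restated in a few lines; the variable names are mine, and the figures are simply those quoted above.

```python
# Annualizing the high school ELA2 estimate from endnote 14: the estimate spans
# two years of learning (grades 9 and 10), so dividing by two makes it comparable
# to the annual grades 4-8 estimates.
ela2_two_year_estimate = 0.09        # SD gain relative to grade 8 predictions, 2018-19
per_year_equivalent = ela2_two_year_estimate / 2   # 0.045 SD per year
annual_grades_4_8_estimate = 0.033   # 2018-19 annual ELA effect, grades 4-8
print(per_year_equivalent, annual_grades_4_8_estimate)  # 0.045 vs. 0.033
```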
[15] Although the by-year estimates yield primarily statistically insignificant results, the estimated differences between years sometimes attain or nearly attain conventional levels of statistical significance.
[16] It is not significant at the p<0.05 level for a two-tailed test, but it attains statistical significance for a one-tailed test.
[17] Within-district comparisons are particularly important in the wake of the pandemic, as recovery from learning loss varies significantly across districts. As Table B5 and Table B6 in the appendix reveal, making statewide comparisons suggests that the value-added of charter schools has improved significantly since 2019. However, that might be due to the intensive efforts to address learning loss in districts where charter students attend school.
Over the last few years, Ohio leaders have focused on improving education-to-workforce pathways through a variety of initiatives and funding. Thus far, there’s been plenty of commendable progress. But significant challenges remain.
For ideas on how to overcome these challenges, state leaders should consider the State Opportunity Index that was recently released by the Strada Education Foundation, a national organization that works to strengthen the link between post-secondary education and opportunity. The index was designed to provide states with a quantifiable set of indicators they can use to assess how well they’re leveraging post-high school education—not just degrees, but certificates and other credentials, too. It establishes a baseline for states in five priority areas: clear outcomes, quality coaching, affordability, work-based learning, and employer alignment. States’ progress within each of these areas is categorized as leading, advanced, developing, or foundational.
How did Ohio fare on the index’s ratings? Not well. The state earned its highest rating—advanced—in the clear outcomes category. The work-based learning and employer alignment categories were rated as developing. Meanwhile, the quality coaching and affordability measures ranked the worst, with bottom-of-the-barrel ratings of foundational.
To be clear, most of the data used by the index focus on college graduates. But there are useful insights for Ohio’s K–12 sector. In fact, there are some especially useful data points in the clear outcomes area, where Strada identified ten key education-to-employment data system elements, evaluated states’ progress on each element based on survey responses and an extensive review of publicly available information, and then assigned states a rating for each element, as well as an overall rating. In terms of individual ratings for each element, Ohio did particularly well—earning the highest available score of leading—in three areas: partnerships for outcomes data outside the state, interactive resources, and researcher access. But the state also earned low- or bottom-level ratings on three other elements. Here’s a look at each, and what Ohio can do to improve.
1. Integrating high school completion and employment data
This element evaluates how well states integrate and deliver information on learners’ earnings and employment after high school completion, as well as over time. The majority of states (twenty-nine) fell into the bottom two rating categories. Ohio was one of six states to earn a rating of developing, indicating that it is currently implementing efforts to integrate information on earnings and employment after high school. The index notes that these efforts are being carried out through the Coleridge Initiative, a nonprofit that works with government entities to ensure that data are used effectively in public decision-making. The Ohio Department of Higher Education and the Ohio Education Research Center are working together to expand the Multi-State Post-secondary Dashboard (which is powered by the Kentucky Center for Statistics) to track high school and non-degree employment outcomes in Ohio.
Although Ohio earned a developing rating in this area, Ohioans have cause for optimism. State leaders and agencies are already working on tracking these data. Going forward, leaders should focus on accomplishing two goals. First, these outcomes data should be published and easily accessible to the public, as well as disaggregated by demographics to ensure transparency. Second, state and local leaders should consistently use these data to drive policy decisions.
2. Open data files
This element determines whether states provide comprehensive and timely open data files containing “anonymized education-to-opportunity statistics” that anyone can access, download, and use. The index notes that these offerings should include downloadable databases, clear explanations of the resources, and data dictionaries. Public databases that contain only aggregate information—meaning individual learners can’t be identified from the data—would allow researchers, policy organizations, and members of the public to easily access information for analysis and reporting.
Twenty-one states earned a foundational rating on this element, either because their open data files contain only enrollment and completion metrics and no employment outcomes, or because they have no open data that Strada could identify. Ohio was one of the twenty-one, with the index noting that “no evidence was identified.” To improve, Ohio should look to the seven states that earned the index’s top rating. Colorado, for example, offers customizable open data files that contain a variety of disaggregated education-to-opportunity statistics. Kentucky, meanwhile, offers open data files containing disaggregated education-to-employment outcomes by program for public four-year universities, community colleges, non-degree credentials, and high school. It also provides files on apprenticeships, adult education, and career and technical education.
3. Dedicated insights capacity
This element evaluates whether states have designated a unit with responsibility and dedicated full-time capacity for generating education-to-employment insights. To earn the top rating, states must have a unit that meets four criteria:
It is a centralized, authoritative source designated by the state for education-to-employment insights;
It has publicly available resources, reports, or tools available for stakeholders;
It has dedicated staff; and
There is evidence of partnerships with higher education, workforce development, and economic development.
Only seven states—Arkansas, Colorado, Kentucky, Maryland, Mississippi, Nebraska, and Virginia—met all four of these criteria. The index found that Ohio, which was assigned the lowest rating of foundational, had “no evidence” of demonstrating this element.
When it comes to education-to-career pathways, Ohio leaders would be wise to remember the adage that Rome wasn’t built in a day. The Buckeye State has made considerable progress over the last few years. But there’s still plenty of work to be done. Some areas, like integrating high school completion and employment data, only need to be shepherded across the finish line. Others—like providing open data files and dedicated insights capacity—will require considerably more effort and investment. If state and local leaders are committed to continuing Ohio’s growth, Strada’s State Opportunity Index is worth a close look.