Today, the National Alliance for Public Charter Schools and the Thomas B. Fordham Institute honored Ohio Lt. Governor Jon Husted with the 2024 Champion for Charter Schools Award. This annual award honors bipartisan state policymakers who are making an exceptional difference for charter school families and communities in their state and who go above and beyond to shape students’ futures for the better.
“Educational freedom is an American aspiration. No school can serve the unique needs of every child, and no child should have their educational options limited by their zip code or financial circumstances,” said Lt. Governor Jon Husted. “I am honored to receive this award, and I will never stop fighting for families to have the ability to choose where they educate their children.”
“Lt. Governor Husted has championed charter schools for more than two decades,” said Chad L. Aldis, Vice President for Ohio Policy at the Thomas B. Fordham Institute. “He’s been an unwavering voice for empowering families with high-quality public-school options. During the last state budget, the DeWine/Husted administration led the way by championing a $3,000 per pupil funding stream for low-income students attending high quality public charter schools. This historic investment will further strengthen Ohio’s charter schools, so that they can effectively serve students for generations to come.”
“It is imperative for lawmakers at all levels of government to support policies to create and sustain high-quality charter schools if they want to make a difference for students and the future of our country. The National Alliance is grateful for the ways Lt. Governor Husted and this year’s other charter champions have been responsive to the students and families they serve by promoting more public school options, robust flexibility and accountability for those options, and increased funding and resources for these unique public schools. Without Lt. Governor Husted’s support, Ohio’s charter school movement would not be where it is today. He has fought for our students and deserves our gratitude and praise,” said Todd Ziebarth, Senior Vice President for Advocacy and State Support at the National Alliance for Public Charter Schools.
For too long, Ohio underfunded its public charter schools. That policy was unfair to charter school students—many economically disadvantaged—whose educations received less taxpayer support simply by virtue of their choice of schools. It was also unfair to charter schools, which were required to serve children on fewer dollars than the districts nearby.
Thanks to the leadership of Governor DeWine, Lieutenant Governor Husted, and the General Assembly, Ohio has recently made significant strides in narrowing the charter funding gap. One of the most critical initiatives is the Quality Community School Support Fund. Since FY 2020, this program has provided supplemental aid to quality charter schools—currently, in the amount of $3,000 per economically disadvantaged pupil ($2,250 per non-disadvantaged).
Our latest report is an evaluation of the high-quality charter funding program. It finds positive results: The additional dollars have allowed charters to boost their teachers’ salaries, reduced staffing turnover, and driven student learning gains.
Under the bold leadership of Governor Mike DeWine and Lieutenant Governor Jon Husted, Ohio lawmakers enacted the Quality Community Schools Support Fund (QCSSF) in 2019. The program—the first of its kind in the nation—provides additional state dollars to support high-performing public charter schools (also known as “community schools” in Ohio). From FY20 to FY23, between 20 and 40 percent of Ohio’s 320-some charters met the state’s performance criteria and received an extra $980 to $1,600 per pupil, depending on the year. Lawmakers further increased program funding in last year’s state budget, starting in FY24, though it is too early to study the effects of that boost.
QCSSF helps address longstanding funding gaps faced by Ohio charters, which have historically received about 30 percent less taxpayer support than nearby districts. This shortfall has exposed charters to poaching by better-funded districts that can attract teachers via superior pay. It has also limited charters’ capacity to provide extra supports for students, most of whom are economically disadvantaged and could use supplemental services such as tutoring. The gap has also required charters—even high-performers—to operate on a shoestring, leaving them little room in their budgets for expansion. Besides limiting these schools in practical ways, underfunding charter students’ educations by virtue of their choice of public school is simply unfair.
We at the Thomas B. Fordham Institute have proudly advocated for QCSSF. To be sure, even with the most recent boost in funding, it still doesn’t quite achieve the ambition of full funding equity for all charter students. But it does represent a big step forward. Given our support for the program, it might be surprising to see our interest in also evaluating it. Why put it under the research microscope and risk dinging it?
For starters, while you’ll often find us backing choice-friendly initiatives, we also seek programs that work for students. To that end, Fordham has commissioned studies that shed light on policies we broadly support (evaluations of private school scholarships and inter-district open enrollment being two examples). Sometimes the results are sobering—and have led us to pursue course corrections—and at other times, they’ve provided encouragement to press onward. In sum, we remain committed to rigorous analysis that helps policymakers and the public understand how education initiatives function and how they impact students.
We are also mindful that initiatives lacking follow-up research and solid evaluation are more susceptible to the chopping block. Consider one example. Back in 2013, Ohio lawmakers launched a brand-new, $300 million “Straight A” fund that aimed to spur innovative practices. The program was initially greeted with enthusiasm, but just five years later it had vanished. One likely factor is that no one got under the hood to study the program, which allowed skeptics to more easily cast doubt on its efficacy. Without evidence to guide their decision-making, policymakers may well have assumed it wasn’t working and chosen to pull the plug whether or not it was actually boosting achievement.
With this in mind, we sought to investigate—as soon as practically possible—whether QCSSF is achieving its aims. Conducted by Fordham senior research fellow and Ohio State University professor Stéphane Lavertu, this analysis examines the program’s impacts on qualifying charter schools’ staffing inputs and academic outcomes. The study focuses on data from 2021–22 and 2022–23—years three and four of program implementation—as the prior two years were disrupted by the pandemic. To provide the clearest possible look at the causal impacts of the extra dollars provided by QCSSF, Dr. Lavertu relies on a rigorous “regression discontinuity” statistical method that compares charters that narrowly qualified for QCSSF to charters that just missed meeting the performance-based criteria.
We learn two main things about the program:
First, charter schools spent the supplemental funds in the classroom, most notably to boost teacher pay. As noted earlier, teacher salaries in charter schools have historically lagged. However, with the additional QCSSF dollars, qualifying charters were able to raise teacher pay by an impressive $8,276 per year on average. This allowed schools to retain more of their instructional staff, as indicated by a reduction in the number of first-year teachers as a percentage of their overall teaching staffs.
Second, students attending qualifying schools made greater academic progress in math and reading than their counterparts attending non-QCSSF charters. Based on an analysis of the state’s value-added scores—a measure of pupil academic growth on state assessments—Dr. Lavertu’s most conservative estimates indicate that the supplemental dollars led to additional annual learning that is equivalent to twelve and fourteen extra days in math and reading, respectively. In addition to these achievement effects, he also finds that QCSSF reduced chronic absenteeism by 5.5 percentage points.
Various factors could have driven the academic results, including the ability to implement high-quality curricula or tutoring programs with the QCSSF dollars. But we suspect a particularly strong connection between the reduced teacher turnover and the positive outcomes. Provided that turnover is not caused by efforts to remove low-performing instructors, studies indicate that high levels of teacher turnover—something Ohio charters have struggled with because of their lower funding levels—hurt pupil achievement. The QCSSF dollars, however, have allowed charters to pay teachers more competitive wages and retain their talented instructional staff—thus helping to improve achievement.
Leaders of high-performing charters (including those sponsored by our sister organization, the Fordham Foundation) agree that the additional resources have proven pivotal. Andrew Boy, who leads the United Schools Network in Columbus, said, “Since the addition of the QCSSF, we’ve been able to vastly improve teacher pay to attract and retain more effective, experienced staff, which allowed us to serve more students and deliver on our mission.” Dave Taylor, superintendent of Dayton Early College Academy (DECA), said, “We have used these funds to address key strategic areas of focus: increasing teacher compensation, ensuring we have high quality instructional materials in all classrooms, and taking control of our students’ transportation. The QCSSF has truly been instrumental in DECA’s ability not only to weather the challenges of the pandemic but to be well positioned to serve students well long into the future.”
The findings from this report should encourage Ohio policymakers to keep pressing for improved charter funding. They should also ease concerns voiced by skeptics that charters would not put these additional dollars to good use (“obscene profits on the backs of students is the charter standard,” one bombastic critic said last year). Au contraire: The charters that qualified for the QCSSF program used the funds primarily to boost teacher pay, and the result was that students benefited. Isn’t that something we can all cheer?
Fordham remains committed to fair funding for charter students and to shining light on the QCSSF program. Indeed, we intend to come back in the next few years with updated evidence on the program. But for now, one can confidently say that the program has so far worked as intended and driven improvement in Ohio charter schools.
Executive Summary
Governor DeWine and the Ohio General Assembly established the Quality Community School Support Fund (QCSSF) in summer 2019. The program has since provided supplemental per-pupil funding to brick-and-mortar charter schools (called “community schools” in Ohio) deemed to be high quality and that primarily serve economically disadvantaged students. Most participating schools qualify based on a requirement that their students (1) have higher test scores than other students in the same district and (2) demonstrate greater test-score growth than the average student in the state. During the 2021–22 and 2022–23 school years, 43 percent of brick-and-mortar charter schools qualified for and received an average payment of nearly $1,500 per pupil in supplemental funding.[1] The purpose of this study is to estimate the impact of these payments on eligible charter schools and their students.
The analysis employs a statistical method that enables one to assess the causal impact of public policies. Specifically, it employs a “regression discontinuity” (RD) design that compares the outcomes of schools that narrowly met QCSSF program requirements to the outcomes of schools that narrowly failed to meet those requirements. The RD design’s assumption, which this analysis tests, is that schools near the performance cutoff were nearly identical except that some went on to receive extra funding and others did not. The study’s reliance on school-level outcome data available for only two post-program years (2021–22 and 2022–23) limits its scope and the precision of the impact estimates. The RD method’s focus on charter schools near the program’s cutoff (schools with approximately average achievement among charters) also means the results may not generalize to all charter schools. Nevertheless, this study’s estimates are plausibly causal and the best evidence available of the QCSSF program’s effects.
The analysis indicates that charter schools receiving a QCSSF award in 2021–22 and 2022–23 spent the extra funds on classroom activities—notably on teachers—as opposed to administration or other nonclassroom functions such as transportation or food service. The main estimates indicate that schools receiving the extra funds had higher teacher salaries (by over $8,000), lower teacher turnover (by approximately 25 percent), and nearly four fewer students per teacher than they would have had without the extra funds. These budgetary and staffing impacts, in turn, correspond to more student learning in English language arts (ELA) and mathematics (lower-bound estimates indicate an additional two to three weeks’ worth of additional learning each year, though the main estimates are far larger) and lower rates of chronic absenteeism (by five to seven percentage points) relative to charter schools that narrowly missed qualifying for the program. Put differently, the results indicate that the funding enabled qualifying charter schools to substantially mitigate pandemic-era achievement and attendance declines among their primarily low-income students, as schools receiving QCSSF funding moved from approximately the fiftieth to the sixtieth percentile in the charter-school test-score and attendance-rate distributions.
This study’s estimates are imprecise and capture rough approximations. The strength of the research design, however, makes it reasonable to conclude that QCSSF funding led to increases in teacher pay—stabilizing charter schools’ teaching staff—and mitigated steep declines in student achievement and attendance. Indeed, this study’s most conservative estimates indicate that the returns on every dollar spent (as measured by student achievement growth) are justifiable, even setting aside benefits that might accrue from improved attendance and other behavioral outcomes. Generating precise estimates of spending efficiency requires more data, but this study indicates that, as of the 2022–23 school year, the QCSSF was likely an efficient use of state funds.
Quality Community School Support Fund
Governor Mike DeWine proposed the Quality Community School Support Fund (QCSSF) in the spring of 2019 as a part of his first state budget, and the Ohio General Assembly enacted the program later that summer. Under this initiative, qualifying brick-and-mortar (“site-based”) charter schools receive supplemental per-pupil funding if they meet the program’s academic performance standards. Between the 2019–20 and 2022–23 school years, which are the focus of this study, schools received up to $1,750 per economically disadvantaged student and up to $1,000 per noneconomically disadvantaged student, depending on the availability of funds. They were guaranteed at least three years of funding from the time they qualified, though subsequent Covid-related legislation provided schools funding through 2022–23 (a fourth year) if they first qualified for the award in 2019–20.[2] When the pandemic ended, additional schools became eligible for funding in 2022–23 based primarily on their academic performance in 2018–19 and 2021–22. QCSSF awards increased substantially starting in 2023–24, but most school data for that year are not yet available.
Table 1. Counts of schools that qualified[3] for QCSSF
Table 1 presents a breakdown of schools that received QCSSF awards. As the table indicates, the share of brick-and-mortar charter schools receiving QCSSF funding increased from 25 percent in 2019–20 (61 of 240 eligible schools) to 45 percent in 2022–23 (113 of 251 eligible schools). During that span, approximately 75 percent of schools initially qualified based on Criterion 1. This criterion requires that at least 50 percent of their students are economically disadvantaged, that their sponsors[4] were rated “exemplary” or “effective” in their most recent state evaluation, and that schools meet the following academic targets: (1) a higher “performance index” score than the district in which they are located, based on their two most recent report cards, and (2) a value-added rating of four or five stars (A or B in earlier years) based on their most recent report card.
New startup schools lacking a performance history also qualified if their sponsors were rated “effective” or “exemplary” and they were replicating a school model deemed effective (Criterion 2) or if they had received a grant from the federal Charter Schools Program or their operator had demonstrated effectiveness in another state (Criterion 3). Approximately 25 percent of schools initially qualified based on these criteria. Finally, House Bill 583 of the 134th General Assembly—in response to the pandemic—extended a fourth year of funding (in 2022–23 only) for schools that qualified for funding in 2019–20 and did not meet the academic criteria in subsequent years.
Schools that qualify for funding primarily serve students from low-income households in urban districts. As Table 2 indicates, as of 2022–23, the average charter school that received a QCSSF award had 340 students, 94 percent of whom were economically disadvantaged. Sixteen percent of students were also identified as having a disability (14 percent if one omits schools that primarily serve students with disabilities), and 11 percent were identified as English learners.
Table 2. Characteristics of schools that qualified for QCSSF awards
QCSSF awards depend on the availability of funds, which is based on the program appropriation in the state budget and the enrollments of schools that qualify. As Table 3 (below) reveals, average payments ranged from $985 per pupil for 2020–21 to $1,598 for 2021–22. Although inflation has negatively impacted the buying power of these payments, they still constitute a substantial share of awardees’ budgets. For example, the average payment of $1,375 in 2023 accounted for approximately 9 percent of the $15,635 in spending per pupil in the average school receiving an award (a total that includes temporary federal pandemic relief funds, nonoperating expenditures, and large supplemental funding in two special education schools that serve autistic children). That is down from 13 percent of total spending in 2019–20 but still accounts for a large share of total spending.
Table 3. School expenditures and QCSSF payments adjusted for inflation
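The share-of-budget claim above follows directly from the per-pupil figures reported in the text. A quick back-of-envelope check, using only those reported numbers:

```python
# Rough check of the budget-share arithmetic reported above.
# Both figures come from the text (2023 dollars); the calculation is illustrative.
average_payment = 1375        # average QCSSF payment per pupil in 2023
spending_per_pupil = 15635    # total spending per pupil in the average awardee school

share = average_payment / spending_per_pupil
print(f"QCSSF share of per-pupil spending: {share:.1%}")  # roughly 9 percent
```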
Research on the impact of public school expenditures indicates that approximately $1,500 in supplemental funding should, on average, lead to test-score increases of approximately 0.05 of a standard deviation after four years.[5] Although that is a modest average effect, effect sizes vary significantly and depend on the interventions funded. Brick-and-mortar charter schools have been an unusually efficient educational option for Ohio students, and they primarily serve low-income students who stand to gain disproportionately from attending a high-quality school—particularly in the wake of the pandemic. Consequently, one might expect $1,500 in QCSSF funding to have larger achievement benefits than the modest 0.05 standard deviations found in the average study.
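The “days of learning” framing used throughout this report rests on a standard conversion from standard-deviation effect sizes. A hedged sketch of that conversion (the 0.30 standard deviations of typical annual growth and the 180-day school year are illustrative benchmarks, not figures from this study; actual benchmarks vary by grade and subject):

```python
# Convert a test-score effect size (in standard deviations) into an
# approximate number of extra school days of learning.
# ASSUMPTION: a typical student gains about 0.30 SD per 180-day school year.
ANNUAL_GROWTH_SD = 0.30
SCHOOL_YEAR_DAYS = 180

def effect_to_days(effect_sd: float) -> float:
    """Translate an effect size into an equivalent number of school days."""
    return effect_sd / ANNUAL_GROWTH_SD * SCHOOL_YEAR_DAYS

# The literature's average: ~$1,500 extra per pupil -> ~0.05 SD after four years
print(round(effect_to_days(0.05)))  # -> 30 days under these assumptions
```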
Although such increases in annual spending could have noteworthy effects on charter students’ academic performance, how the money is disbursed may prevent charter schools from fully realizing such gains. One reason is that awards are made during the latter half of the school year. For example, awards for the 2022–23 school year were made at the start of the 2023 calendar year. Thus, charters may not be able to make the best use of the money until their second or third years of funding. Additionally, charters receiving the award across all years, from 2019–20 through 2022–23, may have come to count on it. If they anticipated losing the award in 2023–24 due to lackluster performance or a reduction in the share of students who are economically disadvantaged, they might have set the extra money aside instead of spending it immediately. These possibilities could limit the impact of QCSSF awards as well as this study’s ability to detect that impact.
Study Design and Validity
This study evaluates the impact of QCSSF awards on the effectiveness of Ohio charter schools during the 2021–22 and 2022–23 school years—the first post-pandemic years for which necessary test-based outcome measures are available.[6] Specifically, the analysis estimates the causal impact of QCSSF awards on charter schools’ spending, staffing, and academic performance using publicly available data from the Ohio Department of Education and Workforce (ODEW) website.[7] There are two primary challenges with estimating the causal impact of the program. First, by definition, the typical charter school receiving the award is different from the typical charter school not receiving it, making simple comparisons of their performance problematic. Second, using school-level data to examine changes in performance might make it appear as if QCSSF schools became more effective when, in fact, receiving the award merely enabled them to attract higher-achieving students. This study’s design addresses both concerns.
The key to estimating the QCSSF program’s causal impact is to take advantage of the fact that awards are largely made based on schools meeting a strict performance cutoff (Criterion 1 discussed above). Schools very close to that performance cutoff should be nearly identical except that, by random chance, some find themselves above the performance requirement while others find themselves below it. If this testable assumption holds, one can use a “regression discontinuity” (RD) design to compare changes in the administration and performance of schools that find themselves just above or below QCSSF’s performance requirement. This design is widely considered among the strongest evaluation designs, provided that its assumptions and data requirements are met.[8]
To implement an RD design in this context, I first limit the sample to eligible charter schools with student bodies that were at least 50 percent economically disadvantaged and sponsors that were rated “exemplary” or “effective” according to their latest evaluation. I then identify how close schools were to meeting the performance requirements to receive funds in 2021–22 and 2022–23 based on Criterion 1.[9] Focusing on schools close to the performance threshold, I compare changes in their expenditures, staffing, and student outcomes (from 2018–19, just before the first award, to 2021–22 and 2022–23) between schools that did and did not qualify for the award based on Criterion 1.[10] Importantly, schools that narrowly qualified for an award were similar to those that narrowly failed to qualify as of 2018–19, just before the establishment of the QCSSF program.[11] Schools that narrowly failed to qualify for an award (based on their performance in 2018–19 or earlier) also were not statistically more likely to close prior to the 2022–23 school year than those that received the award.[12] These validity tests indicate that the RD design should provide credible estimates of the causal impact of awarding QCSSF funding to charter schools near the performance cutoff.
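The comparison described above can be sketched with simulated data. The following is a minimal illustration of a sharp regression discontinuity estimate, not the study’s actual model (which adds district fixed effects and baseline controls); all numbers here are simulated:

```python
# Minimal sharp regression discontinuity (RD) sketch on simulated school data.
# The running variable is a school's distance from the QCSSF performance
# cutoff; treatment (funding) switches on at zero.
import numpy as np

rng = np.random.default_rng(0)
n = 400
distance = rng.uniform(-2, 2, n)        # distance from performance cutoff (SD units)
funded = (distance >= 0).astype(float)  # schools at/above the cutoff get the award
# Outcome: smooth trend in the running variable plus a true jump of 0.25 at the cutoff
outcome = 0.4 * distance + 0.25 * funded + rng.normal(0, 0.3, n)

# Local linear regression within a bandwidth, slopes allowed to differ by side
bandwidth = 1.5
mask = np.abs(distance) <= bandwidth
X = np.column_stack([
    np.ones(mask.sum()),                 # intercept
    funded[mask],                        # jump at the cutoff (the RD estimate)
    distance[mask],                      # slope below the cutoff
    funded[mask] * distance[mask],       # slope change above the cutoff
])
beta, *_ = np.linalg.lstsq(X, outcome[mask], rcond=None)
print(f"Estimated jump at cutoff: {beta[1]:.2f}")  # recovers a value near the true 0.25
```

The key design feature the sketch illustrates is that only the discontinuity at zero identifies the effect; the smooth relationship between performance and outcomes on either side is absorbed by the slope terms.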
Because the analysis employs school-level data, it is important to also assess whether estimated effects are due to more learning or changes in student composition. Fortunately, qualifying for QCSSF funding appears to have no effect on the characteristics of schools’ students, such as total enrollments or the percent who are economically disadvantaged, have a documented disability, or have limited English proficiency.[13] Correspondingly, as the analysis below reveals, controlling for a school’s student characteristics does not change the estimates—it merely makes those estimates more precise (i.e., more statistically significant), which is what one would expect if the estimated impact of QCSSF funding is due to student-level improvements in achievement and attendance as opposed to changes in schools’ student composition. Moreover, the primary outcome in the analysis is school-level “value added” to student achievement, which is meant to isolate the impact of schools on student learning—holding constant students’ test scores from prior years. This measure should not be affected by school composition changes.[14] Unfortunately, no such value-added measures are available for school-level rates of absenteeism, and there is no way to estimate accumulated student-level test-score gains from 2018–19 to 2022–23 (across all four years of the QCSSF program). The value-added measures can only capture annual achievement growth that occurred during the 2021–22 and 2022–23 school years, respectively.[15]
The analysis provides further confidence in the RD design because the estimated impacts are generally insensitive to narrowing or expanding the sample of schools used in the analysis. Specifically, the estimated effects of QCSSF are similar in magnitude whether one limits the sample to schools within 0.3 standard deviations of the performance threshold (including only schools very close to the threshold) or expands the sample to schools within two standard deviations of the threshold (a bandwidth which includes all but one of the schools that qualify for a QCSSF award).[16] Similarly, the estimated effects are generally insensitive to the inclusion of statistical controls.[17] To maximize statistical power, however, this analysis focuses on statistical models that (1) control for the district in which schools are located; (2) control for up to three baseline values (from 2016–17, 2017–18, and 2018–19, respectively) of the applicable spending, staffing, or student outcome variable; and (3) employ large bandwidths (e.g., the results below are based on samples within 1.5 standard deviations of the QCSSF performance cutoff).
The inclusion of pre-QCSSF control variables in the statistical models implies that the analysis is essentially comparing changes in school inputs and outcomes since 2018–19 between qualifying and nonqualifying charter schools located in the same school district. The use of relatively large bandwidths means that a large proportion of Ohio’s brick-and-mortar charter schools are included in the estimation. It is important to keep in mind that, regardless of the size of the bandwidth used in the analysis, the RD design always involves estimating the impact of the program by comparing schools very close to the QCSSF performance cutoff. In other words, regardless of the number of schools included in the analysis, the procedure involves estimating differences between schools that narrowly qualified and those that narrowly failed to qualify. That the various bandwidths yield comparable estimated effects indicates that this is indeed the case.
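The bandwidth-sensitivity check described above can be sketched as follows. The data are simulated and the model is a bare local linear fit, not the study’s full specification; the bandwidths mirror the narrow, preferred, and wide samples mentioned in the text:

```python
# Re-estimate a simulated RD jump at several bandwidths to illustrate the
# robustness check described above. All data are simulated.
import numpy as np

rng = np.random.default_rng(42)
n = 600
distance = rng.uniform(-2, 2, n)        # distance from the performance cutoff
funded = (distance >= 0).astype(float)  # award received at/above the cutoff
outcome = 0.4 * distance + 0.25 * funded + rng.normal(0, 0.3, n)

def rd_estimate(bandwidth: float) -> float:
    """Local linear RD estimate of the jump at the cutoff within a bandwidth."""
    m = np.abs(distance) <= bandwidth
    X = np.column_stack([np.ones(m.sum()), funded[m],
                         distance[m], funded[m] * distance[m]])
    beta, *_ = np.linalg.lstsq(X, outcome[m], rcond=None)
    return beta[1]

for bw in (0.3, 1.5, 2.0):
    print(f"bandwidth {bw}: estimated jump {rd_estimate(bw):.2f}")
```

If the estimates are similar across bandwidths, as they are here (up to sampling noise), that supports the conclusion that the result is driven by the discontinuity itself rather than by which schools happen to fall in the sample.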
Impact of QCSSF on Charter School Spending and Staffing
The following figures and tables report the estimated difference in inputs and outcomes between schools that narrowly qualified for QCSSF funding and those that narrowly failed to qualify. Specifically, they report the estimated impact based on models that employ a bandwidth of 1.5 standard deviations, as this bandwidth captures a midpoint among the three preferred bandwidths used in the analysis and the estimated coefficients based on this sample capture approximately the average effect sizes among the three samples.[18] First, Figure 1 (below) presents the estimated impact of receiving QCSSF funding in 2021–22 and 2022–23 on schools’ operating expenditures per pupil in those two years. Positive (negative) numbers indicate that schools that qualified spent more (less) per pupil during the 2021–22 and 2022–23 school years (all in 2023 dollars) than schools that failed to qualify. A solid bar indicates that an estimate attains statistical significance at the p<0.05 level for a one-tailed test.[19]
Figure 1 indicates that a QCSSF award leads to an increase in classroom expenditures of approximately $1,346 per pupil. This category of expenditures includes instruction (activities involving the interaction of students and teachers or instructional aides) and pupil support services. The estimated impact on nonclassroom operational expenditures—such as school administration, transportation, and food service—is negative but does not approach statistical significance. Although the estimates are far too imprecise to rule out meaningful positive or negative changes in nonclassroom spending, the results clearly indicate that schools receiving QCSSF payments spent the money primarily on classroom activities. The estimates of classroom spending are also imprecise—indeed, the only claim one can make with confidence is that schools increased classroom spending in response to QCSSF awards. However, the close correspondence between the estimate in Figure 1 and actual payment amounts (reported in Table 3) provides some confidence that the analysis yields informative point estimates.
Figure 1. Impact of QCSSF award on charter schools’ spending per pupil
Classroom expenditures primarily go toward teacher-based instructional activities, so the next logical question is how this increase in funding affected teachers. Table 4 provides estimated impacts on teacher counts, salaries, and mobility (first column) and, for context, average values of those variables as of the 2022–23 school year (second column).[20] The results in the first row suggest an increase of approximately 2.5 teachers, on average, among schools that received a QCSSF award when compared to those that did not. Although that result is not statistically significant, the shortfall likely owes more to the imprecision of the estimates than to the absence of an effect. Indeed, the more precise estimate of QCSSF’s impact on student-teacher ratios attains statistical significance, hence the bolded text and star next to the estimate. Specifically, the result indicates that schools that received the funding had nearly four fewer students per teacher than those that did not. Because there were no significant enrollment declines in these schools, it appears this decline in student-teacher ratios is indeed due to QCSSF schools having more teachers.[21] Once again, the estimates are imprecise, but it is reasonable to conclude that QCSSF funding led to smaller student-teacher ratios and that this was likely because QCSSF-funded schools had more teachers as opposed to fewer students.
Table 4. Impact of QCSSF award on charter schools’ teachers
Charter operators have pointed to teacher turnover as one of the big challenges that charter schools face. Table 4 confirms this claim and suggests that receiving QCSSF funding went a long way toward addressing the problem. In the average charter school in the sample, approximately 43 percent of teachers were new to the school during the 2022–23 school year. In other words, just under half of charter school teachers turn over every year.[22] However, the results indicate that the proportion of teachers new to QCSSF schools was 12.48 percentage points lower than it would have been in the absence of the program. That is a reduction in turnover of approximately 25 percent. Again, the models are imprecise, but the results provide clear evidence that teacher turnover declined.
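The relative-reduction figure above comes from comparing the estimated effect to the counterfactual turnover rate (the rate QCSSF schools would have had without the program). A quick check using the numbers in the text:

```python
# Back-of-envelope check of the relative turnover reduction reported above.
observed_new_teacher_share = 0.43  # share of teachers new to the school, 2022-23
estimated_effect_pp = 0.1248       # QCSSF lowered this share by 12.48 points

# Counterfactual: what the share would have been without the program
counterfactual = observed_new_teacher_share + estimated_effect_pp
relative_reduction = estimated_effect_pp / counterfactual
print(f"Relative reduction in turnover: {relative_reduction:.0%}")
# -> roughly 22 percent, broadly consistent with the report's "approximately 25 percent"
```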
How were charter schools with a QCSSF award better able to retain teachers? Table 4 indicates that higher salaries might explain it. Schools that narrowly qualified for the award paid teachers approximately $8,000 more than schools that narrowly failed to qualify. Given that the average charter school teacher salary during the 2022–23 school year was $42,963, the estimated salary increase associated with a QCSSF award is substantial. Table 4 also indicates that the extra classroom spending did not yield more experienced teachers or more teachers with a master’s degree. Salary increases may have enabled schools to retain more teachers, thereby enhancing the continuity of their teaching force from year to year. Because of tight labor markets in the wake of the pandemic, which made it difficult to recruit new teachers, the ability to retain teachers might also have contributed to smaller student-teacher ratios. If smaller student-teacher ratios had primarily come from hiring new teachers, then the percentage of teachers in their first year would probably not have declined among schools receiving QCSSF funding.
The estimates are too imprecise to justify placing much weight on the exact numbers presented in Table 4. The models do not enable one to rule out effect sizes considerably below or above those estimates. What one can conclude with a reasonable level of certainty, however, is that QCSSF funding led to higher teacher salaries and lower teacher turnover.
Impact of QCSSF on Student Learning
Did higher teacher salaries, and a corresponding reduction in teacher turnover and student-teacher ratios, translate to improved student learning? To examine this question, the present analysis focuses primarily on the “value-added” measure of student achievement growth that Ohio makes available on school report cards. It enables one to assess whether student test scores are higher in schools that received additional funding, while holding constant students’ test scores in prior years. Also recall that, as of the 2018–19 school year, there were no differences in value-added between schools that narrowly did and did not qualify for funding (a key feature of the RD design) and that the analysis effectively estimates changes in value-added between 2018–19 (before the program) and 2021–22 and 2022–23. Thus, differences in test scores in 2021–22 and 2022–23 can plausibly be attributed to the QCSSF award improving student learning during each of those years, as opposed to those awards simply enabling schools to recruit higher achieving students.
Figure 2 presents estimates of QCSSF awards’ impact on student learning by subject. Specifically, it presents value-added estimates for English language arts (combining results from Ohio state tests in grades 4–8 and the high school ELA I and ELA II exams), mathematics (combining state tests in grades 4–8 and high school Algebra I, geometry, Integrated Mathematics I, and Integrated Mathematics II), and science (combining state tests in grades 5 and 8 and the high school biology exam). Like the above estimates for spending and teachers, these “main estimates” average schools’ annual value-added (i.e., the amount of estimated learning for a given school year) for 2021–22 and 2022–23.[23]
Figure 2. Impact of QCSSF award on charter schools’ test-score value-added (all grades)
Figure 2 indicates that student learning in ELA and math was substantially greater in schools that narrowly attained the performance threshold for a QCSSF award, as compared to those that narrowly missed that threshold. The main estimates indicate that students in schools that received extra funding—and, consequently, had lower teacher turnover and student-teacher ratios—had test scores that were 0.07–0.10 of a standard deviation higher on their spring ELA and math tests (after controlling for their test scores in prior years) than students in schools that did not receive extra funding. These are substantial effects in ELA and math, though there is no significant impact in science.
Figure 2 also reports the most conservative estimates implied by the statistical models, as the amount of additional learning associated with QCSSF receipt is an important policy question. Specifically, it presents the lower-bound estimate from the statistical confidence interval around the main estimate. The results indicate that one can rule out, with a high degree of statistical confidence, effects below 0.01 of a standard deviation for ELA and 0.02 of a standard deviation for math. Estimating a model using the average of ELA and math (which produces more precise results and, thus, a narrower confidence interval) yields a lower-bound estimate of 0.195 standard deviations. There is no lower-bound estimate for science because, as Figure 2 indicates with the empty bar, the estimated impact is not statistically different from 0.
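For readers who want to see the mechanics, a lower-bound estimate of this kind can be sketched in a few lines. The numbers below are hypothetical placeholders (the report’s standard errors appear only in its appendix tables), and a one-sided 95 percent bound is assumed here, consistent with the one-tailed tests described in the endnotes.

```python
from statistics import NormalDist

def lower_bound(estimate: float, std_error: float, confidence: float = 0.95) -> float:
    """One-sided lower confidence bound: the smallest effect one can still
    rule in at the stated level of statistical confidence."""
    z = NormalDist().inv_cdf(confidence)  # ~1.645 for a one-sided 95% bound
    return estimate - z * std_error

# Hypothetical point estimate and standard error, for illustration only:
print(round(lower_bound(0.07, 0.03), 3))  # -> 0.021
```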
Figure 3 focuses on annual test-score value-added in grades 4–8 in ELA and math (thus excluding science and high school tests), as those estimates can be directly translated into a more intuitive metric: annual days’ worth of additional learning.[24] This metric is controversial and presents several potential problems.[25] Nevertheless, it provides at least some intuition for the effect sizes at hand, and it is commonly used in widely available reports of charter school effectiveness. Specifically, taking account of how much students typically learn every year (more in math than ELA) and assuming 180 days of instruction in the typical school year, the impact on test scores of increasing classroom expenditures by approximately $1,300 per pupil is the equivalent of students getting forty-four to forty-seven additional days of learning each year (2021–22 and 2022–23). If these annual gains accumulate as students progress from grades 4–8, then students who attended charter schools with extra funding for all of those years would have experienced over one full school year’s worth of additional learning by the time they get to high school.
Given the imprecision of the statistical models and the controversy surrounding “days of learning” conversions, however, using the lower-bound estimates may be more appropriate. Based on those estimates, those increases in classroom spending translate to twelve to fourteen days of additional learning—about 2.5 weeks of additional learning each year. In this case, accumulated gains through grade 8 would exceed one-third of a school year’s worth of learning. Concluding that achievement gains accumulate additively like this across years requires strong assumptions and is an issue addressed in the concluding discussion. What one can say based on these results, however, is that QCSSF awards had a large positive impact on student learning, even if one uses the most conservative estimates implied by this study.
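The “days of learning” conversion described above can be sketched as follows. The typical annual growth rates are the Hill et al. (2007) benchmarks cited in the endnotes; the effect sizes plugged in at the end are illustrative placeholders rather than the report’s exact estimates.

```python
# Convert a test-score effect (in student-level standard deviations) into
# annual "days of learning": divide by the typical annual gain for the
# subject, then multiply by a 180-day school year.
TYPICAL_ANNUAL_GAIN = {"ela": 0.314, "math": 0.422}  # Hill et al. (2007), grades 4-8
SCHOOL_YEAR_DAYS = 180

def days_of_learning(effect_size: float, subject: str) -> float:
    return effect_size / TYPICAL_ANNUAL_GAIN[subject] * SCHOOL_YEAR_DAYS

# Illustrative effect sizes (not the report's exact figures):
print(round(days_of_learning(0.08, "ela"), 1))   # -> 45.9
print(round(days_of_learning(0.10, "math"), 1))  # -> 42.7
```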
Figure 3. Impact of QCSSF on charter schools’ annual test-score growth (grades 4–8 only)
Another way to provide intuition for the achievement effects is to examine changes in schools’ performance index scores. The performance index does not control for changes in student composition like the value-added measure, but it could capture improvements in school quality if the characteristics of their students change little over time. Figure 4 (below) presents QCSSF impact estimates using the percentage of total possible points (0–100) a school received on the performance index.[26] The bar on the left indicates that schools that narrowly attained a QCSSF award had performance index scores 4.22 percentage points higher in 2021–22 and 2022–23 than those that narrowly missed receiving the award. This estimate does not quite attain conventional levels of statistical significance, but it corresponds to the value-added estimates above.[27] Importantly, the figure also reveals a similar estimate of 4.68 percentage points—one that does attain statistical significance—when the model controls for school enrollments and the characteristics of students in a school (the percentages of students who are economically disadvantaged, have a disability, or have limited English proficiency). That the inclusion of these controls increases statistical significance without meaningfully affecting the impact estimate suggests the performance index estimates indeed capture improvements in student learning.
How big are these effect sizes? The average brick-and-mortar charter school, like the average school that narrowly failed to qualify for QCSSF, had an average value of approximately 50 percent on the performance index in 2022–23. A relative improvement of approximately 4.5 percentage points (going from 50 percent to 54.5 percent) is equal to one third of the standard deviation in the performance index scores of Ohio charter schools. That is the equivalent of going from approximately the fiftieth percentile to the sixty-second percentile in terms of charter school educational quality.
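The percentile translation above can be approximated with a normal curve. The report presumably converts percentiles using the empirical distribution of Ohio charter schools, so the normal approximation below is only a rough sketch of the arithmetic, not a reproduction of the report’s method.

```python
from statistics import NormalDist

def percentile_after_gain(gain_in_sd: float, start_percentile: float = 50.0) -> float:
    """Percentile a school starting at start_percentile would reach after
    improving by gain_in_sd standard deviations, assuming a normal distribution."""
    nd = NormalDist()
    z = nd.inv_cdf(start_percentile / 100)
    return 100 * nd.cdf(z + gain_in_sd)

print(round(percentile_after_gain(1 / 3)))  # one-third SD: roughly the low 60s
print(round(percentile_after_gain(0.5)))    # half an SD: roughly the 69th percentile
```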
Figure 4. Impact of QCSSF awards on charter schools’ average achievement
Note that these results, like those in the rest of the analysis, capture average outcomes across the 2021–22 and 2022–23 school years. Total accumulated gains of the QCSSF program—which has funded schools as far back as 2019–20—are captured by the 2022–23 estimate because it is not weighed down by achievement in prior years, when achievement was still ramping up.[28] The 2022–23 impact estimate for the performance index is approximately seven percentage points or about half of the standard deviation in charter school performance. These results imply that, over the life of the QCSSF program since 2019–20, charter schools that narrowly qualified for supplemental funding improved relative to schools that narrowly failed to qualify such that they went from the fiftieth percentile to the sixty-ninth percentile in the charter school achievement distribution. In absolute terms, achievement in these schools indeed went down somewhat between 2018–19 and 2022–23, as the pandemic had an outsized impact on low-income students residing in urban areas.[29] What these results indicate is that QCSSF funding significantly limited the learning loss these schools otherwise would have experienced, leading them to climb the charter school performance distribution.
Impact of QCSSF on Student Attendance
Attendance is both an input that leads to learning and an important educational outcome, as attendance rates capture behavioral attributes that are themselves predictive of better lifetime outcomes.[30] The analysis below examines differences in school absence rates (the percent of total instructional hours a school’s students missed) as well as rates of chronic absenteeism, which capture the percent of a school’s students who missed at least 10 percent of instructional hours (approximately eighteen days, or 3.5 weeks). As with the performance index, these school-level measures do not account for differences between students. Thus, once again, one must compare estimates with and without controls for student enrollments and characteristics.
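The chronic-absenteeism definition above amounts to a simple threshold rule. In the sketch below, a 180-day year and six instructional hours per day are assumed purely for illustration; Ohio tallies instructional hours, and actual totals vary by school.

```python
DAYS_PER_YEAR = 180   # typical school year, as assumed in the text
HOURS_PER_DAY = 6     # illustrative assumption, not an Ohio statute

def is_chronically_absent(hours_missed: float, threshold: float = 0.10) -> bool:
    """A student is chronically absent after missing at least 10 percent of
    total instructional hours (about eighteen days under these assumptions)."""
    return hours_missed >= threshold * DAYS_PER_YEAR * HOURS_PER_DAY

def chronic_absenteeism_rate(hours_missed_by_student: list[float]) -> float:
    """Percent of a school's students who are chronically absent."""
    flags = [is_chronically_absent(h) for h in hours_missed_by_student]
    return 100 * sum(flags) / len(flags)

print(is_chronically_absent(120))  # 120 hours (20 days) missed -> True
print(is_chronically_absent(60))   # 60 hours (10 days) missed -> False
```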
Figure 5 (below) indicates that, on average, schools that narrowly qualified for QCSSF had lower absence rates (by 1.63 percentage points) and lower rates of chronic absenteeism (by 5.48 percentage points). These estimates are merely suggestive, as they are not statistically significant. However, after controlling for student characteristics, estimates increase in magnitude and attain statistical significance. Holding constant the demographic characteristics of students as of the 2022–23 school year, QCSSF schools’ absence rates are 2.06 percentage points lower and their rates of chronic absenteeism are 7.31 percentage points lower than they would have been in the absence of the program. The average attendance rate in charter schools was 86 percent during the 2021–22 and 2022–23 school years, and the average rate of chronic absenteeism was 52 percent. The impact on attendance is equal to approximately one-third of a standard deviation in the charter school distribution, such that QCSSF funds enabled schools to improve from the fiftieth to the sixty-second percentile of the school attendance-rate distribution. Once again, it is important to note that attendance rates in fact declined significantly during the pandemic, even among schools that received supplemental funding. But schools with QCSSF funding ultimately realized more modest attendance declines than those without it, leading them to climb the charter school distribution.[31]
Unlike the analysis of student learning, the analysis of attendance is limited entirely to school-level measures that do not control for students’ academic histories. Thus, the analysis of attendance is less conclusive than the analysis of student achievement. However, that these estimates line up with the estimates for school value-added and performance index scores provides further evidence that the estimates capture schools’ positive impacts on student attendance.
Figure 5. Impact of QCSSF awards on charter schools’ absence rates
Summary and Implications
The analysis indicates that charter schools receiving a QCSSF award in 2021–22 and 2022–23 had higher classroom spending (+$1,346 per pupil), higher teacher salaries (+$8,276), nearly four fewer students per teacher, and less teacher turnover (by 12.5 percentage points, approximately 25 percent) than they would have had without the award. These budgetary and staffing impacts, in turn, correspond to more student learning in ELA and math (lower-bound estimates indicate an additional two to three weeks’ worth of additional learning each year) and lower rates of chronic absenteeism (by five to seven percentage points) than schools would have had in the absence of the program. The estimates are imprecise, but the analysis provides convincing evidence that QCSSF has enabled qualifying charter schools to substantially mitigate pandemic-era learning and attendance declines experienced across Ohio’s urban schools serving low-income students.
The QCSSF program’s impact is substantial. The effects on attendance are in line with what we might expect from the significant reduction in student-teacher ratios we observe.[32] The main achievement estimates, however, are more pronounced. They imply that an additional $1,000 in spending per pupil (the typical benchmark used in spending studies) yields gains in ELA and math achievement of approximately 0.05 standard deviations annually in 2021–22 and 2022–23, which suggests a cumulative impact of 0.1 standard deviations across those two years.[33] The value-added estimates are primarily driven by a large effect in 2022–23, which could be because students are rebounding from pandemic learning losses. Although realized in a single year of testing, these gains could conceivably have been facilitated by extra funding schools received and better student attendance every year since the program’s enactment in 2019. Nevertheless, even if the gains from the program are fully concentrated in 2022–23—such that the estimates in this study capture cumulative effects from all four years of funding since 2019–20—cumulative gains of 0.1 of a standard deviation after four years of additional spending are substantial. The average estimate in quasiexperimental studies implies that an increase in spending of $1,000 per pupil leads to achievement increases of approximately 0.032 of a standard deviation after four years.[34] In other words, increasing funding for Ohio charter schools has yielded three times the typical learning gains that come from increasing public school expenditures.
One reason for these large estimates could be statistical noise. As noted throughout the analysis, the estimates are imprecise. If one focuses on the lower-bound estimates—in other words, the minimum effect sizes one would likely get with statistical confidence—annual achievement gains associated with approximately $1,346 in funding are approximately 0.02 of a standard deviation in ELA and math, for a total of 0.04 standard deviations by the end of 2022–23. That implies an effect size of approximately 0.03 for every $1,000 spent per pupil. If the gains from the QCSSF program were indeed primarily realized in 2021–22 and 2022–23, as the analysis of performance index scores suggests, then the average effect of the program is approximately 0.03 of a standard deviation after four years. That is just about the average effect of 0.032 that research indicates one should expect. Although this effect is modest in size, it captures a lower-bound estimate. This study suggests QCSSF’s impact on student learning was likely far greater.
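The per-$1,000 benchmarking used in the two preceding paragraphs is straightforward arithmetic, sketched below with the figures reported in the text.

```python
# Rescale an achievement effect associated with a given per-pupil spending
# increase to the "per $1,000 per pupil" benchmark used in spending studies.
def effect_per_1000(effect_size: float, spending_increase: float) -> float:
    return effect_size / (spending_increase / 1000)

# Figures from the text: ~$1,346 in additional classroom spending per pupil.
print(round(effect_per_1000(0.07, 1346), 3))  # main annual estimate -> 0.052
print(round(effect_per_1000(0.04, 1346), 3))  # lower-bound cumulative -> 0.03
```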
There are several possible explanations for the high returns to spending in Ohio charter schools that narrowly qualified for QCSSF awards, which are broadly similar to the typical brick-and-mortar charter school in Ohio. First, charter schools face stronger incentives than traditional public schools to direct resources toward student learning. Parents might not enroll their kids and sponsors might not renew their contracts if charter schools yield inadequate academic outcomes. Indeed, Ohio charter schools have long been more efficient than traditional public schools, in that they have realized superior achievement and attendance outcomes with far less funding.[35] Second, charter schools had been underfunded relative to district schools in Ohio, so the marginal returns to extra funding may be higher. Third, charter schools primarily serve low-income, low-achieving students who were hit exceptionally hard by the pandemic.[36] Research shows that the returns to spending are larger for low-income students,[37] and funding charter schools has long been a relatively efficient way to direct resources to students who need them most.
If additional funding has had such a large impact on charter school performance, then how is it that Ohio charter schools’ performance advantage over traditional public schools declined since the 2018–19 school year?[38] Part of the answer is that achievement growth in charters that narrowly qualified for the extra funding increased relative to the average achievement growth in Ohio, whereas students lost ground if they were in charters that narrowly missed out on the award.[39] That charter schools without extra funding struggled so mightily relative to schools with QCSSF funding—in terms of staffing and student outcomes—may be due to the unusually tight labor market in the wake of the pandemic. Elementary and Secondary School Emergency Relief (ESSER) funds might have exacerbated this problem by enabling traditional public schools to go on a hiring binge,[40] which drove up salaries and perhaps made it even harder for most charter schools to attract and retain teachers than it was prior to the pandemic.[41] Whatever the precise cause, the results of this study suggest Ohio charter schools’ superior performance (relative to nearby district schools) could have been completely wiped out were it not for the QCSSF program making some charter schools more competitive in the teacher labor market.[42]
Increased student learning, as captured by test scores in ELA and math, should yield tangible benefits to Ohioans. Achievement on such tests is tied to better life outcomes for students and greater economic growth for states.[43] Back-of-the-envelope cost-benefit calculations based on the learning gains found in this study suggest that the economic returns exceed the costs of QCSSF through the 2022–23 school year.[44] There is also emerging evidence that improved attendance rates and other behavioral outcomes can yield their own downstream benefits, beyond their contributions to student learning.[45] Overall, therefore, the results of this study suggest that QCSSF has been a worthwhile investment.
References
Abott, Carolyn, Vladimir Kogan, Stéphane Lavertu, and Zachary Peskowitz. “School district operational spending and student outcomes: Evidence from tax elections in seven states.” Journal of Public Economics 183 (2020): 104142, https://doi.org/10.1016/j.jpubeco.2020.104142.
Baird, Matthew D., and John F. Pane. “Translating standardized effects of education programs into more interpretable metrics.” Educational Researcher 48, no. 4 (2019): 217–228, https://doi.org/10.3102/0013189X19848729.
Calonico, Sebastian, Matias D. Cattaneo, and Rocio Titiunik. “Robust Nonparametric Confidence Intervals for Regression-Discontinuity Designs.” Econometrica 82, no. 6 (2014): 2295–326, https://doi.org/10.3982/ECTA11757.
Doty, Elena, Thomas J. Kane, Tyler Patterson, and Douglas O. Staiger. “What do changes in state test scores imply for later life outcomes?” Working Paper No. 30701, National Bureau of Economic Research. https://www.nber.org/papers/w30701.
Goldhaber, Dan, Grace Falken, and Roddy Theobald. “ESSER Funding and School System Jobs: Evidence from Job Posting Data.” Working Paper No. 297-0424, Center for Analysis of Longitudinal Data in Education Research, Arlington, VA, 2024. https://caldercenter.org/sites/default/files/CALDER%20WP%20297-0424.pdf.
Hanushek, Eric A., Jens Ruhose, and Ludger Woessmann. “Knowledge Capital and Aggregate Income Differences: Development Accounting for U.S. States.” American Economic Journal: Macroeconomics 9, no. 4 (2017): 184–224. https://www.aeaweb.org/articles?id=10.1257/mac.20160255
Jackson, C. Kirabo. “What do test scores miss? The importance of teacher effects on non–test score outcomes.” Journal of Political Economy 126, no. 5 (2018): 2072–107. https://www.journals.uchicago.edu/doi/10.1086/699018
Jackson, C. Kirabo, and Claire L. Mackevicius. “What Impacts Can We Expect from School Spending Policy? Evidence from Evaluations in the United States.” American Economic Journal: Applied Economics 16, no. 1 (2024): 412–46. https://www.aeaweb.org/articles?id=10.1257/app.20220279
Jackson, C. Kirabo, Shanette C. Porter, John Q. Easton, Alyssa Blanchard, and Sebastián Kiguel. “School Effects on Socioemotional Development, School-Based Arrests, and Educational Attainment.” American Economic Review: Insights 2, no. 4 (2020): 491–508. https://www.aeaweb.org/articles?id=10.1257/aeri.20200029
Ohio Department of Education & American Institutes for Research. Ohio’s state tests in English language arts, mathematics, science, and social studies: 2018–2019 school year. Annual technical report, 2019.
Ohio Department of Education & Cambium Assessment, Inc. Ohio’s state tests in English language arts, mathematics, science, and social studies: 2021–2022 school year. Annual technical report, 2022.
Ohio Department of Education & Cambium Assessment, Inc. Ohio’s state tests in English language arts, mathematics, science, and social studies: 2022–2023 school year. Annual technical report, 2023.
Rose, Evan K., Jonathan T. Schellenberg, and Yotam Shem-Tov. “The effects of teacher quality on adult criminal justice contact.” Working Paper No. 30274, National Bureau of Economic Research, July 2022. https://www.nber.org/papers/w30274.
Tran, Long, and Seth Gershenson. “Experimental estimates of the student attendance production function.” Educational Evaluation and Policy Analysis 43, no. 2 (2021): 183–99. https://ftp.iza.org/dp11911.pdf
Vosters, Kelly N., Cassandra M. Guarino, and Jeffrey M. Wooldridge. “Understanding and evaluating the SAS® EVAAS® Univariate Response Model (URM) for measuring teacher effectiveness.” Economics of Education Review 66 (2018): 191–205. https://www.sciencedirect.com/science/article/abs/pii/S0272775718301341
Acknowledgments
I thank the Thomas B. Fordham Institute—particularly Chad Aldis, Aaron Churchill, Chester Finn, Amber Northern, and Mike Petrilli—for making this project possible and for offering their expertise and helpful feedback. I am also grateful to Cory Koedel for providing a thoughtful review and several helpful suggestions. Any remaining weaknesses in this study are entirely my fault.
- Stéphane Lavertu
Endnotes
[1] Brick-and-mortar charter schools focused on dropout prevention and recovery were not eligible for funding between 2019 and 2023 and are not included in these statistics.
[2] Schools can requalify for the program, and restart the three-year funding clock, if they meet the criteria again in subsequent years.
[3] Charter schools focused on dropout prevention and recovery are excluded from these totals because they were not eligible for funding from 2019–20 to 2022–23.
[4] In Ohio, charter authorizers—the entities that allow charter schools to open and to continue operating—are referred to as “sponsors.”
[5] See C. Kirabo Jackson and Claire L. Mackevicius, “What Impacts Can We Expect from School Spending Policy? Evidence from Evaluations in the United States,” American Economic Journal: Applied Economics 16, no. 1 (2024): 412–46.
[6] The value-added achievement estimates needed for this study were unavailable for 2019–20 and 2020–21 due to the suspension of testing in spring 2020. The analysis pools two years of post-pandemic data (2021–22 and 2022–23) because some of the variables—particularly school spending and value-added—are very noisy due to dramatic shifts in student achievement and school funding. Pooling these years is not ideal because QCSSF awards for 2022–23 are based in part on 2021–22 school performance. However, one must include QCSSF selection based on 2021–22 school performance so that the statistical models are sufficiently predictive of QCSSF receipt and generate sufficiently precise statistical estimates. As the analysis below reveals, limiting the outcome analysis to 2022–23 actually increases the magnitude of the positive effects. Thus, pooling outcomes from 2021–22 and 2022–23 yields more conservative estimates while providing much needed statistical power.
[7] The “community school” annual report data enables one to identify the brick-and-mortar (“site-based”) schools that serve “general” or “special” education populations, which are the focus of this analysis. They also allow one to identify the school district in which charters are located. The teacher data used to conduct the salary and mobility analysis are available on ODEW’s data report portal. All other data are available on the data download page for Ohio’s school report cards.
[8] RD is a quasiexperimental research design that economists consider “credibly causal,” provided that its assumptions are tested and met. For example, in the Annual Review of Economics in 2022, Matias Cattaneo and Rocío Titiunik state, “RD design has become one of the most credible nonexperimental methods for causal inference and program evaluation.” If well implemented, studies using RD can meet the requirements of the federal Every Student Succeeds Act (ESSA) “second tier” of evidence, just behind a well-implemented randomized controlled trial (RCT), which is the “gold standard” for estimating the causal effect of public programs and policies.
[9] Appendix B describes the procedure for creating the “running variable” that captures schools’ proximity to the QCSSF Criterion 1 performance threshold.
[10] Appendix C describes the statistical model I use to implement the RD design. The model estimates the change in probability of receiving an award based on being above or below the Criterion 1 cutoff, and it uses this change in probability to estimate the program’s impact on outcomes measured at the school level.
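A minimal simulation can illustrate the “fuzzy” RD logic this note describes, in which crossing the cutoff shifts the probability of treatment and that shift is used to recover the treatment effect via two-stage least squares. Everything below is synthetic and hypothetical—a sketch of the general technique, not the report’s actual model or data.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20_000
running = rng.uniform(-2, 2, n)          # distance from the performance cutoff
above = (running >= 0).astype(float)     # crossed the Criterion 1 threshold?
# Fuzzy design: crossing the cutoff raises the award probability
# (here from 0.2 to 0.8) but does not fully determine it.
award = (rng.uniform(size=n) < 0.2 + 0.6 * above).astype(float)
true_effect = 0.08
outcome = 0.3 * running + true_effect * award + rng.normal(0, 0.3, n)

# Two-stage least squares: instrument `award` with `above`, controlling for
# the running variable (a simple linear control, for illustration).
exog = np.column_stack([np.ones(n), running])
Z = np.column_stack([exog, above])       # instruments
X = np.column_stack([exog, award])       # regressors
X_hat = Z @ np.linalg.lstsq(Z, X, rcond=None)[0]   # first stage: fitted regressors
beta = np.linalg.lstsq(X_hat, outcome, rcond=None)[0]
print(round(beta[-1], 2))  # recovers an estimate near the true effect of 0.08
```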
[11] Table D4 in Appendix D presents “balance tests” comparing the characteristics of schools at baseline. It reveals that, among the analytic samples (which employ bandwidths of 1, 1.5, and 2 points, respectively), there are three to five variables for which there is an imbalance as of 2018–19 that reaches or approaches conventional levels of statistical significance. These differences are primarily related to teacher degrees and years of experience and, correspondingly, classroom spending. Schools that just qualified for the program also had slightly lower performance index scores than those that narrowly failed to qualify. However, because this analysis examines changes in inputs and outcomes since 2018–19, these imbalances are not concerning for these particular variables (setting aside any validity concerns they present for this study’s design).
[12] Among schools in the sample (those with sufficient data), none closed if they received a QCSSF award. Nine schools closed that were eligible but did not meet Criterion 1. However, among schools near the QCSSF performance requirement, those that failed to receive an award were no more likely to close than those that received an award. See Table D11 in Appendix D for the results of the school closure analysis.
[13] See Table D10 in Appendix D for the results of the composition analysis.
[14] Appendix A describes how I created the value-added variables used in this analysis by modifying the publicly available school value-added data on ODEW’s report card data page.
[15] To capture total achievement gains between 2018–19 and 2022–23 using annual value-added measures, one would need to add up those annual gains in achievement for each of the intervening years. This is impossible because there are no valid value-added estimates available for 2019–20 and 2020–21. Thus, although comparing a school’s value-added between 2018–19 and 2022–23 (what this study does) is a good way to capture changes in school quality, it is not a good way to capture accumulated achievement gains between 2018–19 and 2022–23.
[16] Appendix C describes the procedure for selecting the bandwidths as well as the details of the statistical model. The “optimal” bandwidth generated using the procedure suggested by Calonico, Cattaneo, and Titiunik (2014) is generally between 0.3 and 0.7 standard deviations.
[17] The results of “reduced-form” models reported in Table D5 and Table D6 of Appendix D illustrate how the estimates are similar regardless of the bandwidth or inclusion of controls, though the primary models that account for location district “fixed effects” and baseline values of the outcomes yield results that are far more statistically precise. The main analysis of QCSSF’s impact (the “fuzzy RD” analysis reported in Table D7 and Table D8) uses the larger samples and full specifications to maximize statistical power.
[18] The tabular results appear in Table D7, Table D8, and Table D9 in Appendix D. The preferred analytic samples are based on bandwidths of 1, 1.5, and 2 standard deviations from the performance threshold for QCSSF funding. Estimates based on smaller bandwidths are not featured because those sample sizes are small and lead to underpowered models. Although the “optimal” bandwidth (in terms of balancing concerns for potential bias and precision) is generally between 0.3 and 0.7, estimates using those samples are too imprecise to detect expected effects (e.g., an increase of $1,500 in spending). Fortunately, as Table D5 and Table D6 of Appendix D illustrate, the coefficient estimates are broadly similar when using a bandwidth of 0.3 standard deviations.
[19] Whereas statistical significance in the appendix tables is based on two-tailed hypothesis tests, I focus on a one-tailed test in reporting these results because the impact of QCSSF funding arguably warrants a directional hypothesis. In other words, the expectation is that narrowly qualifying for the program, as opposed to narrowly failing to qualify, should lead to increases in expenditures.
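The distinction drawn in this note can be made concrete: when the estimate has the expected sign, a one-tailed p-value is half the two-tailed p-value. The sketch below assumes a normally distributed test statistic; the z-value used is hypothetical.

```python
from statistics import NormalDist

def two_tailed_p(z: float) -> float:
    """p-value against the hypothesis of no effect in either direction."""
    return 2 * (1 - NormalDist().cdf(abs(z)))

def one_tailed_p(z: float) -> float:
    """p-value for the directional hypothesis that the effect is positive."""
    return 1 - NormalDist().cdf(z)

# A hypothetical z-statistic of 1.8 misses the 0.05 bar two-tailed but
# clears it one-tailed:
print(round(two_tailed_p(1.8), 3))  # -> 0.072
print(round(one_tailed_p(1.8), 3))  # -> 0.036
```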
[20] The 2022–23 average is based on the largest analytic sample within two standard deviations of the performance threshold.
[21] Table D10 in Appendix D indicates no significant changes in student composition, including enrollments.
[22] That is not exactly correct, as QCSSF schools may add teachers without losing existing teachers. However, a measure that focuses on the proportion of teachers no longer in the school (the proportion of teachers who, in the prior year, were in their last year) yields comparable results.
[23] These composites weight 2021–22 and 2022–23 equally, whereas the value-added estimates on the 2022–23 report card put twice as much weight on 2022–23. The estimates in this analysis are also scaled differently and, thus, are about half the size of the estimates one would get using the “effect size” measure from the report card. Appendix A provides a more thorough explanation of the value-added measure used in this analysis.
[24] Hill et al. (2007) find that students typically experience annual achievement gains of 0.314 of a standard deviation in reading and 0.422 of a standard deviation in math in grades 4–8. Dividing the estimates in Figure 2 by these typical growth rates yields a fraction of a school year, which one can multiply by 180 (the typical number of instructional days in a school year) to get annual days’ worth of additional learning.
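The conversion described in this footnote can be sketched in a few lines. This is a minimal illustration of the arithmetic only; the 0.05 SD input is a hypothetical value, not an estimate from this report.

```python
# Convert an effect size (in standard deviations) into "days of learning":
# divide by the typical annual gain for grades 4-8 reported by Hill et al.
# (2007) -- 0.314 SD in reading, 0.422 SD in math -- then multiply the
# resulting fraction of a school year by a 180-day instructional year.
TYPICAL_ANNUAL_GAIN = {"reading": 0.314, "math": 0.422}
DAYS_PER_YEAR = 180

def days_of_learning(effect_sd: float, subject: str) -> float:
    """Translate an effect size (SD units) into days' worth of learning."""
    return effect_sd / TYPICAL_ANNUAL_GAIN[subject] * DAYS_PER_YEAR

# Hypothetical example: a 0.05 SD effect in math
print(round(days_of_learning(0.05, "math"), 1))  # about 21.3 days
```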
[25] For example, see Matthew D. Baird and John F. Pane, “Translating standardized effects of education programs into more interpretable metrics,” Educational Researcher 48, no. 4 (2019): 217–28, https://doi.org/10.3102/0013189X19848729.
[26] Although the performance index is calculated on a 0–120 scale, it is also reported in Ohio report card data in percentage terms, which facilitates interpretation and comparisons across schools and over time.
[27] As Tables D8 and D9 of Appendix D indicate, the increase in the performance index is primarily in 2022–23, and the average effect when 2021–22 and 2022–23 are pooled is about half as large. These results parallel those from the value-added analysis. If school improvements had occurred in 2019–20 and 2020–21, the mean effect for the performance index would have been larger and would not have changed so significantly between 2021–22 and 2022–23.
[28] See the 2022–23 estimates in Tables D8 and D9 in Appendix D. All models in this study capture changes in outcomes since 2018–19 for schools that narrowly qualified and those that narrowly failed to qualify across the entire program history. Although value-added estimates are limited to learning that occurred within a given school year (2021–22 or 2022–23, which the analysis above averages together), the performance index captures a school-wide achievement level that can be compared between two points in time. Because the estimates for the performance index are insensitive to controlling for changes in student composition and the results line up with those of the value-added estimates, it is reasonable to conclude that student learning in QCSSF schools increased by approximately seven percentage points—about half of the standard deviation in charter school performance—over the life of the program.
[29] See Tables D2 and D3 in Appendix D to get a sense of trends among schools close to the QCSSF performance cutoff (within 0.3 points). This does not exactly capture schools at the cutoff, but it comes close.
[30] Seth Gershenson, “Linking teacher quality, student attendance, and student achievement,” Education Finance and Policy 11, no. 2 (2016): 125–49; Michael A. Gottfried and Ethan L. Hutt, editors, Absent from school: Understanding and addressing student absenteeism (Cambridge, MA: Harvard Education Press, 2019); C. Kirabo Jackson, “What do test scores miss? The importance of teacher effects on non–test score outcomes,”
Journal of Political Economy 126, no. 5 (2018): 2072–107; Evan K. Rose, Jonathan T. Schellenberg, and Yotam Shem-Tov, “The effects of teacher quality on adult criminal justice contact” (working paper no. 30274, National Bureau of Economic Research, July 2022), https://www.nber.org/papers/w30274; Long Tran and Seth Gershenson, “Experimental estimates of the student attendance production function,” Educational Evaluation and Policy Analysis 43, no. 2 (2021): 183–99.
[31] As Table D1 in Appendix D reveals, for schools near the QCSSF cutoff, attendance and chronic absenteeism were 91 and 31 percent, respectively, just prior to the pandemic (as of 2018–19). As Table D3 reveals, even among QCSSF schools near the cutoff, attendance rates had declined to 88 percent and chronic absenteeism rates had increased to over 50 percent. Thus, as with student achievement, one might characterize the impact of funding as mitigating the pandemic’s negative impact on student attendance.
[32] Tran and Gershenson, “Experimental estimates of the student attendance production function.”
[33] The impact of $1,000 in spending (as opposed to the full QCSSF award amount) is evident in the results of the “reduced form” models in Tables D5 and D6 in Appendix D.
[34] Jackson and Mackevicius, “What Impacts Can We Expect from School Spending Policy?”
[37] Jackson and Mackevicius, “What Impacts Can We Expect from School Spending Policy?”
[38] Lavertu, Ohio charter schools after the pandemic.
[39] The RD plots in Appendix E illustrate this divergence in charter school effectiveness. Figure E1 reveals that schools near the QCSSF performance cutoff had value-added scores of about zero as of 2018–19, which means that they were roughly comparable in effectiveness to the average Ohio public school. However, in 2022–23, schools that narrowly qualified had value-added scores of approximately 0.05 (indicating that they have gained ground on the average Ohio public school) whereas those that narrowly failed to qualify had scores of -0.05 (indicating that they lost ground compared to the average Ohio public school). These losses are driven by schools that just missed the award. If one omits these schools (using a “donut hole”) or lessens their pull by pooling the value-added estimates across 2021–22 and 2022–23 (as the analysis does), then the losses appear less pronounced (see Figure E2)—though the local means still indicate larger losses among narrow losers than gains among narrow winners.
[40] Dan Goldhaber, Grace Falken, and Roddy Theobald, “ESSER Funding and School System Jobs: Evidence from Job Posting Data” (Working Paper No. 297-0424, Center for Analysis of Longitudinal Data in Education Research, Arlington, VA, 2024), https://caldercenter.org/sites/default/files/CALDER%20WP%20297-0424.pdf.
[41] As of the 2018–19 school year, just prior to the implementation of the QCSSF program, teacher salaries were, on average, nearly $30,000 higher in traditional public schools than in charter schools in the same district. See online appendix for Lavertu and Tran, “For-profit milk in nonprofit cartons?”
[42] Tight labor markets also raise the possibility that charter schools with QCSSF funding outcompeted nearby charters for teachers, thus simultaneously enhancing their performance while driving down the performance of non-QCSSF schools.
[43] Elena Doty et al., “What do changes in state test scores imply for later life outcomes?” (Working Paper No. 30701, National Bureau of Economic Research, Cambridge, MA, 2022), https://www.nber.org/papers/w30701; Eric A. Hanushek, Jens Ruhose, and Ludger Woessmann, “Knowledge Capital and Aggregate Income Differences: Development Accounting for U.S. States,” American Economic Journal: Macroeconomics 9, no. 4 (2017): 184–224.
[44] Carolyn Abott et al., “School district operational spending and student outcomes: Evidence from tax elections in seven states,” Journal of Public Economics 183 (2020): 104142, https://doi.org/10.1016/j.jpubeco.2020.104142.
[45] C. Kirabo Jackson et al., “School Effects on Socioemotional Development, School-Based Arrests, and Educational Attainment,” American Economic Review: Insights 2, no. 4 (2020): 491–508; Rose, Schellenberg, and Shem-Tov, “The effects of teacher quality on adult criminal justice contact.”
Grade inflation is the increase in average grades awarded to students over time, absent higher academic achievement. When students get higher grades without having learned more, their grades are being inflated. The changing meaning of grades distorts the central measure of student academic performance, as various stakeholders, from parents to college admissions officers, lose valuable information that can be used to allocate resources, identify students’ strengths and weaknesses, or hold students accountable for their schoolwork.
Worries about whether course grades are communicating distorted information are nothing new. Educational psychologist Guy Montrose Whipple wrote in 1913 that the “marking system” was “an absolutely uncalibrated instrument.” In our day, study after study shows that grades are soaring. Data from the ACT show that, in 2021, the composite score was the worst of any year reported going back a decade, but that same year, ACT test-takers logged the highest average grade point average (GPA) ever recorded. Since the beginning of the pandemic, even students at the 25th percentile of ACT performance (i.e., students performing far below average) have boasted GPAs above 3.0. That means these low-performing students received more A grades than Cs, Ds, and Fs combined. A pre-pandemic Fordham report that used data from North Carolina showed that more than one-third of students receiving B grades in Algebra I failed to achieve proficiency on the end-of-course exam, and post-pandemic studies from North Carolina and Washington State have shown further upticks in grade inflation in the wake of the pandemic.
Since grades have no intrinsic meaning, it can be tempting to believe that grade inflation doesn’t really matter. For example, a 2013 article in Educational Researcher argued that the “sky was not falling” when it came to grade inflation because cumulative GPA continued to be a good statistical predictor of college success. Princeton economist Zachary Bleemer has advanced the controversial argument that grade inflation is often good because students feel better about receiving good grades and that makes them more likely to persist with their education. Yet grade inflation apologists ignore the longer-term consequences of debasing the value of grades. In the moment, a higher grade feels good, even if unearned, but as everyone’s expectations around grades are updated, ever higher grades are needed to achieve the same good feelings. Ultimately, the fact that there is no grade higher than an A means that rising GPAs lead to “grade compression,” where the overwhelming majority of GPAs fall within a narrower and narrower range. That’s most likely why a 2024 report found that test scores are now better predictors of academic success than grades, at least for the elite colleges included in the study.
But college admissions officers aren’t the only stakeholders in the grading system. Especially in earlier grades, parents are even more important. They can offer their children resources, such as tutoring, or help hold them accountable for their academics. To judge how their children are doing in school, parents tend to focus on grades given by teachers rather than other measures. Learning Heroes, for example, has surveyed parents and found that grades on report cards are the most important factor for parents when determining whether their children are succeeding at school. As grades have risen, the connection between parental views of their children’s achievement and actual academic achievement has come undone. For example, several post-pandemic surveys by Learning Heroes have shown that, even in this era of devastating learning loss, about four in five parents say their child is taking home mostly A’s and B’s. As a recent report issued jointly by TNTP and Learning Heroes put it, “grades are sending signals that students are doing well at a time when there is serious reason for concern.”
It is common sense that students will study less and learn less when they have less motivation and pressure from their parents, and academic studies bear out this intuition. A 2020 Fordham Institute report used data from North Carolina to show that students assigned to Algebra I teachers who awarded relatively high grades learned less in the course. In fact, students with more lenient Algebra I teachers performed worse on assessments for subsequent math courses, such as Geometry. (An earlier study of elementary classrooms by economists David Figlio and Maurice Lucas showed a similar pattern.) A 2010 study of college students by economist Phillip Babcock shows how student effort is a crucial link between grading standards and student outcomes: When students perceive a course as being graded more strictly, they report spending more time studying for the course. In other words, students themselves respond to grading pressures, modifying their studying behavior.
In practice, grade inflation can actually exacerbate educational inequalities. After all, if students in less affluent schools receive higher grades that do not reflect their actual performance, as research indicates is happening, they may not receive the support and interventions they need to improve. A 2023 working paper used North Carolina data to show that, when a more lenient grading scale was introduced, short-term achievement was mostly unaffected but absenteeism ticked up among initially lower-performing students, indicating decreased engagement.
Keeping grade inflation in check
We aren’t going back to an age when the average high school GPA was 2.0 (see Figure 1), just as we aren’t going back to an economy where a vending machine Coke costs a dime. As long as we preserve the current A-to-F grading scale, probably the best we can hope for is to reinstitute the norms that were in place in recent memory, such as immediately prior to the pandemic, and hold the line thereafter.
Figure 1. Mean GPA of students taking the SAT in the 1960s and 1970s, by sex and year.
To keep grade inflation within bounds, policymakers and educators should rely on three principles:
1. Develop external checks on grades.
Grade inflation is often invisible, as it is only possible to observe when grades diverge from other measures of learning. Analysts can use a range of metrics—and can even conduct in-depth academic audits—but end-of-course exams are especially valuable as external checks. When students take external exams over the same content as a course, the results provide a more objective measure of their mastery of the material, enabling analyses that can help detect grade inflation that would otherwise go unnoticed.
2. Promote transparency in grading standards.
Once grading standards are identified, the information has to be put to use. Placing grades within their context, especially by comparing them with other measures of learning, generates new data that should be provided to parents, circulated among teachers, and in some cases, included on student transcripts.
3. Avoid inflationary grading reforms.
Reforming grading has been a hot topic in recent years. Employing rubrics for grading, implementing anonymous grading, and eliminating fluffy extra credit points are all promising reforms, and importantly, they neither raise nor lower students’ grades. (Eliminating some extra credit probably raises academic standards.) These particular grading reforms stand in contrast to more controversial ones, such as eliminating penalties for late work, arbitrarily assigning grades of 50 in place of zeros, and removing all deadlines. The latter practices all weaken student accountability and further fuel grade inflation.
—
Grade inflation may seem like an abstract challenge, one that is far removed from the question of how best to operate effective schools. In fact, grading policies influence the signals grades communicate and thus the environment in which teachers teach and students study, shaping their behavior and relationships. By utilizing external assessments, increasing transparency, and supporting reforms that preserve the rigor of grading standards, we can stabilize the value of letter grades and ensure that their meaning is clear to all stakeholders.
Predicting the future is often compared to reading tea leaves. In the case of forecasting what education policies Vice President Kamala Harris might pursue as president, though, a more apt analogy might be reading her mind. Frankly it’s anyone’s guess what her education policies would be, given how few clues we have.
It wasn’t always this way. Past presidential candidates laid out detailed plans for schools, dating back at least to Kennedy and Johnson. George H. W. Bush wanted to be the education president. Bill Clinton wanted to use stronger schools to build a bridge to the twenty-first century. George W. Bush wanted to leave no child behind and move the Republican party in a more compassionate direction. Barack Obama wanted Democrats to break with teachers unions by embracing merit pay. Each of them amplified these slogans with various plans and proposals.
But in more recent cycles, education has dropped from the list of voters’ top-tier issues, and candidates have become increasingly cagey about their plans.
Donald Trump’s administration was known for its advocacy of school choice, but that wasn’t something he talked much about on the campaign trail in 2015 or 2016. It only came into focus with his selection of Betsy DeVos as secretary of education.
And Joe Biden’s unwillingness to challenge progressive orthodoxy on education would have been hard to predict, given his moderate persona in 2019 and 2020. What turned out to be the best guide to his education policies was his self-identity as the “most union-friendly president in history”—plus the membership of his wife, community college professor Jill Biden, in the National Education Association.
So here we are with another election in which education issues are barely registering, trying to predict what Harris might do if elected. She has said even less than Trump or Biden, partly because of the truncated nature of her campaign, and partly because of her strategy of leaning into positive vibes and declining to offer policy specifics in the hope that doing so will better her chances of prevailing in November. Official statements—a Harris campaign policy document and the Democratic Party Platform—are thin on details.
Making things even harder is Harris’s well-known willingness to run away from previous positions. She did that in 2019 when the Black Lives Matter movement made it awkward for her to embrace her record in law enforcement—including her tough stance on prosecuting parents of truant children.
That’s why looking at Harris’s statements from the campaign trail five years ago or her record as a U.S. senator only goes so far.
What we do know is this: She’s the sitting vice president. She has positioned herself in the middle of the Democratic Party, not wanting to break with progressives on the left or business-friendly centrists in the middle.
And while her image is not blue-collar like Biden’s, she’s been careful not to put any sunlight between herself and the unions, including teachers unions. One of her first speeches as the presumptive Democratic nominee was to the American Federation of Teachers, the very first union to endorse her.
For these reasons, it is likely that a Harris administration would bring significant continuity with Biden’s policies, including on schools.
Picture her appointing a former teacher as secretary of education, proposing healthy increases in school spending, and speaking out against privatization, book bans, and the like. Call it the Hippocratic Oath approach to Democratic policymaking on education: First, do no harm.
Can those of us involved in K–12 education hope for bolder strokes from a President Harris—including some that might move the needle on reform? Of course one can hope.
The most significant play we might anticipate, though, could be on teacher pay. Boosting teacher salaries by $13,500 per year (to close the gap with other professionals) was the centerpiece of her education agenda when she ran for president in 2019.
She has a ready vehicle to pursue it, thanks to the looming expiration of Trump’s 2017 tax cuts, which makes new legislation around tax reform a must-pass item for Congress next year. The most straightforward way for the federal government to put more money into teachers’ pockets isn’t through a complicated grant program to states and districts, but via tax credits that would flow directly to educators.
A major teacher tax credit could quickly get expensive, however, given the size of America’s teaching force (3 to 4 million depending on how you count it). At, say, $10,000 per teacher, that’s $30 to $40 billion a year—in the neighborhood of what we spend on Title I and IDEA combined.
A smarter, more affordable approach would be to target teachers serving in high-need schools—as the student loan forgiveness programs already do. Studies from Dallas and elsewhere find that great teachers will move to high-poverty schools—but only if offered significantly higher pay—in the neighborhood of $10,000 more per year.
We also know that when we pay teachers the same regardless of where they teach—the policy of almost every school district in the country—the neediest schools end up with the least-experienced teachers.
A tax credit for teachers in Title I schools—which get government funding for having high numbers or high percentages of students from low-income families—could transform the profession overnight, significantly narrowing the teacher quality gap, school funding gap, and perhaps, the achievement gap, too.
Given Democrats’ interest in boosting the “care economy,” perhaps such a tax credit could flow to instructors in high-poverty childcare and pre-K centers, as well. This would fit well with Harris’s promise to move America toward an “opportunity economy,” including by boosting the pay of childcare and preschool teachers.
Here’s the best news: Senator Cory Booker has already written a well-designed bill to create precisely this sort of teacher tax credit, with most of the benefits flowing to teachers in the highest-poverty district and charter schools. When the tax reform sausage is being made next year, this bill should be thrown into the mix.
Still, a big effort on “differential pay” for teachers might be just one wonk’s wish-casting. We’ve had two presidential administrations in a row with little action on K–12 education. It’s quite likely that a Harris administration would be a third.
But here’s hoping for a pleasant surprise after November.
The American dream is still alive and can be achieved in just one generation, even among the most economically disadvantaged young people. That finding is among the most promising takeaways from new research produced by Harvard University’s Raj Chetty and his collaborators. These analysts studied economic outcomes for 57 million children born between 1978 and 1992, whose income was then measured in adulthood at age twenty-seven, from 2005 to 2019, and found gains across racial lines.
“Earnings increased for Black children at all parental income levels, reducing white-Black earnings gaps for children from low-income families by 30%,” say the paper’s authors. It is encouraging that the Black-white gap in upward mobility shrank so significantly in just the past fifteen years, even though overall racial gaps remain.
While a person’s race matters less and less, there are community-level characteristics whose presence or absence more directly drives individual prospects of upward or downward mobility. Indeed, this study finds a neighborhood-level factor that principally drives access to the American dream.
Lead researcher Benjamin Goldman explained at a Brookings Institution presentation: “In economics, there is something called a one-factor model where…a single variable seems to fully explain why outcomes have changed differently for different race and class groups over the last fifteen years.”
That single variable? The rate of working parents in a given community.
Communities that experienced increases in parental employment had the largest increases in household income for children when they reached age twenty-seven, regardless of the employment status of a parent in an individual child’s household.
This is a substantial finding, but not entirely new. Sociologist William Julius Wilson showed in two of his seminal books, “The Declining Significance of Race” and “When Work Disappears: The World of the New Urban Poor,” that unemployment levels within a given community far outweigh race as a determining factor for mobility. In his view, fewer adults in the labor force is a catalyst for many of the social ills that further deteriorate economic opportunity for children.
In 1996, Wilson wrote in “When Work Disappears,” “Many of today’s problems…crime, family dissolution, welfare, low levels of social organization, and so on…are fundamentally a consequence of the disappearance of work.” While Wilson emphasized the devastating impact of joblessness in a community, Chetty’s 2024 study highlights the positive corollary: the constructive effect that low unemployment rates in a particular community have for young people growing up there.
Robert Doar, president of the American Enterprise Institute, highlighted one of the study’s implications on crafting social welfare policy: “There is a real significance to (adult) employment in the community, for kids, for their future upward mobility.... That importance would make you skeptical of cash benefits that don’t have a work connection.”
In other words, policymakers should deemphasize cash transfer programs that provide public dollars directly to individuals, unconditionally. These “no-strings-attached” programs often have the behavioral effect of discouraging work. Instead, pursue programs that require labor force participation in order to receive benefits.
While the study reaffirms that high levels of parental employment in the community are a huge factor in determining future economic mobility for children, there are other forces at play, too. Positive changes in a community’s marriage rate are also correlated with greater earnings power for children who grew up in the community, regardless of the marital status of an individual child’s household.
This is also a substantial finding, and also not entirely new. Chetty’s 2024 insights on the correlation between neighborhood marriage rates and prospects for economic advancement echo those found in his 2014 study entitled “Where is the Land of Opportunity?”
The study described the United States as a “collection of societies, some of which are ‘lands of opportunity’ with high rates of mobility across generations, and others that experience persistent inequality, places in which few children escape poverty.” Among the findings was that “the strongest predictors of upward mobility are measures of family structure such as the fraction of single parents in the area.”
Knowing that high rates of work and marriage among adults likely drive upward mobility for children, what should we do? Mandating work requirements in order to receive government benefits is an effective lever that can be deployed today to incentivize greater adult employment, especially in communities where parents’ employment rates are low. But increasing marriage rates is far trickier and out of reach for most public policy, especially for adults who are already struggling.
But what of the rising generation? Young people today are turning away from marriage and subsequent family formation as a formal life course. According to a 2023 Pew Research Survey, among all adults ages eighteen to thirty-four, only 20 percent say being married is extremely or very important for people to live a fulfilling life, and only 22 percent say having children is extremely or very important. These attitudes have played out in practice: America and other nations continue to experience plummeting birth rates and marriage rates.
Moreover, a growing mass of able-bodied young men, and some young women, are simply opting out of the labor force, as my American Enterprise Institute colleague Nicholas Eberstadt has discovered.
Unless these trends are reversed, a generation from now, we will have recreated the same types of low-work, low-marriage-rate conditions in communities that “experience persistent inequality, places in which few children escape poverty.”
It does not have to be this way.
Young people between the ages of fourteen and twenty-four, during what I call the “Deciding Decade,” are making critical decisions about relationships, work, family formation, education, faith and more. The choices made during this timeframe, while not completely determinative, will have huge influence over a child’s life trajectory of happiness, economic outcomes, and ultimate fulfillment—where one chooses to go to school and how hard they work, what job they seek, what friendships they pursue, who they fall in love with, what religious faith they choose to practice or not, and so forth.
The more these choices are made with conscious volition, the greater the likelihood young people will lead self-determined lives of meaning and purpose. Three policy areas could facilitate these outcomes.
First, we should be advocating for more educational freedom, which encompasses school choice like public charter schools and the types of vouchers, tuition tax credits, and educational savings accounts that can be used by parents to access private or parochial schools and craft customized educational experiences for their children.
Second, students—particularly in high school—should be disabused of the notion that college is the only immediate option after high school. The model of secondary school education should be expanded to include work/study apprenticeship-type models that allow students in their junior and senior years to earn industry credentials in skilled trades like carpentry, welding, or electrical, or pursue high-growth careers that don’t necessarily require a college degree, like entry-level opportunities in health care or computer science.
Finally, to shift attitudes and enhance the attractiveness of marriage, schools should integrate curricula that encourage students to develop healthy relationships and make key life decisions based on the data associated with research often referred to as the “Success Sequence”—that 97 percent of millennials who have completed their education, secured full-time work, gotten married, and then had children (in that order) avoid poverty and enter the middle class or beyond.
To reiterate: Reports of the death of the American dream have been greatly exaggerated. But keeping it alive requires policymakers and elected officials to be more hard-nosed about demanding work today in return for public benefits, while also using the bully pulpit to shift the rising generation’s attitudes toward marriage and family formation tomorrow.
Rebuilding opportunity is a national challenge, but change happens locally. It requires buy-in from institutions, including families, schools, houses of worship, civil society organizations, and businesses to support local solutions, particularly around strengthening work and marriage.
The good news, according to this study, is that breaking the intergenerational cycle of poverty is achievable within the span of just fifteen years, less than one generation. Rather than being told systems are rigged for failure, young people—especially those who are struggling—should know that their lives can be their own.
Editor's note: This was first published by Deseret News.
New reports show that charter schools reduce academic inequality and raise the overall quality of public schools—and that opening more of them will continue this virtuous improvement cycle. —Bruno V. Manno, Forbes
A Progressive Policy Institute study reveals that “cities with high charter school enrollment have consistently improved achievement for low-income students” in both charters and traditional public schools. —Reason
Jeers
A local Democratic Party leader in New Jersey secured two public school jobs while serving time for federal tax fraud, underscoring persistent concerns about corruption in the state. —New York Times
Nevada’s new AI system for identifying at-risk students has dramatically decreased the number of students identified, leading to funding cuts and ethical questions over who deserves additional support. —The New York Times
Arizona offers school vouchers for all—but barriers such as the additional costs of transportation, tuition, and meals are discouraging many low-income families from using them. —Eli Hager and Lucas Waldron, ProPublica
The rapid growth of higher education in developing nations has led to a mismatch between the number of graduates and available jobs, contributing to high rates of unemployment and illegal migration among the college-educated population. —Jon Emont, Wall Street Journal
A new study suggests that charter school expansion and abuse scandals have led to the decline of Catholic schools in recent decades. Could ESAs put a stop to this trend? —The 74
The control variables typically included in observational studies are not enough to account for the complex factors that motivate human choice, suggesting that causal claims made in such studies may be unreliable. —Emily Oster, Parent Data
Trump’s vision for education, which includes both closing the Department of Education and expanding the federal role to review curricula for “wokeness,” is contradictory and difficult to implement. Harris’s is too vague, with few specifics and a focus instead on attacking Trump’s plan. —Laura Meckler, The Washington Post
We learned in yesterday’s meeting of the Columbus City Schools’ board that they have hired an outside consultant to help them move forward on their school closure process. The Dispatch article is concerned about how much he’ll be charging (whatever it is, it’s too much; I would have done it for you for bus fare). Of more concern to me are the wishy-washy comments of several elected board members regarding the process writ large—which has already gone on too long with nothing to show but a list of school names on paper and a whole lot of electronic ink spilled. They are making things too complicated, I reckon, and I get the impression it’s being done that way on purpose. (Columbus Dispatch, 10/16/24)
Speaking of overcomplication, the elected board that runs Dayton City Schools has engaged some outside folks for a project—at what seems a reasonable price, given other recent district spending patterns—but this still feels like a waste of time and money to me for several reasons. The stated goal is “creating a space where students in the district can graduate ready to work in well-paying jobs in the area,” which, if you ask me (they did not—no one ever does), actually has a very simple solution that can be implemented without spending any additional money: Teach your kids to a higher standard. Take it from me for free—if your high schoolers are good at math and writing and science, they’ll find colleges and employers clamoring to claim them before they even graduate. The other problem is that the actual goal—what their consultants will be producing for them—is some kind of “durable system” (of…something) that is intended to stay in place even as boards and superintendents and central office staffers change. As the current supe explains it, “[W]hat can we put in place that outlasts people?” To which I say: All the money in the world will not buy Dayton City Schools anything like that, sir. And even if you get it, it will have no positive impact on your students whatsoever. (Dayton Daily News, 10/15/24)
Fordham recently published results from a parent survey on educational opportunity in Ohio. Produced in partnership with 50CAN and Edge Research, the nationwide survey collected feedback from more than 20,000 parents and guardians of school-aged children in all fifty states and the District of Columbia. In Ohio, the survey was completed by 408 parents. Here’s an overview of three significant takeaways from the Ohio-specific results.
1. Ohio parents have access to school choice and like it.
Over the last few decades, Ohio policymakers have consistently expanded school choice options for families. The most recent state budget, for example, broadened voucher eligibility, increased funding for charter schools, and improved transportation guidelines (though recent events prove there’s still plenty of work to do on that front). According to survey results, those efforts have paid off. On average, 68 percent of parents feel like they have a choice in what school their child attends, slightly higher than the national average of 65 percent. And more than two-thirds (67 percent) say they would make the same choice again—a telling sign of satisfaction with the options they’ve chosen.
Even better, a whopping 71 percent of low-income parents in Ohio reported feeling like they have a choice compared to 61 percent nationally. Historically, economically disadvantaged families have had the least choice. That’s because high-performing districts require residency for enrollment, and real estate prices are typically beyond the financial reach of many families. Private school tuition is also steep. But in Ohio, options like charter schools (which posted stronger academic growth numbers than their district counterparts this year), private school scholarship programs, and open enrollment provide low-income families in all communities—urban, suburban, and rural—with more choices than they would have otherwise. Families can, and often do, choose to enroll their children in their assigned traditional public school. But it matters that Ohio parents say they feel like they have options, because choice shouldn’t be limited by income.
2. Few parents seem to be consuming Ohio’s information on schools.
Each year, Ohio’s school report cards offer families, stakeholders, and policymakers a wide variety of data about school performance and student outcomes. Unfortunately, survey results indicate that many Ohio parents may not realize they have a treasure trove of data at their fingertips. On average, only 28 percent said they reviewed information about their school’s performance compared to other schools. That matches the national average, but twenty-four states registered higher percentages. Even worse, there’s an alarming gap between families of different income levels. More than a third (34 percent) of middle- and high-income parents reported that they reviewed school performance data. But less than a fifth (18 percent) of low-income parents did so. That means low-income families are viewing school performance data at roughly half the rate of higher-income families.
It’s unclear what’s causing this gap. It could be that families are not aware of school report cards, let alone choosing to use them to compare school performance. But it’s also possible that they feel Ohio’s system is difficult to navigate or that online report cards aren’t as user-friendly as they could be. Either way, state and local leaders need to do a better job of communicating with parents, especially those in low-income communities.
3. Most Ohio parents don’t believe their children are ready for the next step.
Ensuring that students are ready for what comes after graduation is a top priority for families and one of the chief responsibilities of schools. But survey results show that Ohio parents have some concerns. Only 39 percent are “extremely confident” that their child will be well equipped to succeed in the workforce, and only 36 percent feel that way about college preparation. On the one hand, both numbers are higher than the national averages of 34 percent for workforce preparation and 32 percent for college, placing Ohio in the top tier of states. But on the other hand, only about a third of Ohio parents believe that their kids are ready for the next step, whether it’s work or higher education. That’s worrisome.
***
Results from parent surveys like this are incredibly useful for policymakers. They pinpoint areas where policies and programs have been successful, like school choice expansion. They also identify areas for growth, like ensuring parents are aware of school performance data and prioritizing initiatives aimed at improving student readiness. Here’s hoping Ohio leaders pay close attention to these results as they head into budget season.