In just a few short months, Ohio lawmakers will be knee-deep in the state budget for FY 2026 and 2027. A large portion of the budget is K–12 education, and Ohio’s school funding model is sure to be a topic of discussion. This model, named the “Cupp-Patterson plan” after its legislative champions former Speaker Bob Cupp and Representative John Patterson, determines how state dollars are allocated to public schools using a complex formula that takes into account student enrollments and districts’ capacity to raise funds locally.
The Cupp-Patterson plan was heralded as a “fair” and “constitutional” replacement for the state’s previous formula, which was roundly criticized as a “patchwork” that was “unpredictable.” The new formula does indeed have strengths and follows some of the right ideas in crafting school funding policy, including centering the system around where students actually attend school, providing additional resources for less advantaged pupils, and allocating state aid in a “progressive” manner to support schools with less local wealth to tap into. But after four years of implementation, which included a period of sky-high inflation, it’s clear there is still room for improvement. I’ve previously discussed these problematic areas in great detail, but a consolidated inventory, along with some updated analysis, could be helpful as policymakers head into budget season.
So, let’s take a look at three of the six items that should be on their fix-it list (a follow-up piece will cover the rest).
Issue 1: Base-cost model and escalating costs
When the Cupp-Patterson plan was first unveiled, the projected cost of full implementation was $2 billion in additional state spending—a roughly 20 percent increase above state K–12 education expenditures at the time. Given the price tag, lawmakers chose to phase in the increase over six years. Thus, public schools have received increased funding over the past four years, though they’re not quite up to the full amounts prescribed by the formula. Even with the phase-in, spending has ramped up quickly under the new formula. Relative to a pre-Cupp-Patterson baseline of $8.5 billion in FY21, formula funding had risen by $1.5 billion as of FY25. All three sectors of the public school system have enjoyed higher funding amounts, with traditional district funding up by $1.1 billion, public charter and STEM schools up by $212 million, and joint-vocational school districts (JVSDs) up by $180 million.
Inflation has taken a bite out of these gains, but growth in state formula aid still outpaces inflation over this period. On a per-pupil basis, the nominal increase has been 19 percent from FY21 to FY25, but adjusting for inflation (using CPI), formula funding has risen by a more modest 3 percent. That said, public schools have also received unprecedented federal Covid-relief aid over the last few years, and these numbers don’t include several non-formula funding streams that have pushed additional dollars into Ohio schools (e.g., literacy funding and increased charter supports).
Figure 1: State formula funding, FY21 to FY25
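To make the inflation adjustment behind these figures concrete, here is a minimal sketch in Python. The starting per-pupil figure and the cumulative CPI factor are illustrative assumptions chosen to reproduce the percentages cited above, not the official data behind Figure 1.

```python
# Illustrative sketch of the nominal-vs.-real comparison described above.
# The starting per-pupil figure and CPI factor are assumptions, not the
# official data behind Figure 1.

nominal_fy21 = 5_000                  # hypothetical per-pupil formula aid, FY21
nominal_fy25 = nominal_fy21 * 1.19    # 19 percent nominal growth (from the text)
cpi_growth = 1.155                    # assumed cumulative CPI growth, FY21 to FY25

real_fy25 = nominal_fy25 / cpi_growth
real_growth = real_fy25 / nominal_fy21 - 1
print(f"Nominal growth: 19.0%; real growth: {real_growth:.1%}")  # roughly 3%
```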
If the past is prologue, lawmakers are looking at another $500 million to $1 billion increase in the upcoming budget in order to continue implementing Cupp-Patterson. Some of the cost would reflect the final two-year phase-in of the plan, but a bigger factor is whether they choose to update the “inputs” of the formula’s base-cost model. This part of the formula includes detailed calculations that rely on average teacher salaries and other items[1] to generate base per-pupil amounts.[2] In the last budget cycle, this update was quite expensive, as salaries and other expenditures rose significantly (thanks again to inflation) and drove the $1 billion increase from FY23 to FY25.[3] Another update would push the overall cost of Cupp-Patterson upward yet again, likely over the $2 billion mark.
The large expense of implementing Cupp-Patterson—and maintaining it via input updates—has always raised questions about its long-term sustainability. So far, lawmakers have been able to dodge the issue of cost because the state’s coffers have been full in recent years. But boom times end, and lawmakers—whether next year or sometime further down the road—may be less inclined to approve such sizeable increases in a sluggish economy. Other factors could also lead to a budget crunch, including a desire to reduce tax burdens or to increase spending on other programs such as childcare, public sector pensions, and healthcare.
Questions remain: What happens to Cupp-Patterson when there’s a pinch? Will lawmakers resort to imposing much-despised “caps” to control spending? Will they skip an input update, punting that decision and its costs to future legislators and thereby throwing a wrench into the formula? Would they strip essential formula elements to reduce its cost? Or will they shortchange other government programs to cover the expense?
It won’t be politically popular, but the next General Assembly should explore options that better control the long-term costs of Cupp-Patterson. One option is to reinstate a straightforward base per-pupil amount that is determined biennially by legislators. Another is to tie the base to the statutory minimum salary schedule instead of using average employee salaries, which can inflate quickly when districts pass levies and their local revenues grow. Neither idea implies flat or reduced funding, and lawmakers could always increase funding levels, something they have consistently done in the past. They might even give a nod to Cupp-Patterson by setting the base amount just above the current level and allowing that to rise consistent with the general inflation rate. These types of changes would give the legislature more flexibility to adjust the state’s main education appropriation without being hamstrung by the prescriptiveness of the Cupp-Patterson base-cost model. And as discussed previously and in my follow-up piece, lawmakers should also trim expensive guarantees from the formula, which provide excess funding to districts with declining enrollments.
Issue 2: State-share mechanism and inflation
The formula includes calculations that determine how much of the base amount the state is responsible for funding and how much is assumed to be funded locally (which includes a state-required minimum 2 percent property tax). These computations are based on property values and resident incomes, and yield districts’ state share percentage (SSP). A low-wealth district will have a higher SSP, and in turn receive more state aid, while a high-wealth district will receive fewer state dollars.
This type of state-share mechanism has long been part of Ohio’s formula and is critical for an equitable system. Yet Cupp-Patterson takes a problematic approach to calculating the measure. The challenge relates to property and income inflation.[4] To illustrate, consider data from Columbus City Schools, whose property values and incomes have risen quite quickly in recent years. From FY22 to FY23, inflation cut the district’s SSP from 37 to 29 percent. In other words, the district suddenly appeared to be much “wealthier”—not necessarily due to greater prosperity, but due to systemwide price increases in the city. The lower SSP led to a drop in base state funding of roughly $500 per pupil in just one year ($2,656 to $2,112 per pupil). Updating the inputs during the last state budget cycle boosted the base amount in FY24, which in turn lifted the district’s SSP.[5] But inflation has again eroded the district’s SSP between FY24 and FY25 and reduced its base per-pupil funding by about $700. As the bottom rows of Table 1 indicate, this issue isn’t limited to Columbus. The vast majority of districts have experienced decreases in their SSPs in the second half of the biennium, as inflation takes a toll.
Table 1: Local wealth measures, SSP, and base per-pupil amounts, Columbus City Schools
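The core arithmetic is simple: base state aid is, to a first approximation, the base amount multiplied by the SSP. The sketch below, with an assumed base amount backed out of the Columbus figures, approximately reproduces the drop described above; the real formula layers on phase-in percentages and other adjustments.

```python
# Stylized state-share arithmetic: state aid ~= base amount x SSP.
# The base amount here is an assumption that roughly matches the Columbus
# figures in Table 1; the statutory calculation is more involved.

base_amount = 7_200   # assumed per-pupil base amount (illustrative)

ssp_fy22 = 0.37       # Columbus SSP, FY22 (from the text)
ssp_fy23 = 0.29       # Columbus SSP, FY23, after local wealth inflated

aid_fy22 = base_amount * ssp_fy22   # ~$2,660 per pupil
aid_fy23 = base_amount * ssp_fy23   # ~$2,090 per pupil
print(aid_fy22 - aid_fy23)          # ~$570, in line with the "roughly $500" drop
```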
The design of the Cupp-Patterson state-share mechanism puts lawmakers in a dilemma. Absent consistent updates to the inputs, general inflation in local wealth systematically erodes districts’ SSPs over time and reduces state funding. But as noted above, updating inputs has been a costly proposition for the state, even with local wealth inflating at a rapid pace. Overall, Cupp-Patterson is so tightly wound that if one part of the system isn’t functioning (for instance, if input updates aren’t made), then other parts can break (SSPs drop).
How to unravel this is no easy task. One possibility is to revisit the Kasich-era approach to the state-share mechanism, which seemed to better handle inflation by anchoring districts’ state shares to the statewide average. Instead of inflation lowering SSPs systemwide, this “relative” approach to gauging local capacity meant that roughly half of districts had increasing state shares, while the other half experienced decreasing state shares.
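To see why a relative measure is more inflation-resistant, consider a toy example: if every district’s wealth grows by the same factor, each district’s position relative to the statewide average is unchanged, so a state share anchored to that ratio doesn’t erode. This sketch illustrates the core idea only; it is not the Kasich-era statute.

```python
import math

# Toy demonstration: uniform inflation cancels out of a relative wealth index.

wealth = {"A": 100_000, "B": 200_000, "C": 300_000}  # per-pupil wealth (illustrative)

def relative_index(district, wealth_by_district):
    statewide_avg = sum(wealth_by_district.values()) / len(wealth_by_district)
    return wealth_by_district[district] / statewide_avg

before = {d: relative_index(d, wealth) for d in wealth}
inflated = {d: v * 1.20 for d, v in wealth.items()}  # 20% across-the-board growth
after = {d: relative_index(d, inflated) for d in inflated}

# Each district's relative position is unchanged, so a state share anchored
# to this ratio would not erode systemwide when prices rise everywhere.
assert all(math.isclose(before[d], after[d]) for d in wealth)
```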
Issue 3: Small districts’ excessive base funding amounts
Under the Cupp-Patterson formula, districts receive variable base per-pupil amounts; there is no single number that applies across the board statewide. While most districts’ base amounts fall within a narrow band around the statewide average, low-enrollment districts enjoy substantially higher base amounts. This happens because of “staffing minimums” tucked inside the Cupp-Patterson base-cost model. In the vast majority of districts, these minimums don’t apply, and their amounts are determined by simply following prescribed staff-to-student ratios.[6] Small districts, however, have their base amounts computed based on a minimum number of staff, even if their actual ratios yield smaller numbers than the minimums.
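Here is a simplified sketch of that mechanism, using only the teacher component described in footnote 6. The ratio, salary, and minimum staff count are illustrative stand-ins, not the statutory values.

```python
# Simplified base-cost sketch (teacher component only; see footnote 6).
# The ratio, salary, and minimum are illustrative, not the statutory values.

STUDENTS_PER_TEACHER = 20          # assumed funding ratio
AVG_SALARY_AND_BENEFITS = 90_000   # assumed average teacher cost
MIN_TEACHERS = 16                  # assumed staffing minimum for small districts

def base_per_pupil(enrollment):
    funded_teachers = max(enrollment / STUDENTS_PER_TEACHER, MIN_TEACHERS)
    return funded_teachers * AVG_SALARY_AND_BENEFITS / enrollment

print(base_per_pupil(2_000))  # ratio binds: $4,500 per pupil
print(base_per_pupil(200))    # minimum binds: $7,200 per pupil, 60 percent higher
```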
Table 2 shows the impact of these minimums on base amounts. The five lowest-enrollment districts have per-pupil amounts that are far higher than the statewide average, with tiny Vanlue school district receiving more than twice the statewide average. On average, the 100 lowest-enrollment districts enjoy base amounts that are just over 20 percent above the state average.[7] Does it truly cost that much more to educate a typical student in a low-enrollment district? Cupp-Patterson proponents argued that such districts incur higher costs because of diseconomies of scale. But at the same time, small districts are mostly located in rural communities where the cost of living is likely far lower than in metropolitan areas. That should decrease the cost of education or at least offset some of the impact of diseconomies.
Table 2: Small districts’ base funding amounts, FY25
Providing some supplemental aid for sparsely populated districts may be warranted, but this level of extra subsidy—hidden inside the base-cost formula—is not a fair or transparent way of allocating funds. Because these minimums essentially reward low enrollment, they also discourage smaller districts from expanding (perhaps through interdistrict open enrollment) or consolidating with a nearby district, as they risk losing state dollars by growing in size. Finally, while small districts benefit from this subsidy, public charter and STEM schools—most of which have enrollments under 1,000 students—do not receive staffing minimums in their base-cost calculation. How is that fair?
* * *
School funding formulas are understandably complicated. But the level of complexity and intricacy in the Cupp-Patterson plan puts school funding on increasingly shaky ground. The large expense of maintaining the plan adds to the uncertainty, as legislators may or may not be willing to continue pushing money through the formula to keep it working. And unfortunately, these aren’t the only school funding issues that lawmakers have to worry about. Stay tuned for a follow-up piece, where we’ll examine three more.
[1] Other inputs include—among other items—average retirement and health-care benefit costs, support and administrative staff salaries, and building operation expenditures.
[2] The base per-pupil amount works with the state share percentage (SSP) to determine the bulk of districts’ state funding. The base amount and SSP also play a significant role in determining the “weighted” categorical funding that districts receive for career-technical education, special education, and English learners.
[3] Initial implementation of Cupp-Patterson (FY22 and FY23) relied on FY18 input data; in 2023, lawmakers approved an update to FY22 data for FY24 and FY25. Another update to the inputs in 2025 would presumably use FY24 data for FY26 and FY27.
[4] Inflation here can refer to growth in “real” income or property valuation (e.g., real estate development) or to growth related to general increases in prices and wages.
[5] This occurred because the increase in the base amount from FY23 to FY24 outpaced the increase in the local wealth measures (i.e., “local capacity” amount).
[6] These ratios determine, for instance, how many teachers a district is funded for in the base-cost model. The number of prescribed teachers is multiplied by the average teacher salary and benefits to yield a dollar amount. These ratios do not determine the actual number of teachers or staff members a district must employ, but are used solely for the purposes of funding calculations.
[7] Above roughly 700 students, the base amounts move closer to the state average, as the minimums no longer apply.
In a recent post, I examined three issues with Ohio’s school funding formula—a.k.a., the Cupp-Patterson plan—that Ohio lawmakers should focus on as state budget talks begin early next year. In this piece, we’ll examine three more formula elements that require attention.
Issue 4: Continued use of guarantees
Guarantees are excess dollars that shield districts from funding reductions otherwise prescribed by the formula. They typically come into play when districts are losing students and/or becoming wealthier, trends that should lead to a reduced need for state aid under a fair funding formula. But instead of funding schools according to the formula, guarantees provide excess dollars to districts based on their historical funding levels and are often used as a political compromise to shield districts from cuts. At a recent committee hearing, Representative Beryl Piccolantonio noted that “in the final phase-in of the Cupp-Patterson plan, there are not supposed to be guarantees.” She’s right. The plan was routinely billed as one that would finally move Ohio away from guarantees, which were part of the “patchwork” formula that proponents of Cupp-Patterson—and others, including us at Fordham—had long decried.
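Mechanically, a guarantee is just a funding floor. A minimal sketch, with hypothetical dollar amounts:

```python
# A guarantee is a funding floor: the district is paid the greater of its
# formula amount and a historical base. The figures below are hypothetical.

def funding_with_guarantee(formula_amount, guarantee_base):
    payment = max(formula_amount, guarantee_base)
    guarantee_dollars = payment - formula_amount  # the excess above the formula
    return payment, guarantee_dollars

# A district whose enrollment losses cut its formula amount to $9 million,
# but whose guarantee preserves a $12 million historical funding level:
print(funding_with_guarantee(9_000_000, 12_000_000))  # (12000000, 3000000)
```

Cupp-Patterson was supposed to phase these floors out.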
And yet, they have not disappeared. Four years into implementation of the Cupp-Patterson plan, almost two in five districts (240 out of 611) are receiving extra funding via one of the three main guarantees this year,[1] which collectively cost the state $379 million. As Table 1 shows, the number of districts on guarantees, and the amount spent, actually went up this year compared to last. If one goes back further to the Kasich-era formula, Ohio is now spending two to three times more on guarantees than it used to (see figure 7 of this report for FYs 2014–17 guarantees). Rather than solving the guarantee problem, we’ve moved backward.
Table 1: Guarantee funding in FY24 and FY25
East Cleveland is by far the largest beneficiary of guarantees, receiving a stunning $8,517 per pupil this year in guarantee funding alone. This largesse sits atop the district’s general formula funding, which amounts to $17,267 per pupil—already multiples above what a typical district receives in state aid. Why is East Cleveland so heavily subsidized? One clue is its enrollment, which has been in freefall over the past few years. In FY20, the district enrolled 1,770 students, but that number was down to just 1,114 students in FY25, a drop of 37 percent. Guarantees have shielded East Cleveland—and other shrinking districts—from reductions that should otherwise occur due to enrollment loss.
Guarantees short-circuit the formula and lead to unfair and inefficient allocations of state aid. They also consume dollars that could be used to boost funding in other critical areas of education, like funding for low-income students or bold statewide initiatives such as literacy or (hopefully, soon) numeracy reforms. To their credit, proponents of Cupp-Patterson were concerned about guarantees, and suggested they would eventually vanish. In fact, two of the guarantees are actually called “transitional.” In the next budget, lawmakers should complete the “transition” by removing guarantees and moving all districts onto the formula. That’s the truly fair way to fund schools.
Issue 5: The DPIA mess
At a press conference unveiling his school funding proposal in 2019, Representative John Patterson admitted that “we have been unable to fully define what ‘economically disadvantaged’ is.” He seemed to be alluding to challenges that Ohio has had in identifying low-income students. Most notably, this includes the problem of relying on free-and-reduced-price meals eligibility as a marker for student poverty. As discussed on this blog before, Ohio began implementation of the Community Eligibility Provision (CEP) about a decade ago. This federal initiative allows mid- and high-poverty districts and schools to provide subsidized meals to all students. While this makes sense from a logistics and nutrition perspective, CEP has thrown a wrench in Ohio’s data on economically disadvantaged students. During the 2023–24 school year, sixty-nine districts reported universal disadvantaged rates (>95 percent), as did 1,020 individual schools (about one in three). But not every single student attending these districts and schools was actually low-income. Some were from higher-income households, and because they received subsidized meals by virtue of their schools’ participation in CEP, they were flagged as disadvantaged.
These identification problems have major school funding consequences. Economically disadvantaged enrollments are used to determine districts’ disadvantaged pupil impact aid (DPIA). This funding stream is supposed to target extra state aid to meet the needs of low-income students. But due to the inflated headcounts, DPIA no longer effectively drives dollars to the highest-need schools. Some mid-poverty districts—those with a mix of students from varying economic backgrounds—are now receiving just as much DPIA funding as districts with more concentrated poverty. For instance, Euclid school district near Cleveland receives the same DPIA per pupil as the Cleveland school district because both participate in CEP and report 100 percent economically disadvantaged rates. Yet we know from Census data,[2] as well as pre-CEP data, that Euclid enrolls fewer low-income students than Cleveland.
Yet another issue that’s surfacing as CEP continues to expand across the state[3] is that high-poverty districts are now seeing their DPIA funding cut. It’s easiest to see using Cleveland as an illustration. Table 2 shows that the state average disadvantaged rate has moved upward, as more schools participate in CEP and report 100 percent rates. This causes the economically disadvantaged “indexes” of high-poverty districts to fall. Because these indexes are critical to determining funding, the actual DPIA amounts fall with them. Cleveland, for instance, has seen its DPIA funding cut by $249 per pupil within the past year. This hasn’t gone unnoticed by Cleveland administrators, and it’s likely to get the attention of other high-poverty urban districts and charter schools that face a similar situation.
Table 2: Cleveland Metropolitan School District’s DPIA index and funding amount, FY24 and FY25
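A stylized version of the index dynamic makes the problem plain: if a district’s reported rate is already pinned at 100 percent, any rise in the statewide average mechanically shrinks its index, and its aid falls with it. The index definition and dollar figure below are simplifications, not the statutory DPIA calculation.

```python
# Stylized DPIA index dynamic. The statutory calculation is more involved;
# this only shows why a rising state average cuts aid to districts already
# reporting 100 percent disadvantaged rates.

DPIA_BASE = 500  # assumed per-pupil DPIA dollar figure (illustrative)

def dpia_per_pupil(district_rate, state_avg_rate):
    index = district_rate / state_avg_rate
    return DPIA_BASE * index

# A Cleveland-style district stuck at a 100 percent reported rate:
print(dpia_per_pupil(1.00, 0.50))  # state average at 50% -> $1,000 per pupil
print(dpia_per_pupil(1.00, 0.55))  # average rises to 55% -> ~$909 per pupil
```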
The next General Assembly needs to clean up the DPIA mess. To do this, lawmakers must do two things. First, they should require the Department of Education and Workforce (DEW) to identify low-income students based on direct certification, a process that’s already in use and identifies disadvantaged pupils through their families’ participation in SNAP, TANF, or Medicaid. Second, they should use direct-certification numbers to allocate DPIA funds, instead of the inflated economically disadvantaged rates based on meals eligibility. These steps would strengthen the equity of Ohio’s funding system by ensuring that low-income students are accurately identified and receive the resources they need from the state.
Issue 6: Weakened incentives for interdistrict open enrollment
Ohio has long had an interdistrict open-enrollment policy that allows students to attend district schools outside of the district where they live. Students benefit from this option, as they can attend public schools with programs and courses that may better fit their needs and interests or that aren’t offered by their home district. Other students may choose to attend school near extended family, or even one that is located closer to their home if they live near a district boundary. Fordham research has found that students who participate in open enrollment make academic gains, especially minority pupils who leverage the program to access higher-quality schools.
While most districts participate in open enrollment, they are not required to do so under law—and, alas, many suburban districts do not open their doors to non-residents. In such a voluntary program, the financial incentive to participate in open enrollment matters greatly. Under the state’s previous funding model, districts received $6,020 per open-enrollment student from the state.[4] That system made the financial incentive crystal clear, and the amount likely covered the incremental cost of serving additional students.
Cupp-Patterson, however, changed course on open-enrollment funding. Instead of providing a simple, flat amount that applies across the state, the model now funds open enrollees at the state share of the base per-pupil amount of their district of attendance. Since that amount varies widely across districts and can only be found by digging deep into state funding reports,[5] this change muddies the incentive to allow open enrollment. Worse yet, this approach results in declining per-pupil funding for open enrollees in many districts, particularly higher-performing (often wealthier) ones where students stand to gain the most if they attend. The reason for the slippage is that state share percentages (SSPs) are now applied to the base amount.
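The difference between the two rules is easy to see in a sketch. The base amounts and SSPs below are illustrative, not actual district values.

```python
# Old flat-amount rule vs. Cupp-Patterson's rule for open-enrollment funding.
# Base amounts and SSPs are illustrative, not actual district values.

OLD_FLAT_AMOUNT = 6_020  # FY21 per-pupil amount under the prior formula

def cupp_patterson_rule(attending_base, attending_ssp):
    # State share of the attending district's base per-pupil amount
    return attending_base * attending_ssp

# Hypothetical wealthy suburban district: high base, low state share
print(cupp_patterson_rule(8_500, 0.15))  # ~$1,275 per open enrollee vs. $6,020 before

# Hypothetical low-wealth rural district: the state covers most of the base
print(cupp_patterson_rule(8_000, 0.65))  # ~$5,200 per open enrollee
```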
Table 3 shows how this plays out using Franklin and Scioto County districts as illustrations. Under the old formula, all six of these districts would receive $6,020 per open enrollee. Now, however, they receive variable amounts. In the wealthier suburban Columbus districts, these amounts are much lower under Cupp-Patterson, which offers them almost no incentive to participate in open enrollment. Rural Scioto County districts are less impacted by the change—they are less wealthy and receive higher amounts of state aid—but they still receive somewhat reduced open-enrollment funding. Remember, these comparisons are against an FY21 baseline; had the old formula been continued, the base amount would almost surely be higher today.
Table 3: Open-enrollment funding under the old formula (FY21) and Cupp-Patterson (FY25)
There is anecdotal evidence that some districts are responding to these weakened incentives by curtailing open enrollment. As a result, participation has slid over the past four years. In FY22, the state reported 79,823 open enrollees, but in FY25, just 74,557 students are in the program. This decline comes in the wake of years of consistent growth in open enrollment under the prior funding model.
State lawmakers should address open-enrollment funding to keep this option on the table for Ohio students. They have a couple of policy options that would fix the problem. First—and more ideally—they could once again use the full base amount to fund open-enrollment students ($8,242 per pupil is the statewide average this year). This would make the incentive for open enrollment clear and increase funding amounts. Alternatively, legislators could fund open enrollees at the higher of the home district’s or the attending district’s state share of the base amount. While more complicated, this approach would ensure that students from, say, Cleveland or Akron, aren’t funded at small amounts when they open enroll in a suburban district.
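In code, the second option is a one-line change to the sketch above; the SSPs here are again illustrative.

```python
# Sketch of the "higher of" option: apply the larger of the home or
# attending district's SSP to the base amount. Illustrative numbers only.

def higher_of_rule(base, home_ssp, attending_ssp):
    return base * max(home_ssp, attending_ssp)

# A student from a high-SSP urban district enrolling in a low-SSP suburb
# carries the higher funding level with them:
print(higher_of_rule(8_500, home_ssp=0.85, attending_ssp=0.15))  # ~$7,225
```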
***
With much fanfare, Ohio lawmakers enacted a new school funding formula in summer 2021. The Cupp-Patterson plan certainly has its strengths. But as it has been implemented, problems have also been revealed. With any luck, lawmakers will make fixes in the next state budget that move Ohio towards a system that actually lives up to its popular name and is a “fair school funding model.”
[2] SAIPE Census data from 2022, which estimates the percentage of school-aged children at or below the federal poverty level, indicates that Euclid has a poverty rate of 31 percent versus Cleveland’s rate of 41 percent.
[3] This is happening as the result of a very recent federal policy change that has lowered the bar for CEP eligibility.
[4] This is the FY21 amount, the last year the old formula was in use. If applicable, districts also received CTE weighted funding and could also seek additional funds from a student’s home district to cover special-education costs at the end of the year.
[5] This is even more complicated than it sounds, as districts have not been receiving the actual base amounts prescribed by the formula during the phase-in of Cupp-Patterson.
In 2023, officials at Ohio’s Department of Education and Workforce (DEW) published data indicating that fewer young people are entering the teaching profession and that teacher attrition rates have risen. These numbers appeared to corroborate anecdotal reports from district and school administrators of teacher shortages. However, because Ohio doesn’t collect data on teacher vacancies, it’s nearly impossible to determine the size and scope of shortages—not just in terms of geographic regions and schools, but also subjects and grade levels.
To their credit, Ohio policymakers have made addressing teacher shortages a priority. Their efforts to bolster recruitment and retention include investing in Grow Your Own programs, establishing a teacher apprenticeship program, and streamlining licensure laws. These are all positive steps forward. But the dearth of information on teacher vacancies remains a problem. Without consistent and accurate data, lawmakers will have difficulty crafting effective policy solutions.
The good news is there is a solution. House Bill 563, which was introduced in May, proposes a method for collecting and analyzing data on teacher vacancies that could provide state and local leaders with the information they need to tackle shortages. The bill also contains provisions aimed at supporting student teachers, which we’ll discuss in a later piece. But for now, let’s focus on how the bill proposes tracking teacher vacancies.
HB 563 requires DEW to develop and administer an annual online staffing survey. This survey would collect data from districts on teacher vacancies based on staffing numbers from the first day of school. DEW would be required to use the results of these surveys to produce and submit an annual report to the General Assembly and the Department of Higher Education. The report must also be published on DEW’s website along with a summary of each district’s survey. As for the surveys themselves, the bill requires them to request a considerable amount of information from districts. That information includes the following (a rough sketch of the implied data record appears after the list):
The number of each of the following positions that are vacant or filled by an individual who is not fully licensed for the position:
Teachers, categorized by required license and endorsement areas
School psychologists
Speech-language pathologists
Occupational therapists
School counselors
School social workers
School nurses
Other positions determined by DEW
The number of teaching positions filled by long-term substitutes, categorized by required license and endorsement area.
The number of teaching positions filled by retired teachers who renewed an expired license or returned to the classroom under a permanent teaching certificate, categorized by required license and endorsement area.
The number of positions filled by teachers who hold an alternative license, categorized by required license and endorsement area.
The number of new teachers, speech-language pathologists, occupational therapists, and school psychologists, counselors, social workers, and nurses.
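To picture what each district would be reporting, here is a hypothetical sketch of a per-district survey record; the field names are ours, not the bill’s.

```python
# Hypothetical sketch of a per-district record implied by the survey
# requirements above. Field names are ours, not HB 563's.
from dataclasses import dataclass, field

@dataclass
class StaffingSurvey:
    district: str
    # Counts of positions vacant or filled by someone not fully licensed,
    # keyed by position (teachers broken out by license/endorsement area)
    vacant_or_underlicensed: dict[str, int] = field(default_factory=dict)
    long_term_substitutes: dict[str, int] = field(default_factory=dict)
    returned_retirees: dict[str, int] = field(default_factory=dict)
    alternative_license: dict[str, int] = field(default_factory=dict)
    new_hires: dict[str, int] = field(default_factory=dict)

survey = StaffingSurvey(district="Example Local Schools")
survey.vacant_or_underlicensed["teacher: intervention specialist"] = 3
survey.long_term_substitutes["teacher: middle school math"] = 2
```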
Understanding the depth and breadth of teacher shortages is crucial for Ohio leaders. This survey and the accompanying annual review would make that possible. But there are also a few tweaks that could make the survey and review even more useful. With that in mind, lawmakers should consider the following ideas.
1. Limit survey requirements to lessen the paperwork burden. To properly address teacher shortages, policymakers need detailed data. But they also need to ensure that data collection efforts don’t place too hefty of a paperwork burden on schools. One way to walk this line is to limit the survey to two requirements: identifying the number of positions that are vacant or filled by an individual who isn’t fully licensed, and identifying the number of teaching positions filled by long-term substitutes. This would provide policymakers with the data they need to address shortages without bogging down districts in data collection.
2. Ensure that the survey collects data from all public schools. That includes charter schools, STEM schools, and JVSDs. It’s important for state leaders to have data from all of Ohio’s public school options to ensure that every student has access to a permanent and effective teacher rather than a carousel of substitutes.
3. Require the survey results to be disaggregated at the school level. This is particularly crucial in big districts like Columbus and Cleveland, where shortages might be concentrated in specific schools or neighborhoods rather than across the entire district. Pinpointing exactly where shortages exist is the only way to effectively address them.
4. Add principals to the list of vacancy positions. It’s a far less discussed problem, but plenty of schools have trouble finding principals and administrators. Requiring superintendents to identify principal vacancies could help shed light on how big of a problem this is, and would provide state leaders with the data they need to craft policy solutions.
5. Require superintendents to identify courses or subject areas that have been eliminated due to extended vacancies or hiring difficulties. One of the problems with persistent vacancies is that they can force schools to eliminate courses they can’t find a teacher for. That, in turn, limits opportunities for students. By asking schools to identify when this happens and in which subject and grade levels, state and local leaders would have additional data about the impact of shortages. They would also have important evidence of student demand, which—if it’s significant enough—could prompt innovative policy solutions like course access.
6. Include data on teacher supply in the annually published report. To understand the current status of the teacher pipeline, and to determine the full size and scope of teacher shortages, state and local leaders need data on teacher demand and supply. The annual survey will provide details on teacher demand, but it’s important for the state to also include data on teacher supply in its annual report. That way, policymakers and advocates have a holistic picture of the teacher pipeline. Supply data should include the current number of students enrolled in teacher preparation programs (TPPs), both traditional and alternative, disaggregated by program; the most recent number of graduates from TPPs, again disaggregated by program; the current number of teachers with active licenses, disaggregated by license and endorsement area; and the number of teacher candidates who passed licensure exams during the most recent academic year, disaggregated by exam.
7. Require DEW to develop a real-time database of vacancy data. Policymakers need teacher vacancy data as soon as possible. This survey will meet that need. But it shouldn’t be considered Ohio’s final destination. State officials should identify a more efficient data collection method, like hiring a contractor to scrape district websites for job postings, and establish a publicly available database that uses that method. This would eliminate the paperwork burden for districts and provide policymakers with even more detailed information.
***
Teacher shortages are a persistent problem. There are plenty of reasons why they’re difficult to address, but chief among them is that state and local leaders don’t have access to detailed and consistent data on vacancies. Without this information, it’s difficult to identify the regions, subjects, and grade levels where shortages exist and respond accordingly. The staffing survey and report called for in HB 563 would help solve this problem. And while HB 563 has run out of time to pass in the General Assembly, its data elements provide a strong starting point for addressing a critical need. Lawmakers should revisit it when they return in January.
Dual enrollment, which allows students to complete college coursework and earn transferable credits while still enrolled in high school, is a popular and growing pathway into postsecondary education. For many students, it also offers a way to take more advanced courses during high school and is often seen as an alternative to longstanding means such as Advanced Placement and International Baccalaureate. Nearly every state now has formal programs for facilitating dual enrollment (DE), and the number of dual enrollees has tripled in the past twenty years. However, research lags on the effectiveness of DE in getting students to and through college. A recent report tries to bolster the research base.
A team of analysts from the Community College Research Center at Columbia University looks at national and state-by-state data on the postsecondary enrollment and degree completion outcomes of three groups of students who first enrolled in college in fall 2015. They are: DE students—high schoolers who enrolled at a postsecondary institution for the first time in fall of 2015 in order to take college courses while still attending high school; prior dual enrollment (PDE) students—new college freshmen who enrolled in fall 2015 after their high school graduation and had records of postsecondary enrollment while in high school; and non-dual-enrollment (non-DE) students—new college freshmen who enrolled after high school graduation but had no prior DE experience. The analysts use National Student Clearinghouse (NSC) data for most demographic and postsecondary institutional measures and include neighborhood income data from the U.S. Census Bureau to estimate family income based on home addresses.
The sample comprised 3.6 million students from across the country. In the fall of 2015, 62 percent were non-DE students, 25 percent were PDE students, and 13 percent were DE students. All students were followed for four years after high school graduation, which means DE students were followed for a longer period of time than their peers. While the findings are largely just a snapshot of the college-going behaviors of each group—rather than any kind of controlled trial to determine causation—the report does offer some interesting comparisons.
Overall, the analysts find confirmation of widespread DE participation across states. However, just ten states (including California, New York, Washington, and Fordham’s home state of Ohio, among others) accounted for over half of all new DE students nationally in fall 2015. Texas was the top dog, with nearly twice as many students as the second-place state, New York. Community colleges enrolled 1.6 million students in the sample, with 60 percent of those students having PDE experience or being current DE students. By contrast, only 20 percent of the 2 million students who enrolled at four-year colleges had current or prior DE participation. DE students were highly likely to re-enroll in college for at least one semester after high school graduation, with 81 percent of those doing so within a year. In forty-one states, the re-enrollment rate for DE students exceeded 33 percent. In the top states, it was nearly half. The majority of re-enrollers (51 percent) opted for a four-year institution.
The remainder of the detailed analyses focus specifically on the DE students in the sample, with no further comparisons to the other groups included. Some examples: Forty-two percent of DE students went on to complete a college degree or certification within four years of finishing high school. Twenty-nine percent completed a bachelor’s degree, 10 percent completed an associate degree, and 2 percent completed a short-term certificate. Four years after high school, almost a third of DE students were still enrolled in college but had not yet earned a degree or certificate. In forty-one states, DE students who enrolled in college right after high school had higher college completion rates than their non-DE peers. These outcomes were strongest in Delaware, Florida, Georgia, Mississippi, and New Jersey. Even stronger outcomes were seen for students who participated in DE at four-year institutions and who then went on to matriculate at four-year schools. However, this population was less likely to live in a lower-income neighborhood or to be Black or Hispanic than their community college DE peers.
There’s lots more to dig into here, including the state-by-state comparisons along various data points. The bottom line for these analysts is that dual enrollment in high school appears to be a reliable pathway into college for all types of students. It also appears to function as a booster toward on-time degree completion. The analysts conclude that states could expand DE access to more high schoolers in the hopes of spreading more of its benefits around. But they also note that state leaders should closely monitor their own data to make sure that completion rates are reasonable and that students are not being set up to fail (at potentially great cost). The bottom line for research, however, is that causal evidence is still badly needed to establish whether dual enrollment truly fulfills its promise of college access and degree completion. Nor does the report differentiate among the various “flavors” of dual enrollment, which vary in availability and outcomes.
Millions of American high school students annually participate in preparatory coursework intended to build and document their readiness for college, including Advanced Placement (AP), International Baccalaureate (IB), and dual enrollment. Research generally shows a correlation between successful completion of this coursework and postsecondary success, but there are gaps in outcomes for certain students that have so far gone unexplained. A new study theorizes that student perceptions of readiness might play a role in boosting or decreasing the effects of college preparation work.
A group of researchers from Oakland University in Michigan recruited 339 first-year college students from across the country between September and November of 2022 via an online data collection platform. They focused specifically on students with an interest in STEM fields—such as healthcare, biomed, medicine, or physical, chemical, and biological sciences—due to recent increases in college-preparatory activities in these fields. The majority of students in the sample (56 percent) were first-generation college students. Thirty-nine percent were White, 33 percent were Black, and 29 percent were Hispanic. Fifty-four percent were female. The vast majority (71 percent) reported a household income lower than $75,000 per year. Individuals hailed from forty states and the District of Columbia. Although participants were accepted for the study regardless of their educational experiences prior to college, as part of the demographic data gathered, students were asked about participation in AP and IB programs, as well as any dual-enrollment courses taken while in high school. Treating AP and IB as a unit (as the report does), 58 percent of students in the sample reported AP or IB participation, and 60 percent reported dual-enrollment participation. These are not mutually exclusive groups, but the report does not note how many students participated in both types of programs.
The survey administered by the researchers also asked about students’ experiences with several specific types of extracurricular STEM programs (science fairs, Math Olympics, Science Olympics, robotics clubs or competitions, and afterschool or summer-break STEM or health-related camps), as well as more general exposure to what are termed outreach and pathway programs (OPPs). These include recruiting activities designed to introduce secondary students to the STEM and/or medical fields. Open-ended survey questions included details on programs experienced, the amount of time spent pursuing them, and particulars about the activities. Most importantly, all students were asked if they felt prepared for college—a simple yes/no question.
Approximately 84 percent of students in the sample responded that they felt prepared for college. The researchers found that, while race, ethnicity, income, and first-generation status did not correlate with those perceptions, exposure to STEM while in high school did. Specifically, student participation in OPPs and dual enrollment was predictive of higher perceptions of college readiness, as opposed to AP/IB participation or no participation at all. This predictive relationship was almost exclusively driven by students who lived in low-income zip codes.
Digging deeper into OPP participation—a far less studied area of college preparation than the others, given its generally ad hoc and extracurricular nature—the researchers report that 279 students had at least one OPP experience during high school, with 217 of them (78 percent) stating that they gained at least one skill as a result. From the detailed responses, the researchers identified 206 “skills” and mapped them to researcher David Conley’s principles of college readiness (or “other” if they did not fit the predefined categories). Content knowledge (28 percent), learning skills/techniques (23 percent), and cognitive strategies (20 percent) were the most common skills cited by OPP participants. (If you want detail, the report includes examples of student responses.)
The bottom line is that most first-year college students felt prepared for the labor ahead, but formal programs seemed to give students a smaller confidence boost than would be expected. Mechanisms are murky, given the limits of the data, and the researchers spend more time speculating about why AP/IB participation isn’t predictive of feelings of preparedness (suppositions include too few courses taken and/or courses completed without taking the final test) than about why dual enrollment and OPPs are predictive. Dual enrollment is generally well-studied elsewhere in this regard, but OPP participation could be a relatively unheralded gateway to helping high schoolers feel prepared for college. This would be good news indeed, because OPPs offer a wider variety of activities with less formal structures than any of the other college-preparatory coursework noted. Sororities, governmental agencies, unions, employers, community organizations, and, of course, colleges themselves have all been active in bringing STEM (especially) outreach programming to students, generally after school or in the summer and often free of charge. This research would indicate that such efforts are having a surprisingly positive impact.
Of course, there’s no way to tell from this research whether the first-year college students were actually prepared to do well in higher education, regardless of their own perceptions. Only by following them through college to observe class grades, test scores, GPA, persistence, and completion can we truly map the pathway from preparation to success.
NOTE: Today, the Ohio Senate Education Committee heard testimony on Substitute Senate Bill 295, which proposes substantive changes to the closure requirements for public schools across the state. Fordham’s Vice President for Ohio Policy presented opponent testimony on the bill. These are his written remarks.
The Fordham Institute is an education-focused nonprofit that conducts research, analysis, and policy advocacy with offices in Columbus, Dayton, and Washington, D.C. Given the subject matter of my testimony today, I also want to note that our Dayton office is a charter school sponsor. (This testimony will use the more common phrasing of “charter school” to refer to what Ohio law calls a “community school.”)
Today’s testimony is based on Substitute SB 295 and doesn’t reflect any amendments that this committee has adopted or may yet adopt.
Fordham plays an interesting role in the charter school ecosystem. We are proudly both an advocate for charter schools and fiercely committed to the principle that all schools—including charter schools—be accountable for their academic performance. For that reason, we are opposed to the current language in Substitute Senate Bill 295.
Before explaining our concerns with the current bill language, I want to start with some things that the bill gets right. Charter schools have long had a provision in law requiring a school to be closed after three years of very low academic performance. There is a similar measure in law for traditional public schools, but there isn’t any evidence that it is enforced. This legislation would set the same standard for closure and/or intervention for both traditional and public charter schools. That’s a good thing. At some point, after years of low performance, the state has a moral obligation to push districts to improve schools.
Importantly, the identification measure proposed is an improvement over existing law and includes both achievement (Performance Index) and growth (Value-Added) components. Any school identified—traditional public or public charter—would be both very low performing and have very low levels of academic growth for three consecutive years. The law also proposes a host of thoughtful potential interventions—including but not limited to closure—for low performing district schools. This is a necessity as there are situations where a district school simply cannot close because it’s the only school in the district serving certain grade levels, and there is a constitutional responsibility to ensure that every student has access to a public school.
The substitute bill, in our view, gets those things right, and the chair deserves credit for tackling those issues. And yet, this is opponent testimony. Most of our areas of concern stem from the principle that public charter schools are and were designed to be different from district schools.
From the beginning of the charter school movement, advocates—including Fordham—have said that charters should be given more autonomy and freedom from regulation in exchange for accountability for results. We continue to believe that this exchange—autonomy for accountability—has the greatest potential to produce improved student learning outcomes. Across the nation, this model has allowed many high-performing charter networks to thrive and has driven achievement gains, particularly among low-income students.
While Ohio initially struggled with charter accountability, the state began to take this side of the bargain more seriously with the passage of an automatic charter-closure law in 2006 and the charter reforms of 2015, which put more pressure on charter-school sponsors to authorize quality schools. These accountability mechanisms have driven significant improvement in Ohio’s charter sector. Just prior to the pandemic, Fordham published a rigorous study using student-level data finding that brick and mortar charters, on average, outperformed local district schools in math and reading. Post-pandemic, we’ve continued to see evidence that site-based charters outperform district schools.
We believe that Substitute SB 295 would weaken charter accountability by giving chronically low-performing schools a free pass from the state’s automatic closure law. Instead of being forced to close, low-performers would be allowed to stay in operation, though required to follow some restructuring protocols. This shouldn’t be an option for poor-performing charter schools. Unlike closing district-run schools, a charter school closure does not put at risk the state’s constitutional responsibility to make a public school available for students anywhere in the state. Charter schools are options for families—and in many cases, extremely valuable ones—but the state does not have a duty to keep a low-performing charter school open. In fact, the state’s responsibility here is to remove a poor-performing public school alternative that is not serving children well.
While we believe there is merit to revisiting the identification criteria for poor performance, it’s worth noting that under the current automatic closure law, just fourteen charters (out of about 320) met the low-performing criteria in either 2022–23 or 2023–24.[1] The schools on this list are having considerable challenges serving students academically, as indicated by their 1-star Achievement ratings last year[2] (meaning their students score far below the state average) and indicators that students are not making sufficient progress (low Progress and Gap Closing ratings). While none of these schools has actually been forced to close—that takes three consecutive years—the current system seems to be working as intended. It’s supposed to be a fail-safe that ensures closure in what should be relatively rare cases when a school’s sponsor is unwilling to close a low-performing school. These numbers also track with how the automatic closure law worked in the past: Historically, about four charters closed per year when the automatic closure law was in effect.
We have serious concerns about how the bill, as currently drafted, handles charter accountability. It can, however, be improved in ways that preserve and strengthen the charter accountability mechanism. We recommend the following be added to the bill.
1. Do not allow for a “restart” of the clock for charters that have met the current closure provisions, as the substitute bill does. Any change in the performance criteria should apply prospectively, but it must not wipe out one or two years of poor performance. Those years mattered to the students who attended these schools, and the state shouldn’t pretend that their lack of progress didn’t happen. Any new identification criteria should acknowledge that schools on the current closure list have been struggling.
2. If a restructuring option is made available to charter schools, the following provisions should be added:
a. Require permanent closure if a school identified for restructuring continues to meet the low-performing criteria three years after being so identified. Chronically low-performing charters should not be allowed to remain in perpetual school-improvement status; at some point, the state needs to step in and close the school.
b. Base charter school sponsor evaluations half on academic performance. As noted above, the state’s automatic closure law should be a last resort. Sponsors are the entities that should actually be closing low-performing schools. However, since the 2015 charter reforms, the state has chipped away at charter sponsor accountability by allowing for an effective rating—through the paperwork-driven components of the evaluation system (quality practices and compliance elements)—even if the sponsor’s school portfolio is delivering very poor academic results. If there has to be some type of weakening of automatic closure, the state must at the same time restore and strengthen accountability for charter sponsors. This can be done by increasing the weight of the academic portion of their evaluation to 50 percent (up from 33 percent currently). The higher weight on academics would create a stronger incentive for sponsors to close low-performing schools in their portfolios.
The state’s automatic closure law and sponsor evaluations are key accountability mechanisms that ensure that the charter sector is fulfilling its promise of autonomy for accountability—and more importantly, fulfilling its promise of a high-quality education to Ohio families and students. While there are some positives in the substitute bill, as drafted, it would roll back charter accountability. In fact, it would put the sector at risk of returning to the pre-House Bill 2 charter landscape. The sector has improved greatly under a stronger accountability framework, and now is not the time to return to low expectations for Ohio’s charter schools.