This spring’s school funding debates have revolved around the needs of poor students. Governor Mike DeWine has proposed a significant bump in state spending targeted at low-income students. And the House budget proposal, released just last week, adds even more to this pot of funds. The focus on supporting low-income students is commendable, as they often face the greatest barriers to success in college and career.
Given all the discussion about the needs of low-income students—and all the millions in funding being directed to support their education—it might surprise you to learn that Ohio faces major challenges in identifying students who are in fact low-income. In unveiling his funding plan, Representative John Patterson said, “we have been unable to fully define what ‘economically disadvantaged’ is.” Meanwhile, the DeWine plan bypasses the state’s own data on low-income pupils and instead targets aid based on federal data about childhood poverty, a rougher measure, since it’s not based on actual headcounts of low-income students attending schools.
Without accurate data on low-income students, Ohio cannot efficiently target resources to the students who need them most. With only a few months until the General Assembly passes the state budget, there’s little chance that policymakers will be able to implement a different approach to counting low-income kids in this budget cycle. For now, that’s okay. But a critical, longer-term project for state leaders is to devise a more reliable approach to identifying low-income students.
This piece discusses the challenges Ohio faces in counting low-income, a.k.a. “economically disadvantaged” (ED), students, in light of policy changes from Washington, and it illustrates how they have inflated poverty rates across hundreds of schools. In a follow-up piece, I’ll consider how Ohio can transition to a different method of counting low-income students—known as “direct certification”—a shift that a few other states are already undertaking.
Ohio has traditionally identified ED students based on their eligibility for free or reduced-price lunch (FRPL). According to federal guidelines, students whose household incomes are at or below 130 percent of the federal poverty level can receive free meals at school, while those at or below 185 percent of the poverty level are eligible for reduced-price meals. For many years, FRPL counts have served as a reasonable proxy of economic disadvantage in districts and schools.
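The two income thresholds above can be expressed as a simple classification rule. The sketch below is purely illustrative—the function name, signature, and dollar figures are assumptions for demonstration, not an official calculation (actual eligibility determinations involve household size, specific poverty guidelines, and other factors):

```python
def frpl_category(household_income: float, poverty_level: float) -> str:
    """Classify school meal eligibility from household income relative to the
    federal poverty level (FPL), per the thresholds described above.
    Hypothetical sketch, not an official eligibility determination."""
    ratio = household_income / poverty_level
    if ratio <= 1.30:       # at or below 130% of FPL -> free meals
        return "free"
    elif ratio <= 1.85:     # at or below 185% of FPL -> reduced-price meals
        return "reduced"
    return "not eligible"

# For a hypothetical poverty level of $25,000:
frpl_category(30_000, 25_000)  # ratio 1.2 -> "free"
frpl_category(40_000, 25_000)  # ratio 1.6 -> "reduced"
```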
But significant changes to FRPL identification began in 2010, when Congress enacted the Community Eligibility Provision (CEP). CEP allows certain high-poverty districts and schools to provide free meals to all students, regardless of their household income. To qualify for CEP, schools must have more than 40 percent of their students deemed eligible for free meals via direct certification—a process whereby low-income pupils are identified through their participation in means-tested programs like food stamps, or are flagged as foster, migrant, or homeless youth. Individual schools, or an entire district, may qualify for CEP based on their certification counts.
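The qualification rule—more than 40 percent of students identified through direct certification—boils down to a single threshold check. A minimal sketch, with an assumed function name and hypothetical counts:

```python
def qualifies_for_cep(directly_certified: int, total_enrollment: int) -> bool:
    """A school (or an entire district) qualifies for CEP when more than
    40 percent of its students are identified via direct certification.
    Illustrative only; actual program rules have additional details."""
    return directly_certified / total_enrollment > 0.40

# Hypothetical schools:
qualifies_for_cep(45, 100)  # 45% identified -> qualifies
qualifies_for_cep(40, 100)  # exactly 40% -> does not (threshold is "more than")
```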
The admirable goal of CEP is to ensure that students in qualifying schools receive meals without stigma or administrative fuss. Yet because every child is able to receive free meals, CEP schools report 100 percent FRPL students—even though not everyone is actually eligible for subsidized meals. This, in turn, leads to inflated counts of ED students.
Inflated counts of low-income students
The table below highlights the predicament using five CEP schools in Ohio as examples. As you can see, all of these schools reported 100 percent ED rates in 2017–18 as a result of CEP participation. But their “true” rates of disadvantaged students are almost certainly much lower. Assuming similar enrollment patterns compared to the year prior to CEP adoption—a reasonable assumption given slightly declining enrollments—the schools’ actual ED enrollments are anywhere from 23 to 74 percentage points lower than what’s reported.
Table 1: Illustration of how the Community Eligibility Provision affects data on economically disadvantaged students
Statistical imprecision might be of minimal concern if CEP schools were few and far between. But as Figure 1 shows, a sizeable portion of Ohio schools participate—nearly 1,000 out of roughly 3,300 public schools statewide. These schools are spread across Ohio’s less affluent cities, inner-ring suburbs, small towns, and rural areas. In fact, ninety-two of Ohio’s 610 districts have at least one CEP-participating school, though a disproportionate number are located in big-city districts like Cleveland and Columbus.
Figure 1: Number of schools participating in the Community Eligibility Provision
Source: Ohio Department of Education. Note: This figure includes any institution that participates in CEP. The vast majority of CEP schools are district or charter schools, with a handful—about fifty per year—being nonpublic schools or programs operated by regional ESCs, career-tech centers, or county boards of developmental disabilities. The first year in which ODE records CEP participation was 2012–13.
To get a rough sense of “miscounted” students statewide, I estimate the total number of non-ED students who are deemed ED by virtue of CEP. To do this, the 2011–12 ED rates—the year just before any Ohio school participated in CEP—serve as a proxy for schools’ “true” ED rates. Given the proximity of 2011–12 to the Great Recession, those rates are likely higher than the ED rates today, which produces a conservative estimate of the impact of CEP.
Assuming the FY 2012 ED rate more reliably captures current student poverty rates, ED enrollment in CEP schools is thus inflated by about 65,000 students—about the size of the Cincinnati and Toledo districts combined. This has ramifications for state funding: Under the state funding formula, higher-poverty districts receive about $800 to $1,000 in additional funding per ED pupil. If CEP inflates ED counts by about 65,000 students, roughly $50 to $65 million per year is now being misdirected.
Table 2: Estimating how many non-disadvantaged students attend CEP schools based on FY 2012 data
Notes: Almost all CEP schools’ ED rates are reported as 100 percent (shown in ODE data as “>95,” which I impute as 100 percent), though forty-eight schools had rates below 95 percent, mostly between 90 and 95 percent. A total of 837 schools are included in this analysis—161 CEP schools are excluded because they are not district or charter schools, lack ED data from FY 2012, or lack pupil enrollment data for FY 2018. The average ED rates across CEP schools for FY 2018 and FY 2012 are weighted by each school’s FY 2018 total enrollment.
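The estimation method described above amounts to simple arithmetic: for each CEP school, the gap between its reported ED rate (typically 100 percent) and its FY 2012 rate, multiplied by current enrollment, approximates the number of miscounted students. The sketch below uses hypothetical school data and the article’s $800–$1,000 per-pupil figures; it illustrates the method, not the actual statewide dataset:

```python
# Hypothetical CEP schools: reported ED rate vs. pre-CEP (FY 2012) ED rate.
schools = [
    {"enrollment": 500, "reported_ed": 1.00, "fy2012_ed": 0.60},
    {"enrollment": 300, "reported_ed": 1.00, "fy2012_ed": 0.75},
]

# Miscounted students = sum of enrollment x (reported rate - baseline rate).
miscounted = sum(
    round(s["enrollment"] * (s["reported_ed"] - s["fy2012_ed"])) for s in schools
)
# 500 x 0.40 + 300 x 0.25 = 200 + 75 = 275 miscounted students

# Misdirected dollars at $800 and $1,000 of additional aid per ED pupil.
per_pupil_aid = (800, 1_000)
misdirected = tuple(miscounted * aid for aid in per_pupil_aid)
# 275 students -> $220,000 to $275,000 in this toy example
```

Applied to the roughly 65,000 inflated ED students estimated above, the same arithmetic yields the article’s $50 to $65 million range.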
* * *
Analysts at the Urban Institute, as well as our own, have raised concerns about the impact of unreliable poverty data on research and accountability systems. CEP’s effects are also felt in the realm of school funding. As state legislators seek to effectively target resources to the students who need them most, they need a more reliable method of counting poor students. As we have suggested, a promising way forward is to use direct certification data to track low-income students. Such a move would be fraught with challenges, but fortunately other states are also tackling the problem, so Ohio wouldn’t be going it alone.
In a follow-up piece, I’ll consider how state policymakers could carefully make this transition in the years ahead. Stay tuned.