A Formula That Works: Five ways to strengthen school funding in Ohio
By Jennifer O’Neal Schiess, Max Marchitello, and Juliet Squire
Back in 2014, the passage of House Bill 487 ushered in major changes to Ohio education policy, including new high school graduation requirements for 2018 and beyond. Among the new provisions was a requirement that all juniors take a college-admissions exam. Previously, only those students and families considering going to college forked over the money to take a test designed to measure college readiness. Starting this spring, however, Ohio joins several other states in requiring 11th graders to take either the ACT or SAT (it’s up to districts to choose which one to administer). To offset the mandate’s expense, the state will pick up the tab on testing costs.
Despite recent calls for the Ohio Department of Education (ODE) to reduce state testing, there’s been little pushback about requiring 11th graders to take a college admissions exam, probably because the results won’t be a significant part of the state accountability system. It could also be because folks have bigger fish to fry when it comes to fighting the new graduation requirements. Regardless, the statewide administration requirement, under which some students have already begun testing, is good education policy. Here are a few reasons why having juniors take the ACT or SAT is a good idea.
It’s important to note that although the state will pay for only one administration for each student, low-income students can still go through their high school counselors to access fee waivers to take both tests. This is good news—it means that, like their more affluent peers, low-income students will still have the opportunity to take college admissions tests multiple times. Furthermore, although waivers have long been available to low-income students, the hassle of obtaining one, or simple unawareness that such waivers existed, may have kept many low-income students from signing up for the test. Statewide administration ensures that everyone will have the opportunity to take the exam at least once.
As with any policy, there’s always room for improvement. Right now, the state-funded administration doesn’t include the writing component for either test. This is worrisome not just because writing is perhaps the most important skill needed for college success, but also because some schools include the writing portion as part of their application requirements. This means that students who otherwise would have taken the test only once must take it again in order to complete the writing portion. That seems incredibly wasteful in an era when we are debating over-testing. The benefits of statewide administration—greater awareness for students and more and better data—are lessened by the fact that the state doesn’t fund the writing component.
It’s imperative that policy makers move quickly to fund the full ACT and SAT assessments, not just parts of them. But in the meantime, Ohio has finally joined the ranks of 24 other states that are implementing this important policy. Statewide administration opens doors for all students, including traditionally underrepresented groups and those that may doubt their potential—and that’s definitely worth celebrating.
School funding debates are as predictable as the seasons, and right on cue, the release of Governor John Kasich’s biennial budget has precipitated hand-wringing from various corners of Ohio. Why? Like many other states, Ohio faces a tightening budget, and the governor’s plan would reduce the amount of state aid for dozens of districts that have been consistently losing student enrollment.
No public entity anywhere has ever been happy about receiving less money than the year before; every elected leader worth their salt is going to fight for more resources for their own constituents. The challenge ahead for thoughtful policy makers is to distinguish the typical bellyaching from legitimate and serious problems in Ohio’s school funding policies.
To help, we are pleased to present this analysis of Ohio’s school finance policies. It gets under the hood of the Buckeye State’s education funding formula and tax policies and seeks to understand how well they promote two essential values: fairness and efficiency. Why these two? Consider:
To offer an independent, critical review of Ohio’s funding policies in light of these concerns, we turned to Andy Smarick, formerly at Bellwether Education Partners and now at the American Enterprise Institute.
In 2014, we teamed up with Andy on a successful review of Ohio charter-school policies; we were exceptionally pleased when he accepted the challenge of analyzing our home state’s school-funding system. He enlisted Bellwether’s Jennifer O’Neal Schiess, who spent a decade working with the Texas legislature on school finance and education policy, to lead the research effort along with her colleagues Max Marchitello and Juliet Squire.
As readers will see, Ohio’s present approach has several strengths, including its ability to drive more state aid to more-disadvantaged districts—via the State Share Index—and the added dollars for students with greater educational needs (e.g., pupils with disabilities or English language learners). Yet Bellwether also explains several elements of the present system that subvert its fairness and efficiency.
Three issues are particularly worrisome:
These recommendations, along with a couple of others discussed in the paper, would greatly improve Ohio’s school finance system and drive limited state dollars to where they’re most needed. We urge that this be done.
Much work remains to be accomplished if Ohio is to craft a transparent, modern school-funding structure. We realize that the profound complexities and political realities of school funding policy make this a daunting task. In our view, the best course forward is to take one manageable step at a time. If state leaders make these essential repairs, Ohio will take its next step in the long journey toward a school funding system that supports an excellent education for all.
You can download the full report, A Formula That Works: Five ways to strengthen school funding in Ohio, here.
[1] This paper doesn’t touch on policies and practices that can promote the productive use of school funds at a local level. For Fordham policy briefs on this issue, see for example, Stretching the School Dollar and Getting Out of the Way.
With a $20 billion federal educational choice program now a real possibility under the Trump Administration and Republican-led Congress, the media spotlight has turned to the voucher research. The discussion often revolves around the question of participant effects—whether students are better off when they use a voucher to transfer to a private school. In recent days, voucher naysayers have pointed to the negative participant findings from recent studies in Louisiana and Ohio in order to attack the idea. (I oversaw the latter study as the Thomas B. Fordham Institute’s Ohio research director.)
These cursory analyses are misleading for a number of reasons. The Ohio study, led by respected Northwestern University professor David Figlio, came with a number of caveats that are often glossed over. Figlio was only able to credibly examine a small sample of voucher participants. To do an apples-to-apples comparison using a “regression discontinuity” approach, he had to focus on voucher students who came from marginally higher performing public schools (akin to a “D” rated school). As a result, voucher participants who left the most troubled public schools in the state—the “Fs”—were not studied. It’s possible that these students benefited from the program (or perhaps not), but there was no trustworthy way to find out.
In addition, the Ohio analysis uses state test scores, which are “high stakes” for public schools but not for private ones. Thus, public school students might have been encouraged to try harder on these tests than their voucher counterparts. Had evaluators been able to use a more neutral test, like the SAT-9, it’s possible that voucher student performance would have looked more impressive. Earlier studies, which found significant positive effects for voucher participants, used such neutral tests.
Meanwhile in Louisiana, State Superintendent John White notes that the implementation of his state’s voucher program is still in its infancy—just a few years in. White goes on to explain that private schools needed time to adjust to the new program: adapting instruction to different expectations, ensuring academic supports for new pupils, and securing the talent needed to staff an excellent school. Though still in negative territory, voucher students’ test scores were on the upswing in year two, with new data on the horizon. At the very least, it’s premature to render a clear verdict in Louisiana based on just a couple years of early results.
Skeptics, however, make a more serious error when they omit the competitive effects of school choice. This piece of the research puzzle examines whether the introduction of vouchers leads to higher outcomes for pupils remaining in public schools. Stiffer competition, so the theory goes, should nudge improvements in district-run schools, which traditionally enjoy monopolies over the delivery of K–12 education.
In Ohio, the findings were positive: The introduction of voucher competition modestly improved the outcomes of students who remained in their public schools—in the range of one-eighth of the magnitude of the black-white test-score gap. In Louisiana, Anna Egalite of North Carolina State found similar results. Though some of her estimates were null, she found positive test score effects for students attending public schools facing the strongest voucher competition.
It’s hardly surprising to see anti-voucher—and often pro-union—pundits skip the research on competitive effects. It undermines one of their major charges against vouchers: that they harm public-school pupils “left behind” because of a loss of funds. But as the Ohio and Louisiana studies indicate, the research lends little credence to this line of thought. Public school students aren’t harmed; in fact, there is evidence that they reap academic benefits due to competition.
In heated debates like those over vouchers, solid empirical research remains an important guide. As opponents assert, the participant results from Louisiana and Ohio—caveats and all—are troubling and point to the need for improvements to existing choice programs. But they are wrong to keep studies on voucher competition out of public view simply because the findings don’t match their policy agenda. As state and federal policy makers consider private-school choice programs, they should heed research on both participant and competitive effects.
A recent High Flyer post made a strong case for how acceleration can benefit high-ability students and help administrators and teachers more effectively address the individual needs of their unique learners. It echoes findings in dozens of previous studies that show that acceleration works.
Despite mountains of evidence demonstrating its benefits, most decisions about acceleration policies are made locally. According to a recent report by the Jack Kent Cooke Foundation, forty-one states either do not have acceleration policies or permit school districts to decide whether to institute them.
Using Illinois as a case study, the Illinois Association for Gifted Children and the Untapped Potential Project recently published a report that sought to determine whether, in the absence of a state requirement, districts step up to the plate and establish acceleration policies to support their high achievers. Unfortunately, the report’s findings are disappointing. Among Illinois school districts, large percentages lack policies that permit students to do the following:
These troubling statistics are compounded by the fact that 33 percent of Illinois students already meet or exceed grade-level proficiency on the state exam, with 36 percent proficient or higher in English language arts and 31 percent proficient or higher in math. When a state does not provide for high-ability students in education policy, attention and resources can get directed largely to students below the proficiency bar, resulting in the dismantling of enrichment and gifted programming. Since No Child Left Behind and the end of state funding for gifted programs in 2003, the share of Illinois districts providing gifted programming has plummeted from over 80 percent to only 27 percent in 2016.
While more affluent families may be able to switch districts or provide supplemental enrichment outside of school in the absence of gifted programming and appropriate opportunities for acceleration, parents of high-ability low-income students often lack those options. They depend on public schools to identify and cultivate their children's talent, and this should be a priority of our education system as well.
Soon, members of the Illinois Senate Education Committee will have an opportunity to decide whether students throughout the state have access to proven acceleration practices. They will be considering Senate Bill 1223—the Accelerated Placement Act—which would establish a statewide acceleration policy grounded in best practices in Illinois.
It mirrors Ohio’s law, which requires each district to have an acceleration policy, form an acceleration committee to ensure that one gatekeeper cannot prevent students from being accelerated, and use a peer-reviewed assessment mechanism to determine whether a student should be accelerated.
Gifted education advocates take note. If you live in one of the twenty-two states without acceleration policies or one of the nineteen states that, like Illinois, allow the existence of these policies to be determined at the local level, it is likely that many students in your school district are not getting the education they deserve.
Consider pushing for a statewide acceleration policy. Acceleration is a well-researched and cost-effective way for schools to provide students with the level of challenge needed to reach their potential, and it is the least a state can do for its high-ability students whose educational needs are so often overlooked.
Josh Dwyer is the Policy Director for the Untapped Potential Project. Carolyn E. Welch, J.D., is an education attorney, Officer and Trustee of the Midwest Center for the Gifted, Board member of pilotED schools, and a member of the Parent Editorial Content Advisory Board of the National Association for Gifted Children and the State Initiatives Committee of the Illinois Association for Gifted Children.
The views expressed herein represent the opinions of the author and not necessarily the Thomas B. Fordham Institute.
Ohio’s Gap Closing report card component reports how students in certain subgroups perform on state tests and graduation rates compared to the performance of all students statewide. The subgroups include racial/ethnic groups, students with disabilities, and economically disadvantaged pupils. Gap Closing is one of six major report card components and makes up 15 percent of a school district’s rating in Ohio’s current summative grading formula, set to officially begin in 2017-18.
Currently, Gap Closing compares subgroup proficiency on state assessments and graduation rates to a set, statewide standard—also known as an Annual Measurable Objective (AMO). These objectives rise gradually over time, heightening expectations for subgroup performance. When a school’s subgroup meets the AMO, the school receives the full allotment of points (“full credit”). When the subgroup fails to meet the objective, the school receives no credit—unless it makes improvements relative to the prior year. In such cases, the state awards partial credit. Those points are tallied across subgroups and divided by the points possible to compute a component grade reported on an A-F scale. In certain circumstances, a school’s Gap Closing letter grade can be demoted (e.g., an A drops to a B).
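The point-tallying mechanics described above can be sketched roughly as follows. This is a simplified illustration only: the partial-credit value, the AMO target, and the A-F cutoffs below are assumed placeholders, not ODE’s actual parameters.

```python
# Simplified sketch of the Gap Closing point tally described above.
# The point values, partial-credit rule, and A-F cutoffs are illustrative
# assumptions, not ODE's actual parameters.

def subgroup_points(current_rate, prior_rate, amo):
    """Full credit for meeting the AMO, partial credit for year-over-year
    improvement, and no credit otherwise."""
    if current_rate >= amo:
        return 1.0
    if current_rate > prior_rate:
        return 0.5
    return 0.0

def gap_closing_grade(subgroups, amo):
    """subgroups: list of (current_rate, prior_rate) pairs, one per subgroup.
    Points earned are divided by points possible, then mapped to a letter."""
    earned = sum(subgroup_points(cur, prior, amo) for cur, prior in subgroups)
    share = earned / len(subgroups)
    for cutoff, letter in [(0.9, "A"), (0.8, "B"), (0.7, "C"), (0.6, "D")]:
        if share >= cutoff:
            return letter
    return "F"

# Four subgroups measured against an assumed AMO of 80 percent proficiency:
grade = gap_closing_grade(
    [(85.0, 82.0),   # meets the AMO -> full credit
     (70.0, 65.0),   # misses the AMO but improved -> partial credit
     (60.0, 62.0),   # misses the AMO and declined -> no credit
     (81.0, 79.0)],  # meets the AMO -> full credit
    amo=80.0,
)
print(grade)  # 2.5 of 4 possible points (62.5 percent) falls in the D band
```

The sketch shows why a school can earn a middling grade even when half its subgroups meet the target: partial credit for improvement counts, but declines earn nothing.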
Without a doubt, Gap Closing is a complicated report card element—it assesses the performance of ten distinct subgroups based on several different measures. In fact, one of us has suggested scrapping it altogether and starting over. Meanwhile, the Ohio Department of Education’s (ODE) ESSA feedback process yielded suggestions to rework the component—it was panned for not providing enough credit for the progress of students falling short of proficiency—and many Ohioans deem Gap Closing to be the “least useful” report card measure. In response to the feedback—and to some new federal requirements—Ohio’s draft ESSA plan proposes some important changes to the component. Let’s take a look.
First, ODE would gauge subgroup achievement using the performance index instead of raw proficiency rates. Most readers are probably familiar with the performance index—it looks at achievement at multiple performance levels, such as proficient and advanced—as it has been used in overall school accountability for many years (just not to gauge subgroup achievement). Using the performance index instead of proficiency rates for subgroups is a good idea since it encourages schools to pay attention to students at all parts of the achievement spectrum, not just those at or around the proficiency bar.
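To see why the performance index rewards attention across the achievement spectrum while a raw proficiency rate does not, consider a toy comparison. The level weights below are assumptions for illustration, not Ohio’s official values.

```python
# Toy comparison of a raw proficiency rate versus a performance index.
# The level weights are assumptions for illustration, not Ohio's official
# values; the point is that the index credits every achievement level.

WEIGHTS = {"advanced": 1.2, "accelerated": 1.1, "proficient": 1.0,
           "basic": 0.6, "limited": 0.3}

def proficiency_rate(counts):
    """Percent of students scoring proficient or above."""
    total = sum(counts.values())
    at_or_above = sum(n for level, n in counts.items()
                      if level in ("advanced", "accelerated", "proficient"))
    return 100.0 * at_or_above / total

def performance_index(counts):
    """Weighted average of achievement levels, scaled to 100."""
    total = sum(counts.values())
    return 100.0 * sum(WEIGHTS[level] * n for level, n in counts.items()) / total

# Two subgroups with identical proficiency rates but different indexes:
a = {"advanced": 30, "proficient": 20, "limited": 50}
b = {"proficient": 50, "basic": 50}

print(proficiency_rate(a), proficiency_rate(b))    # identical rates
print(performance_index(a), performance_index(b))  # different indexes
```

Both subgroups are 50 percent proficient, so a proficiency-rate measure treats them identically; the index separates them because it sees the students above and below the bar.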
Second, ODE plans to meet a new ESSA requirement—tracking the progress of English language learners (ELLs)—by creating a new indicator within the Gap Closing component. Instead of using Ohio’s general state assessments, this measure of ELL progress will use an alternative assessment: the Ohio English Language Proficiency Assessment (OELPA). The ELL portion will take into account English learners who attain proficiency on OELPA and those who make improvements but have not yet met the proficiency standard.
Third, Ohio is proposing to reduce its minimum “n-size” used for the Gap Closing component from thirty to fifteen students. For example, under the current n-size rules, a school with twenty students with disabilities would not be held accountable for their achievement as a separate subgroup. But it would under ODE’s ESSA plan. The upside of this proposal is that more schools will be held accountable for the separate performance of more subgroups, yielding greater transparency around results. The tradeoff is that it could force otherwise high-performing schools into one of Ohio’s intervention categories—a "focus" or "watch" school—based on the achievement of a smaller number of pupils than in previous years.
Two major concerns persist on Gap Closing.
First, ODE will continue to use annual changes in subgroup achievement as a measure of improvement. Quite frankly, it should stop doing this. The calculation doesn’t account for changes in the composition of a school’s subgroups from year to year. As a result, it is not necessarily correct to conclude that a school with a higher performance index score for, say, black students than in previous years has “closed the achievement gap.” The gain might instead reflect an influx of students with stronger academic backgrounds.
Perhaps the simplest approach on Gap Closing is to just say it’s a status measure and call it a day. In other words, document subgroup achievement gaps (if they exist), but don’t try to evaluate whether a school is narrowing them from one year to the next. As Matt DiCarlo of the Shanker Institute writes, “[changes in achievement gaps] are poor gauges of school performance and shouldn’t be the basis for high-stakes rewards and punishments in any accountability system.” For more on the problems of “gap closing” in accountability settings, see DiCarlo’s great posts here and here.
Second, ODE should reconsider how it premises sanctions on Gap Closing grades. Its ESSA proposal says that schools earning a D or F on Gap Closing for two consecutive years will land in “school improvement” status. As our colleague Jamie Davies O’Leary discusses, an overwhelming number of schools and districts currently receive poor Gap Closing ratings. ODE should make sure that it is not going to sanction hundreds, if not thousands, of Ohio schools based on the results from a single report-card component. While they’re at it, policymakers should also reduce the weight on Gap Closing in Ohio’s summative grading system and instead put student growth closer to the center of accountability.
Ohio’s Gap Closing component puts the Buckeye State into compliance with several key federal requirements. While that compliance is important, policy makers should consider a less complicated and less punitive approach to subgroup accountability in Ohio.
NOTE: The Joint Education Oversight Committee of the Ohio General Assembly is hearing testimony this week on Ohio's proposed ESSA accountability plan. Below is the written testimony that Chad Aldis gave before the committee today.
Thank you Chairman Cupp, and members of the Joint Education Oversight Committee, for giving me the opportunity to provide testimony today on the Ohio Department of Education’s proposed ESSA plan.
My name is Chad Aldis, and I am the Vice President for Ohio Policy and Advocacy at the Thomas B. Fordham Institute. The Fordham Institute is an education-focused nonprofit that conducts research, analysis, and policy advocacy with offices in Columbus, Dayton, and Washington, D.C. Our Dayton office, through the affiliated Thomas B. Fordham Foundation, is also a charter school sponsor.
I’d like to first applaud the department for its hard work on this plan. ODE staff worked tirelessly to gather a massive amount of stakeholder feedback, and many of the recommendations that they heard throughout the state are either reflected in this plan or identified as areas meriting further study. I know you’ve listened to testimony from a number of people who felt that their voices weren’t heard. As legislators, you know as well as anyone that it’s extremely difficult to incorporate feedback that, while important and strongly valued, is diverse and many times contradictory.
The ESSA plan created by ODE is a thoughtful approach that strikes an important balance between meeting the federal requirements and protecting Ohio’s autonomy. While the impact and role of the new federal education law has generated much discussion, the most important thing that ESSA does is return more authority over education to the state and local school districts—where it belongs.
Before I comment on the content of the plan itself, let me offer a suggestion regarding process. This plan should be as limited in scope as possible. That’s because, once it is approved by the U.S. Department of Education, it is locked into place for many years to come. Revising it will be a hassle and require cooperation from officials in Washington. Thus we should resist the urge to put everything but the kitchen sink into the plan. We should stick to the plan requirements, and leave other important policy to be decided at the state level as necessary.
It’s worth noting that many of the changes being suggested have been lobbied for before this body in the past. Some of the most notable examples include the role of teacher evaluations, the quantity of tests administered, and the state’s school grading system. A fair amount of the criticism is coming from people and entities that didn’t like the decision reached by the General Assembly the first time around and see the ESSA engagement requirement as an opportunity to have a second bite at the apple. That’s fine and to be expected. However, if we hijack the legislative process by creating policy through our ESSA proposal, Ohio could find itself right back in an NCLB environment where we were forced to carry out a plan that failed to take into account local contexts, needs, and solutions.
Shifting to the contents of the plan itself, here are some things that Ohio’s ESSA plan should be commended for:
Of course, no plan is perfect and this one is no exception. The provisions below should either be removed from the plan or, where appropriate, the legislature should consider amending current law (these recommendations are underscored) to address the underlying issue.
While I agree with many who have testified and suggested that Ohio’s ESSA plan can be improved, I disagree with those suggesting that Ohio delay its application until September. If you believe that, as a matter of sound public policy, Ohio should promise to do only what federal law requires, thereby preserving its autonomy in other areas, then the best course of action is to submit our state plan for federal approval as soon as possible. This plan largely does that.
Moreover, this plan is effective for the 2017-18 school year, and local school districts deserve a certain degree of certainty when a school year begins. Waiting until September to submit this application could force districts to operate for months without knowing for sure what the rules of the game are—especially if the federal government pushes back on any elements of the submitted template. This should be avoided.
Thank you again for the opportunity to speak with you today. I am happy to answer any questions that you may have.