Ohio’s charter school reform discussions have mostly focused on sponsors, the entities responsible for providing charter school oversight. Overlooked are the important changes in Ohio’s charter reform law (House Bill 2) around operators. Operators (also known as management companies) are often the entities responsible for running the day-to-day functions of charter schools; their responsibilities can include selecting curriculum, hiring and firing school leaders and teachers, managing facilities, and providing special education services. (To get a sense of the extent of operator responsibilities, read through one of their contracts.)
Extra sunshine on operators has been especially needed in a climate like Ohio’s, where operators historically have wielded significant political influence and power not only with elected officials but even over governing boards. For instance, one utterly backwards provision pre-HB 2 allowed operators to essentially fire a charter’s governing board (with sponsor approval) instead of the other way around—what NACSA President Greg Richmond referred to as the “most breathtaking abuse in the nation” in charter school policy.
HB 2 installed much-needed changes on this front, barring the most egregious abuses of power and greatly increasing operator transparency. The legislation required that contracts between charter boards and operators be posted on the Ohio Department of Education (ODE) website; that operators collecting more than 20 percent of a school’s funding provide a detailed statement of expenditures; that ODE post a simple directory of operators so the public could know which operators were affiliated with which charter schools (information that was surprisingly difficult to come by for anyone outside charter circles); and that ODE publish an annual academic performance report for operators. These new provisions were at once somewhat obvious and, for Ohio, revolutionary. Such is the Ohio charter story.
The new performance reports are out, and that’s a great step forward for Ohio, where public information on operators has historically been lacking. But the reports are disappointing in their lack of depth and breadth. The image below shows one report in its entirety; forty-nine operators received a similar half-page report delineating academic performance, attendance, student demographics, and staffing data.
[[{"fid":"117775","view_mode":"default","fields":{"format":"default"},"type":"media","attributes":{"height":"359","width":"500","class":"media-element file-default"},"link_text":null}]]
Here are a few observations about the reports and where they could be improved.
- Operators are not matched with their affiliated schools. This information is available by viewing a separate spreadsheet on ODE’s website, but each management company’s schools should be listed within the report card itself to provide context. Readers should not have to search through multiple spreadsheets and documents to piece this information together.
- There are no data on individual schools. Along with listing the charter schools run by each operator, the performance report should provide each school’s key report card ratings. What good is a report card that lists a score for “Center for School Improvement, LLC,” an operator with no known website, without knowing which schools it oversees or how each performs in key areas like performance index and growth? District report cards contain links to their schools’ ratings; so should operator reports.
- Academic ratings don’t effectively differentiate quality because almost every operator received a low rating. Nine operators received a “0” academic rating, seventeen received a “1,” and five received a “2.” It appears that the scores (1-5, correlating with an A-F scale) were calculated in the same manner as academic ratings for sponsors. (The report does not include a methodology for calculating the operators’ academic rating.) If so, student growth counted for just 20 percent of the overall score. That’s a problem, because the other indicators composing the score are highly correlated with students’ socioeconomic backgrounds. The uniformly low ratings among charter operators are primarily a function of the fact that they serve so many at-risk students; the same would be true for traditional urban public school districts if the state rated them in the same manner. The system fails to meaningfully distinguish Ohio’s best operators (networks like United Schools Network that take poor students who are behind grade level and move them above the state average) from its lackluster ones. That needs fixing.
- There is no distinction between for-profit and non-profit charter management companies. Charter opponents tend to paint the charter sector with a broad brush. They often generalize about the “privatized” or “corporate-run” charter industry while failing to acknowledge that a fair number of schools in Ohio contract with non-profit management organizations. Many choice critics seem genuinely to misunderstand the distinction or to be unaware of which entities are which. A designation of non-profit versus for-profit status on each operator’s report card could improve public understanding and either confirm or dispel preconceived notions.
- The reports focus heavily on inputs. They include a plethora of data on teachers and staff while providing hardly anything about actual performance. Readers can see the number of music teachers staffing an operator’s schools but have no idea which schools those are or how they perform. Student enrollment numbers are not even provided; they should be.
- It isn’t clear how “operator” is defined. Fifty-three operators are listed in ODE’s separate operator database, but only forty-nine received a report card. Why? The recent competitive facilities grant award for top-performing charter schools listed eligible operators (as defined earlier this year by ODE and the Ohio Facilities Construction Commission), yet not all of those operators received report cards. It’s unclear how the state defines what constitutes an “operator” or why that definition would differ from the facilities grant eligibility list or from its own master spreadsheet.
- Expenditures per pupil are listed but lack context. Those figures range from $1,899 to $10,880 across operators, yet they appear without information on overall revenues and expenditures or school-by-school breakdowns. To be fair, HB 2 required any management company earning more than a 20 percent fee from a school’s annual gross revenues to provide a more detailed financial accounting, including information on salaries, wages, benefits, utilities, buildings, equipment, and more. But to the best of my knowledge, this information isn’t publicly available yet, at least not in a way that is easy to find and navigate. That should change.
Ohio evaluates sponsors in significant part based on their schools’ performance, and these evaluations include detailed information about schools’ academic results as well as compliance with various rules and laws. For operators, however—entities that are actually running schools day to day, and in some instances collecting more than 90 percent of schools’ public funds—there is very little information.
HB 2’s operator transparency provisions are necessary to provide valuable information to governing boards, sponsors, parents, taxpayers, and the public more broadly. Taken together, the newly available information on operators is a step forward for Ohio’s charter sector, and ODE deserves credit for creating the first operator performance reports and doing so on time. But there is still much room for improvement. In the interest of transparency, Ohio should move toward a much more robust and detailed version 2.0.