Caveat emptor: Ohio lawmakers shouldn’t follow the lead of California
Individual student data is superior to aggregate school-level data
A push by some charter advocates resulted in a last-minute amendment to House Bill 2 that may introduce the “California Similar Students Measures” (CSSM) into Ohio’s school-accountability system. This effort is entirely unnecessary, and CSSM should not be implemented in the Buckeye State.
The California Charter Schools Association (CCSA) developed CSSM, a simple regression model that uses school-level data to approximate a value-added student growth model. The reason: California does not have an official student growth measure. CSSM is an improvement over using only a school’s raw proficiency results to evaluate schools, and the organization deserves credit for implementing it in California. However, a CSSM-like analysis should be used only in the absence of a proper student growth measure—and as such, it has no place in Ohio.
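To see what a school-level approximation looks like in practice, here is a minimal sketch in Python. Everything in it is hypothetical for illustration: the data are simulated, and the predictor variables are stand-ins, not CCSA’s actual model specification.

```python
# Minimal sketch of a CSSM-style "similar students" regression.
# All data and predictors are hypothetical; CCSA's actual model
# uses its own set of school characteristics and business rules.
import numpy as np

rng = np.random.default_rng(0)
n_schools = 500

# Aggregate school-level characteristics (shares of the student body).
pct_disadvantaged = rng.uniform(0.0, 1.0, n_schools)
pct_english_learner = rng.uniform(0.0, 0.5, n_schools)

# The outcome is a single snapshot of school-level achievement,
# not a longitudinal gain tracked for individual students.
mean_achievement = (
    70.0
    - 20.0 * pct_disadvantaged
    - 10.0 * pct_english_learner
    + rng.normal(0.0, 5.0, n_schools)
)

# Regress aggregate achievement on aggregate demographics.
X = np.column_stack(
    [np.ones(n_schools), pct_disadvantaged, pct_english_learner]
)
coef, *_ = np.linalg.lstsq(X, mean_achievement, rcond=None)

# A school's "similar students" score is its residual: how far its
# mean achievement sits above or below that of demographically
# similar schools.
similar_students_score = mean_achievement - X @ coef
```

Note what the residual cannot tell you: because the outcome is one year’s achievement level, a school full of already high-achieving students can look “effective” without contributing any learning gains. That is precisely the limitation CCSA flags in the caveat quoted below.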
Ohio legislators should read CCSA’s own caveat emptor very carefully (emphasis added):
While CCSA believes these metrics [CSSMs] are an improvement on the existing measures in law for charter renewal, longitudinally linked, individual student growth data is the ideal source for most appropriately assessing a school’s performance. Because the Similar Students Measure is calculated with aggregate school-level data, it is an approximation of value-added modeling. True value-added modeling requires individual student data connected to the schools and educators instructing those students.
Then lawmakers should read the statement above again, with this information in mind:
Ohio, along with states like Pennsylvania, North Carolina, and Tennessee, uses the SAS/EVAAS value-added model. This model uses exactly what CCSA describes as ideal: longitudinally linked, individual-level student data to calculate a school’s value-added result.
The clear advantage of student-level data is that they allow analysts to track actual learning gains for each student over time. This is the crucial information needed to evaluate a school fairly and robustly. Student data allow analysts to answer the fundamental question: Is a school making an impact on students (i.e., contributing to learning gains)? This is why many states, including Ohio, have wisely incorporated a value-added model into their accountability systems.
It is also worth noting that the finest education research uses student-level data, generally avoiding school-level data. Prominent researchers have used rich troves of (non-identifiable) student-level data to evaluate states’ charter schools, the long-run benefits of effective teachers, voucher and tax-credit programs, early college high schools, and school closures.
Unfortunately, school-level data do not allow analysts to chart student learning trajectories. As a result, analysts cannot make strong claims about the impact of a school when using only school-level data—which explains the clear note of caution from CCSA.
Moreover, using school-level data could create perverse incentives that may actually harm students. When applied widely, a CSSM-like model could encourage schools to manipulate their student demographics to their own advantage. For example: A school could be tempted to counsel out low-achieving students, since raw achievement, aggregated at the school level, is the outcome variable in CSSM’s regression. (The same perverse incentive exists when a proficiency measure alone is used for accountability.) Yet because it examines students’ learning trajectories over time—not a static measure easily known to school administrators—the SAS/EVAAS model is less susceptible to manipulation.
The SAS/EVAAS statistical model isn’t the Holy Grail of student growth measures. But it’s a whole lot better than the California model, both in its technical properties and in the incentives it establishes. If the legislature adopted CSSM, it would be like trading in a Ferrari for a horse and buggy.
***
So what explains the dissatisfaction with the SAS/EVAAS model and the sudden impulse to introduce the California model? I’ve heard several criticisms of Ohio’s value-added model that rest on misconceptions. Let’s address two issues that may be driving the discussion.
One argument is that the SAS/EVAAS model cannot be valid because it doesn’t yield a “bell-shaped quality curve.” Many observers, myself included, have noted the odd distribution of the state’s A–F value-added ratings (Figure 1). Something certainly seems amiss.
Figure 1: Distribution of value-added ratings, Ohio schools, 2013–14
[[{"fid":"114450","view_mode":"default","fields":{"format":"default"},"type":"media","link_text":null,"attributes":{"class":"media-element file-default"}}]]
But in reality, the value-added scores yield a fairly normal, bell-shaped curve at the school level (Figure 2).
Figure 2: Distribution of value-added scores, Ohio schools, 2013–14
[[{"fid":"114451","view_mode":"default","fields":{"format":"default"},"type":"media","link_text":null,"attributes":{"class":"media-element file-default"}}]]
Source: Ohio Department of Education. Notes: The value-added index scores range from -24.34 to 22.10; a small number of outlier schools are excluded from the chart for clearer display. The mean score is 0.98, and the number of schools is 2,573. The index score is the estimated mean gain divided by its standard error; the index scores are used to determine schools’ A–F ratings. For the cut points in the A–F ratings, see here.
The problem isn’t with the results of the value-added model. Rather, the oddly shaped A–F rating distribution is a function of where the legislature has set the “cut points” for each letter grade (ORC 3302.03). Something is lost when the value-added scores are translated into value-added ratings. (The A–F cut points may need to be set differently, but that’s a topic for another day.)
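To make the translation concrete, here is a minimal sketch of how a continuous index score becomes a letter grade. The cut points below are illustrative placeholders, not the statutory values in ORC 3302.03.

```python
# Sketch: converting a continuous value-added index into an A-F rating.
# The index is the estimated mean gain divided by its standard error.
# These cut points are placeholders for illustration, not the values
# actually set in ORC 3302.03.
def letter_grade(index_score: float) -> str:
    cut_points = [(2.0, "A"), (1.0, "B"), (-1.0, "C"), (-2.0, "D")]
    for threshold, grade in cut_points:
        if index_score >= threshold:
            return grade
    return "F"

# A school with an estimated mean gain of 1.2 and a standard error
# of 0.8 has an index of 1.5; these placeholder cut points call it a "B".
print(letter_grade(1.2 / 0.8))  # -> B
```

Shift the thresholds and the shape of the A–F histogram changes, even though the underlying index distribution does not. The grade distribution, in other words, reflects the cut points as much as it reflects the model.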
Second, the SAS/EVAAS model is sometimes characterized as a “black box.” It is true that the statistical modeling is complex and that the student data—as they should be—are not made available to the public. Yet there is ample documentation of the procedures, business rules, and methods that SAS uses to conduct the data analysis. You can read its technical manual and a brief that summarizes the models (and common misconceptions) in everyday language. Meanwhile, there is no clear reason to believe that SAS—a widely respected analytics company (think IBM or Oracle)—would be anything less than an independent and credible partner in analyzing Ohio data.
In the end, the precise mechanics behind products are often unfamiliar to the layperson. Do I perfectly understand the inner workings of a car engine? Nope. Could I describe, in exact detail, how a cell phone functions? Or the World Wide Web? I could not. But neither do I disparage car engines, cell phones, or the Internet as “black boxes.”
***
Let’s be frank. Those most apt to criticize the SAS/EVAAS value-added model are probably also those affiliated with the schools that perform poorly along this measure. It’s no secret that in the education world, virtually everyone believes their own school is da bomb—even when there may be trouble brewing. When the results aren’t glowing, it’s easy to criticize the methodology or dismiss the results as “invalid.” Sometimes they’ll even cook up their own methodologies that (surprise!) make their schools look better. All sour grapes, I say.
There is no compelling reason for Ohio to incorporate the California model into the state accountability system. Compared to Ohio’s value-added measure, CSSM is a very crude way of getting at student growth. Heck, don’t even take my advice—just listen to its creators.
Many people have misconceptions about career and technical education (CTE) that are grounded in an archaic view labeling CTE as “blue-collar stuff” for kids who aren’t on a college path. A recent piece in the Wall Street Journal, however, points out that “CTE today is far more demanding than vocational tracks a generation ago, which were often seen as dumping grounds for students who couldn’t handle college-preparatory classes.” Richard Kahn, the chief executive of a CTE school in Manhattan, says that his school’s goal is to “get everybody into the middle class economy.” In a guest piece on Flypaper in March, Sean Lynch of the Association for Career and Technical Education noted that CTE programs also “open doors to new career exploration opportunities, lower high school dropout rates, and engage at-risk students with interesting curriculum.”
But what does CTE look like on the ground? For answers, let’s take a look at Ohio’s career and technical education programs.
Beginnings
In Ohio, the law requires public schools to provide students the opportunity to take CTE courses beginning in seventh grade (though most students wait until high school to enroll). Ohio’s CTE programs are designed to align with the technical content standards of a chosen field (which allows for hands-on training and real work experience) and the state’s academic standards. This means that every CTE program is required to teach the state’s required academic content standards in math, ELA, science, and social studies (in Ohio, the math and ELA standards are Common Core).
The Ohio Department of Education lists eighteen career and technical programs, ranging from agricultural and environmental systems to law and public safety. Each program offers specific courses that must be completed, follows content standards specific to its field, and requires completion of a career-technical assessment. Several programs comprise multiple disciplines. For example, the law and public safety program offers pathways for law, emergency medical care, fire science, forensic science, and criminal justice.
Governance
Ohio has two main groups responsible for overseeing career and technical education: Career and Technical Planning Districts (CTPDs) and Joint Vocational School Districts (JVSDs).
CTPDs are local agencies that handle the administrative duties of CTE. There are currently ninety-one CTPDs in Ohio. They are made up of varying members—which can be school districts, charter schools, STEM schools, or others—but are run by one “lead” district that approves the various programs those members use. For example, Hudson City Schools is the lead district for a CTPD called the Six District Vocational Education Compact, which also contains five other local school districts: Cuyahoga Falls, Kent, Stow-Munroe Falls, Tallmadge, and Woodridge. Each of these districts runs CTE programs that students from other districts in the CTPD are able to attend.
JVSDs, on the other hand, are direct education providers. They are similar to traditional school districts in that they serve students from a certain regional area, but they exclusively offer specialized CTE programs. Many JVSDs also utilize open enrollment: Each must adopt a resolution that either allows or prohibits attendance by students from other districts. There are forty-nine JVSDs in Ohio (each one is a member of a CTPD). For example, the Lorain County JVSD is the lead district in a CTPD that also contains thirteen other traditional school districts.
Accountability and results
Just as the performance of district and community schools is measured and rated in a publicly released state report card, CTPDs also receive report cards (which are published online). CTPD report cards assign the same A–F letter grades that traditional district schools receive and include three of the same components: achievement, graduation rates, and prepared for success. Two of the components, however, are measured using slightly different, CTE-specific metrics. The achievement component is measured by how many students pass the assessments relevant to their CTE program, while the prepared-for-success component is measured via dual enrollment, AP participation, and honors diploma numbers. CTPD report cards also include a post-program element, which measures the number of students who left high school the previous year and are employed, in an apprenticeship, in postsecondary education or advanced training, or in the military. This measure additionally tracks the number of students who left school and earned industry credentials.
While CTPD report card results vary, there is anecdotal evidence that the programs are moving in the right direction. Back in February, Education Week posted a piece highlighting two Ohio CTE schools that focus on global education. The Ohio Department of Education has an entire page devoted to CTE success stories. Joseph Neyhart, a graduate of the Toledo Technology Academy (TTA), discussed in a recent Huffington Post piece how today’s CTE is vastly different from “last century’s voc-tech programs.” Neyhart didn’t just get the hands-on experience we might expect from a CTE program: He also participated in national competitions, completed a full-year internship at General Motors, and earned twenty-six college credits in classes like calculus, computer-aided design, and mechanical engineering. When Neyhart graduates from college (he just finished his freshman year), he’ll likely have a full-time position already lined up—his boss at his new internship has expressed interest in hiring him.
Of course, anecdotes aren’t real evidence of either success or quality. This policy analyst wants data, and lots of it, but such data are hard to come by due to the byzantine reporting system for CTE programs in Ohio. Parents certainly deserve more and better information. What we do know is that there are CTPDs with lackluster results, just as there are CTPDs with great ones. (It is also worth noting that CTPDs in districts that have struggled academically, like Akron, Youngstown, and Lorain City, are posting some encouraging numbers.) There is promise and room for growth in CTE programs—and so far, they’re on the right path.
It wasn’t that long ago that you could go from one end of your K–12 education to the other without ever laying eyes on a student with a disability. “In the early 1970s, these youths were marginalized both in school and in life, with only one-fifth of children with disabilities even enrolled in public schools,” notes Education Week, whose tenth annual “Diplomas Count” report focuses this year on students with disabilities. Today, nearly six million such students are enrolled in U.S. public schools, with the vast majority studying alongside non-disabled peers. They are “coming of age at a time when they, like all high school students, are increasingly expected to perform to high academic standards and to prepare for further education or training and a productive role in the workplace,” the authors observe.
How are they doing? Eighty-one percent of our public high school students can now expect to march across the stage and be handed a diploma within four years; that’s both a historic high and the headline finding of “Diplomas Count 2015.” However, the graduation rate among students with disabilities is 62 percent—a figure that masks wild (and somewhat suspicious) variations from state to state: from a low of 23 percent in Mississippi to a high of 80 percent in nearby Arkansas. Education Week is particularly strong in unpacking those disparities, which can be heavily influenced by both discipline practices that “disproportionately affect special education students” and variations in state graduation requirements, some of which “may be less rigorous for students with disabilities than for their peers.”
In Fordham’s home state of Ohio, the data are a mixed bag. The adjusted cohort graduation rate for students with disabilities is 69 percent, above the national average. Students with disabilities are also above the national average for time spent in regular classrooms (with the exception of students with emotional disturbances, whose mainstreaming rate lags in Ohio). NAEP proficiency for Ohio’s students with disabilities is above the national average in math but below it in reading, and both drop sharply between fourth and eighth grade, as they do for their peers nationally. For Ohio’s non-disabled students, proficiency rates above the national average are de rigueur. Achievement gains are generally impressive for Ohio’s disabled students (except in eighth-grade reading), while achievement gains for non-disabled students are above the national average (except in fourth-grade reading).
The theme that emerges is the need for early and comprehensive transition planning to prepare special education students to go it alone, without the resources and supports they receive during their school years. Profiles of young adults with a range of disabilities (a lab school student now successfully attending culinary arts school; twin brothers who graduated high school with modified diplomas and found jobs but “could have benefitted from more career direction in high school”) go a long way toward making the package more affecting than a typical data-fest. The key to preparing students like these for launch is setting “ambitious but realistic goals for students with disabilities, and helping them navigate the often-unfamiliar terrain of the post-high school world.”
In Ohio, vocational/career transition planning for students on Individualized Education Programs (IEPs) has long been required. Two years ago, the state lowered the age at which this planning must begin (from sixteen to fourteen); it also made changes to require clear statements of goals and the services being provided to meet them. The intention was to give children a voice in the plan and its execution, and for the state to provide oversight in this area.
To be sure, “Diplomas Count” offers data by the dump truck load. A nifty interactive map allows users to make instant comparisons of state graduation rates, sortable by limited English proficiency, socioeconomic disadvantage, race, and other subgroups. Some of the more interesting nuggets: Nationally, graduation rates for disadvantaged racial and ethnic groups remain substantially below those of their white and Asian peers; graduation rates are lower for students with disabilities in every state; and the largest gap between the graduation rate of disabled students and the at-large rate is 53 percentage points, in Mississippi (Alabama’s 3-point gap is the smallest).
The outlook for students with disabilities after graduation, the report concludes, isn’t negative—just mixed.
SOURCE: “Diplomas Count 2015: Next Steps: Life After Special Education,” Education Week (June 2015).
Elsewhere in this issue, you read about the “Youngstown Plan,” which sharpens the teeth of Ohio’s Academic Distress Commission (ADC) protocols for persistently troubled school districts. While newspaper editors and citizen groups in Youngstown have been calling for something stronger than the existing ADC for a while now, a singular moment of opportunity has facilitated the new plan’s rapid adoption. The departure of former Youngstown Superintendent Connie Hathorn and the installation of a six-month interim superintendent make for a perfect setup for this transition. Youngstown has been in academic and financial trouble for decades, and the district has been formally under the ADC’s thumb for the past five years, yet the needle of success has barely budged.
Meanwhile, in Ohio’s other current ADC district, Lorain City Schools, a new superintendent was named the same day the Youngstown Plan passed. As the vote concluded, the chair of Lorain’s ADC warned that the new legislation could also become the “Lorain Plan,” complete with the selection of a new CEO and the creation of a new commission light on local appointees. He’s right: Lorain’s ADC, like Youngstown’s, has struggled mightily in recent years. Other districts, including Dayton and Trotwood-Madison, are also at risk of entering the ADC process due to persistent academic struggles.
These changes pale in comparison to the unique structure implemented in Cleveland, which includes mayoral control; an appointed board; a CEO; laws that make parental involvement mandatory; a focus on building-level autonomy; and unprecedented efforts at integration of district, charter, and STEM schools. We have remarked on signs of success and signs that more work remains to be done. The latest report from the Cleveland Transformation Alliance shows an encouraging decrease in the number of kids in failing schools and an increase in the number of students in high-performing schools since 2012. Still, too many students in Cleveland remain too far behind academically to claim success, and even die-hard supporters want to see faster progress.
But when a similar clean-sweep strategy was proposed for Columbus in 2013, no amount of bipartisanship, cooperation between state and local government, or heavyweight education commission members could induce voters in the capital city to accept such immediate and disruptive change to the status quo. Surely part of the defeat can be blamed on distaste for the district’s scandal-ridden ancien régime. But a dynamic superintendent and a reconfigured board are now getting off the mat and taking steps to break out of stagnation. The city should build on this positive momentum and rally community support to expand high-quality schools.
Over the years, we at Fordham have documented the urgent need for excellent schools in Ohio’s urban communities. Too many needy students—we daresay more than fifty thousand Ohio youngsters—remain trapped in bad schools and are thereby denied their one opportunity to succeed academically and have a better chance at a happy life. These children deserve the absolute best from our local and state leaders, and whether it is bold change in Cleveland, a rejection of the bad old days in Columbus, or a new idea coming soon to Youngstown, it is to be hoped that many more children have a brighter future ahead of them.
Last week, Ohio policymakers took a bold step toward strengthening education in persistently low-performing districts. House Bill 70, which passed both legislative chambers, grants significant new powers and responsibilities to the state’s academic distress commissions. Among the key provisions is a call for an appointed chief executive officer who would lead each district’s reform efforts.
Created by the state in 2007, academic distress commissions are triggered when districts fail to meet basic academic standards. Presently, two districts—Youngstown and Lorain—are overseen by separate commissions. The commissions’ key features were spelled out under present, but now soon-to-be-retired, state law.
Unfortunately, these arrangements were largely toothless. The commission existed only to assist the district and to draw up recovery plans—not necessarily to operationalize them. The old law also left intact existing governing entities that could block the commission’s efforts: The locally elected school board and district superintendent, for example, remained in place—a recipe for political turf wars and bickering.
In Youngstown, the tension between local officials and the commission has been palpable and clearly counterproductive. According to the Youngstown Vindicator, there has been “almost constant carping” by the school board about the commission. Meanwhile, the district superintendent, fed up with school board politics that “killed his spirit,” recently high-tailed it out of town for a job in Arkansas. On the heels of this turmoil, the Vindicator pleaded for Governor Kasich and state authorities to become more aggressively involved.
So state leaders answered the call.
House Bill 70 considerably strengthens the commission in a few ways. First and foremost, the commission will appoint the district’s chief executive officer, effectively eliminating the local board’s right to hire a district chief. The CEO will be vested with full managerial rights and operational control. Second, the law will strip the school board of one of its two commission appointments. Instead, the law allows a local executive, such as the city’s mayor, to make the appointment. Third, the commission will be responsible for expanding high-quality school options, including support for community learning centers and the reconstitution of failed schools. Special state funds may be appropriated to help kick-start the growth of better choices.
Some of these revisions reflect movements across the nation, in which states are ratcheting up their intervention policies to deal with chronically sick school systems. In Tennessee and Louisiana, for example, state leaders have stepped in to create recovery school districts (RSDs). Chris Barbic, the superintendent of Tennessee’s recovery district (known as the Achievement School District), has been widely credited with leading visionary, albeit challenging, reforms. Barbic and his team have recruited high-performing charter operators, leveraged the talent pipeline of Teach for America, and created a teacher residency program. The results from New Orleans and Memphis have been promising, and other states are now exploring or implementing RSD-like models.
Ohio leaders aren’t fiddling while Rome burns either. Governor Kasich and the legislature enacted the Cleveland Reform Plan in 2012, and it’s beginning to bear fruit. Academic distress commissions, a relatively new tool for reform, also have strong potential. But as originally conceived, the commissions were far too weak. By strengthening the hand of the commission—and empowering a CEO with full management rights—House Bill 70 should greatly improve education in Ohio’s neediest communities.
“The buck stops here,” said the famous sign on President Truman’s desk. The statutory revisions to Ohio’s academic distress commissions will help to ensure that a willing and able executive is in charge of districts desperately in need of reform.
In the midst of debates about whether school is the best place to combat the effects of poverty, several educational institutions have taken it upon themselves to integrate non-academic poverty-relief supports into their academic programs. According to a new report from the Clayton Christensen Institute for Disruptive Innovation, these schools offer unique on-the-ground efforts to support high-need students above and beyond the traditional academic model. They include KIPP, SEED schools, the Harlem Children's Zone, and community-based schools like those found in Cincinnati Public Schools (CPS).
Each organization offers its own take on anti-poverty programming. KIPP focuses on extended school days and years, character education, and initiatives like KIPP Through College, which includes step-by-step assistance in the college admission process as well as after-school tutoring and counseling. These are services that other high-poverty schools struggle to offer. KIPP is also extending its services in specific locations; KIPP Houston, for instance, features a school-based health clinic called KIPP Care. The SEED schools, meanwhile, take these efforts even further with a one-of-a-kind public boarding school model: Those enrolled live on campus five days a week, then head home for the weekend. Students, many of whom come from disadvantaged communities, benefit from a character education program, mental health and counseling services, off-campus and summer enrichment opportunities, and college transition advising. Both SEED and KIPP boast impressive academic achievement numbers. (For a look at a potential SEED school in Ohio, see here.)
The Harlem Children’s Zone (HCZ), meanwhile, is a nonprofit serving a ninety-seven block area of its namesake neighborhood. Its umbrella of assistance includes parenting workshops, preschool programs, charter schools, after-school programs, a college success office supporting HCZ college students, and free legal and financial services for families.
The report also looks at community schools, which centralize a range of services inside one building by partnering with local service providers. (To be clear, this term doesn’t refer specifically to charter schools, which Ohio law calls “community schools.”) Cincinnati Public Schools is currently transitioning fifty-five of its schools into Community Learning Centers (CLCs), which join with a variety of outside organizations to offer recreational, educational, social, health, civic, and cultural opportunities to students, families, and the broader community. Its ethos mirrors that of Communities in Schools (CIS), the nation’s largest wrap-around services organization. Both organizations operate through on-site coordinators, who assess student needs and recommend services.
The report’s authors are careful to note that there are concerns around non-academic poverty supports, mainly related to cost and scalability. Outsourcing services the way CLCs do (an approach the authors call “modular”) may seem more efficient and less expensive, but it is impractical in the long run because it entails a loss of control for schools. That loss of control means that educators on the ground—those most knowledgeable about what services are needed—can’t oversee the balance of services offered to each student. On the other hand, the interdependent, integrated models offered by SEED, KIPP, and HCZ allow educators to determine the exact mix of supports that students need, make them available, and then permit researchers to measure success. Only once this mix is identified can schools transition to the affordable, scalable model necessary to ensure that all students receive the help they need.
SOURCE: Michael B. Horn and Julia Freeland, “The Educator's Dilemma: When and how schools should embrace poverty relief,” Clayton Christensen Institute for Disruptive Innovation (June 2015).