How Ohio can make the transition to new test scores
Ohio should take an honest look at student achievement and address it head on
Most states, including Ohio, have reported large majorities of students as proficient on annual exams over the past decade. This has led the public and parents to believe that most students are doing just fine. Sadly, however, we also know that too many young people require remedial education when they enter college, have great difficulty finding gainful employment, or can’t pass the test to serve our country in the military. A staggering 65 percent of first-year students in Ohio’s two-year colleges require remediation, while the rate is nearly 35 percent in some four-year universities.
A wide chasm—an “honesty gap”—has emerged between how student success in the K–12 realm is portrayed versus how colleges and employers view the skills of those leaving high school. To bridge that gulf, states have adopted higher learning standards, including the Common Core in math and English language arts, as well as rigorous next-generation assessments that are aligned with them. With these new exams in place, the practice of vastly overstating student proficiency is drawing to a close.
Indeed, several states have already unveiled 2014–15 results from Common Core-aligned assessments. Connecticut, a member of the Smarter Balanced Assessment Consortium (SBAC), recently reported that 39 percent of its students reached proficiency in math and 55 percent in reading. (Between 75 and 85 percent of its students were deemed proficient by its old assessments.) SBAC states like Missouri, Vermont, and Washington have reported similar results, with student proficiency of roughly 40–60 percent representing significant declines from previous years. In 2012 and 2013, Kentucky and New York—early adopters of Common Core-aligned assessments—also reported proficiency declines.
These results reflect more demanding test questions, as well as the higher expectations embedded in the definition of “proficiency.” States like Connecticut and Missouri should be commended for giving their citizens, especially parents, a clear look at just how many students are truly on track. It’s also worth noting that the results from these next-generation assessments are more in line with the “Nation’s Report Card,” which indicates that about 35–40 percent of American students graduate at the “college-prepared” level.
Ohio policymakers should pay attention to the lower test results in other states, as declines of a similar nature are almost certain to be seen here. (Results from the 2014–15 administration of the PARCC exams are expected in late 2015 or early 2016.) In the meantime, state and local officials should take proactive steps to ensure a smooth transition to higher reporting standards when it comes to student proficiency. Here are three ways to achieve that:
First, the state board of education should recalibrate Ohio’s proficiency-based accountability metrics. The thresholds for assigning an A–F rating on the Performance Index (PI) measure will need to be reset to adjust to systematically lower achievement rates. For example, the PI score needed to earn an A will have to be revised from the current requirement of 108 (out of 120 possible points). The same holds true for the Indicators Met measure, for which 80 percent proficiency is currently required for a school to meet a specific indicator. Without adjustments, the overwhelming majority of schools could be rated a D or F on these proficiency-based measures. To maintain a credible accountability policy, the state board must ensure this doesn’t happen.
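To see why the cut points must move, consider the arithmetic of a performance index. The sketch below is purely illustrative (the level weights and the two student distributions are hypothetical, not ODE's official formula), but it shows how a drop in the share of students scoring proficient or above mechanically drags down a weighted index and would push most schools below today's letter-grade thresholds.

```python
# Illustrative only: a performance-index-style metric that awards weighted
# credit for each achievement level, scaled so that 120 is the maximum.
# The weights and the two student distributions below are hypothetical.

LEVEL_WEIGHTS = {
    "advanced": 1.2,
    "accelerated": 1.1,
    "proficient": 1.0,
    "basic": 0.6,
    "limited": 0.3,
}

def performance_index(shares):
    """Weighted index on a 0-120 scale; `shares` are fractions of students per level."""
    return 100 * sum(LEVEL_WEIGHTS[level] * share for level, share in shares.items())

# Old exams: roughly 85 percent of students at proficient or above.
old_exam = {"advanced": 0.40, "accelerated": 0.25, "proficient": 0.20,
            "basic": 0.10, "limited": 0.05}

# Harder exams: the same students, but only about 40 percent reach proficient.
new_exam = {"advanced": 0.10, "accelerated": 0.12, "proficient": 0.18,
            "basic": 0.35, "limited": 0.25}

print(round(performance_index(old_exam), 1))  # 103.0, near the current A cutoff of 108
print(round(performance_index(new_exam), 1))  # 71.7, a D or F under the old cutoffs
```

Nothing about the students changed between the two distributions; only the rigor of the test did. That is why holding today's letter-grade cut points fixed would misread lower proficiency rates as a collapse in school quality.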
Second, as Ohio switches from PARCC to an AIR/ODE-developed exam in 2015–16, state policymakers should make certain that the standard for reaching “proficiency” remains linked to a college- and career-ready benchmark. Once the new Ohio tests are given (if not before), the state should work swiftly to set performance standards (or “cut scores”) comparable to PARCC, SBAC, and the other college-ready assessments being widely administered. Ohio leaders cannot let the standard for proficiency slip as the state transitions to a new assessment.
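How might the state keep the proficiency bar comparable while changing tests? One widely used idea, offered here only as a hedged illustration rather than a description of Ohio's actual standard-setting process (which would involve educator panels and benchmark studies), is equipercentile-style linking: find the score on the new test whose percentile rank matches the percentile rank of the old test's proficiency cut. The scores and the cut below are synthetic.

```python
import numpy as np

# Hypothetical sketch of equipercentile-style linking with synthetic data.
# Real standard-setting would rely on common examinees or external benchmarks
# (e.g., NAEP) plus panel judgment; this only shows the basic percentile logic.

rng = np.random.default_rng(0)
old_scores = rng.normal(740, 30, 10_000)  # synthetic "old test" scale scores
new_scores = rng.normal(700, 25, 10_000)  # synthetic "new test" scale scores

old_cut = 750  # hypothetical proficiency cut on the old test's scale

# Percentile rank of the old cut among old-test scores...
pct_below_cut = (old_scores < old_cut).mean() * 100
# ...carried over to the same percentile of the new test's distribution.
equivalent_cut = np.percentile(new_scores, pct_below_cut)

print(f"{pct_below_cut:.1f} percent scored below the old cut")
print(f"comparable new-test cut: about {equivalent_cut:.0f}")
```

The point is not the particular numbers but the principle: "proficient" should continue to denote the same level of mastery, wherever that lands on the new score scale.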
Third, state and local education leaders should be ready and willing to candidly discuss the results from the 2014–15 test administration with parents and citizens. They absolutely deserve the truth about achievement, and if the PARCC results are anything like those reported by SBAC states, they will provide a sobering look at how few Ohio students are meeting rigorous grade-level expectations. While it’ll take courage to face the facts, it has to be done.
William J. Bennett, former U.S. Secretary of Education, recently wrote in Forbes about the shift toward higher standards and rigorous assessments:
While not easy, the transition marks a necessary reset that will give families a genuine measure of student development. And by raising the bar, and holding schools accountable to it, we will ensure more students are getting the resources they need to succeed at high levels of learning and to ultimately graduate high school fully prepared for a college or career of their choice.
He’s right on point: As many states make these transitions, their communities will gain a clearer picture of student achievement. But they will also need help from their education leaders to understand this new baseline. If Ohio’s leaders embrace this higher standard for student proficiency, it’ll lead to increased attention on student needs and more productive engagement with parents and communities. In the long run, these changes should breed greater confidence in our schools and better outcomes for children.
But if they shy away from, dismiss, or even openly scorn this honest look at achievement, thousands of young people will continue to graduate and then realize—much too late—that they aren’t ready to take the next step in life. It seems like an easy choice to me.
When talking about educational choice, most people focus on choosing a school. But true educational choice shouldn’t stop after a family chooses a school. After all, few schools can meet the educational needs of all of their varied students—or can they?
Course choice, a growing trend in K–12 education, provides public school students with expanded course offerings across learning environments from diverse, accountable providers. It may sound impossible, but for many Ohio students, this is already a reality. CTE programs offer personalized paths toward earning high school credits, industry credentials, and college credit. The College Credit Plus program empowers students in grades 7–12 to attend classes at participating public or private colleges after they’re admitted based on their college-readiness. For students who aren’t interested in existing CTE programs and aren’t deemed college- and career-ready, ilearnOhio seems like the perfect solution. Dubbed a “powerful tool” for students and educators alike, the online platform provides classroom resources (e.g., instructional support materials, assessments, and professional development resources) and a marketplace with online courses from a variety of developers. The marketplace offers students extended course options—but only if their family has a few hundred dollars to drop, since many of the credit-bearing courses are “fee-based.”
While some schools offer plenty of course choices and might not need ilearnOhio, not every Buckeye kid is lucky enough to attend a course-rich school. Thousands of students across the state, especially in rural or urban areas and small school districts, love the school they’re enrolled in and don’t want to choose another. But staying often means sacrificing the chance to take one—or more—courses that they want or need to take.
Ohio isn’t an anomaly. This is the case for thousands of students across the country (check out this sobering brief on limited course access from the U.S. Department of Education). I saw it firsthand as a high school English teacher in Memphis. At my urban, low-income school, students were severely limited by a lack of access to a wide variety of courses. Spanish and French were offered, but only for two years each. The only available sciences were physical science, biology, and chemistry, with no AP options. The highest math course was advanced algebra and trigonometry. Electives were rare. It was a painful reality, particularly since nearby schools’ options weren’t much better. Families and students alike asked the same questions each year: What about upper-level Spanish and French, or even an option for a different language—particularly since colleges like students with four years of a foreign language? What about the aspiring scientists who wanted to take physics, geology, anatomy, or an AP version of any of the core classes? What about the math aficionados who, without pre-calculus, statistics, or AP courses, didn’t have much of a chance to deepen their knowledge before the ACT or college? And let’s not forget electives.
A recent white paper from the Foundation for Excellence in Education explains that in states where a course access policy is in place, students are able to enroll in and earn credits from courses that aren’t traditionally taught in their own schools. Louisiana, for instance, has a course access policy that includes face-to-face, blended, CTE, and online courses. If you compare course access policies in states from Digital Learning Now’s Digital Learning Report Card 2014 with what the Buckeye State already offers, it’s clear that Ohio’s strong CTE programs and its College Credit Plus program are already two-thirds of a stellar course access policy. The remaining third is online courses that specifically meet the needs of students who aren’t interested in CTE and aren’t yet college- or career-ready. Well-intentioned as it may be, ilearnOhio doesn’t cut it as the third part of a course access policy; its course fees deny access to thousands of low-income families. If Ohio wants to complete the trifecta of course access, it must start offering free online classes to all students.
Before I lose half of my readers with sighs and grumbles about the well-documented struggles of Ohio online charter schools, let me be clear: The online courses we need for a complete course access policy don’t have to be offered through Ohio’s current online schools. In fact, I’d argue that they shouldn’t be. Ohio’s online charter schools have been an all-or-nothing proposition, forcing students to choose between their local school and an online option rather than allowing them to structure a hybrid model including both traditional and online classes. To make room for such a hybrid, Ohio could create a new mechanism for online course access—one that is independently run, financed, and held accountable for its results. The best example of this is the Florida Virtual School (FLVS), which Brookings evaluated in 2014. But Ohio wouldn’t have to copy Florida’s entire model. Instead, it could create an equivalent that complements the CTE and College Credit Plus programs. Obviously, this would need to start as a pilot program to prove its viability, but if it successfully met the needs of the kids it served, it could grow into a statewide program available to all schools.
It’s time for choice to extend beyond choosing a building. Ohio needs to take a deeper look at free course access for students. How would it work? And why would schools want to opt in to yet another new policy? Stay tuned for a more in-depth look at how Ohio policymakers could make expanded course choice a reality.
The latest report from the Center for American Progress opens with a detailed effort to define the problem of truancy. The causes are myriad: Family duties and instability at home can pull students out, while bullying and zero-tolerance policies can push them in the same direction. Regardless of the reason, chronic absenteeism has consequences for students, schools, the economy, and society. The authors successfully identify the problem for readers who do not deal with it daily, as many educators do.

The definition of truancy differs from state to state, while districts and schools have wide latitude to address absenteeism. Unfortunately, these factors have conspired to virtually require the development of “customized” approaches to addressing truancy when a common menu of solutions might lead to better outcomes. The report highlights successful efforts in California (defining “chronic truancy” for the first time in state law and tracking data on it statewide), Washington, D.C. (early warning and intervention program), New York City (improved data collection, incentivizing attendance), Baltimore (student-centered non-judicial “truancy court”), and Hartford (mentoring programs for students who trigger early absenteeism warnings).

From there, the authors extrapolate a variety of policy recommendations applicable to the federal, state, or local levels: Develop a national definition for truancy and its antecedents (chronic absenteeism, etc.), improve data collection and data sharing for early warning systems, increase wrap-around services and align them with student needs, reduce punitive policies that serve to push students out of school, and increase parental involvement and parental education opportunities in schools. These recommendations are all sensible—but they are, as might be expected, “traditional” in structure, advocating solutions found within the existing system. Some education reformers will read this report and want to include additional options, such as non-traditional daily schedules, alternate pathways to graduation, online courses, blended learning options, themed academies, and vouchers. The causes of truancy are varied and numerous. The policies and programs offered to combat it must be as well.
SOURCE: Farah Z. Ahmad and Tiffany Miller, “The High Cost of Truancy,” Center for American Progress (July 2015).
When we surveyed more than eight hundred college students six years ago, we found that most of them were planning to leave the state after graduation. This was a startling finding—and, recognizing its implications, Ohio leaders have made a concerted effort to retain college graduates (see here for an example). Meanwhile, the job market has improved since the nadir of the Great Recession, making the Buckeye State a more attractive location for young people.
But what do the statistics say about Ohio’s ability to retain college-educated young people? According to a new Manhattan Institute analysis, a growing number of them reside in Ohio’s urban areas—what the author calls a “brain gain.” To arrive at this finding, the study focuses on twenty-eight U.S. cities that lost population and/or jobs from 2000 to 2013. Five metropolitan areas in Ohio fit those criteria: Akron, Cleveland, Dayton, Toledo, and Youngstown. According to Census data, the number of college-educated young people—in the 25–34 age bracket and with at least a bachelor’s degree—increased in all five cities. Akron was the leader among Ohio cities with a 13 percent increase in college-educated young people, while Toledo was the laggard with an increase of just 1 percent. (When viewed as a share of the overall population, the fraction of college-educated young people is also rising.)
Before uncorking the champagne, we should provide some broader context: Are these increases keeping pace with national trends? (Educational attainment in the United States has risen substantially over time; see figure 7 here.) Sadly, for Dayton and Toledo, the answer is no—their increases fall short of the national trend. Meanwhile, the increases for Akron, Cleveland, and Youngstown outstrip the national average, though they still fall behind group leaders Pittsburgh, Scranton, and Buffalo.
The study’s author recently told the Cleveland Plain Dealer, “Places like Cleveland are better at attracting regional talent that might have skipped town [in past years].” Well said, at least in the case of Cleveland and the other Northeast Ohio cities. But retaining talented young people in the Dayton and Toledo areas remains a trouble spot—and one piece of solving that puzzle is growing high-quality schools that encourage young adults to stay and raise a family. If nothing is done, these cities will continue to falter as their counterparts in Ohio and across the region and nation zoom ahead.
SOURCE: Aaron M. Renn, Brain Gain in America’s Shrinking Cities (New York, NY: Manhattan Institute, August 2015).
A new analysis from Matthew A. Kraft at Brown University links the characteristics of laid-off teachers to changes in student achievement. The analysis was conducted in Charlotte-Mecklenburg Schools (CMS), which laid off just over a thousand teachers as a result of the Great Recession in 2009 and 2010. Since North Carolina is one of five states where collective bargaining is illegal, a discretionary layoff policy was used rather than the more common “last-hired, first-fired” (sometimes referred to as LIFO—last in, first out) method. CMS identified candidates for layoffs based on five general criteria: duplicative positions, enrollment trends, job performance, job qualifications, and length of service.
Kraft estimates the effects of these layoffs on student achievement by using both principal observation scores (which directly informed layoffs) and value-added scores (which were not used to make layoff decisions). This enabled him to compare the impact of a teacher layoff based on subjective and objective measures of effectiveness. The good news for CMS students is that, overall, laid-off teachers received lower observation scores from principals and had lower value-added scores in math and reading compared to their counterparts who weren’t laid off. Kraft found that math achievement in grades that lost an effective teacher decreased between .05 and .11 standard deviations more than grades that lost an ineffective teacher.
The difference between laying off a senior teacher versus an early-career teacher was substantially smaller and statistically insignificant, suggesting that effectiveness was not a function of experience. That said, some of the most effective teachers were not only veterans but teachers who returned to teaching after retiring (sometimes called “double dippers” since they draw a pension and a salary). Unfortunately, these teachers were among the first to be laid off—an unwise decision from an achievement standpoint (though perhaps not from a financial one) since they were substantially more effective than the average CMS teacher (their evaluation scores were two-thirds of a standard deviation higher than the district average).
Three key takeaways emerge: 1) principal evaluations and value-added scores both have predictive validity, while seniority alone has little predictive power when it comes to the impact of teacher layoffs on student achievement; 2) if achievement is to take precedence, schools should prioritize performance over seniority when staff reductions become necessary; and 3) policymakers should allow school leaders to use discretion when deciding which teachers to lay off rather than setting rigid rules around reduction-in-staff procedures.
SOURCE: Matthew A. Kraft, “Teacher Layoffs, Teacher Quality, and Student Achievement: Evidence from a Discretionary Layoff Policy,” Education Finance and Policy (August 2015).