Critics of test-based accountability sometimes argue that there’s little evidence that schools that boost students’ test scores also prepare them for long-term success. A recent study by Daniel Hubbard, commissioned by the Institute of Education Sciences, helps fill this gap by examining how attending a high school with high value added affects students’ first-year grades in college.
Hubbard uses student-level test scores and demographic data from Michigan’s public middle and high schools to estimate school-level value-added scores, then merges those with college records to measure postsecondary grade point average. His sample includes all students in Michigan public schools who first took the eighth-grade Michigan Educational Assessment Program (MEAP) tests in math and reading between the 2005–06 and 2007–08 school years. To be included, students must also have taken the eleventh-grade state test and enrolled in a course at a Michigan public college within five years of their eighth-grade test.
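Hubbard’s exact specification is more involved than can be reproduced here, but a stylized two-step sketch of the general approach (residualize later test scores on prior scores and demographics, average residuals by school, then merge onto college GPA) helps fix ideas. All file and column names below are hypothetical, and this is an illustration of the idea rather than the paper’s model.

```python
# Stylized value-added pipeline, not Hubbard's actual specification:
# school value added is taken here as the school-mean residual from a regression
# of eleventh-grade scores on eighth-grade scores and a demographic control,
# then merged onto first-year college GPA. File and column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

students = pd.read_csv("michigan_students.csv")  # hypothetical student-level file
college = pd.read_csv("college_gpa.csv")         # hypothetical first-year GPA file

# Step 1: residualize eleventh-grade scores on prior achievement and demographics.
fit = smf.ols("grade11_score ~ meap8_math + meap8_read + econ_disadv",
              data=students).fit()
students["residual"] = fit.resid

# Step 2: a school's value added is the mean residual of its students, standardized.
va = students.groupby("school_id")["residual"].mean().rename("value_added")
va = ((va - va.mean()) / va.std()).reset_index()

# Step 3: attach school value added and first-year college GPA to each student.
analysis = students.merge(va, on="school_id").merge(college, on="student_id")
```

The real estimation adds richer controls and handles measurement error in small-school averages; the sketch only shows the residualize-average-merge logic.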
Hubbard uses a number of empirical adjustments and robustness checks to address selection and sorting into high schools and colleges. These include, for instance, restricting the sample to students who are very likely to go to college, defined as those who meet all of the ACT’s college-readiness benchmarks. In theory, these students’ college-going decisions are less likely to be swayed by the quality of their high school.
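As a concrete (and purely illustrative) version of that restriction, the filter below keeps only students who clear all four published ACT College Readiness Benchmarks (English 18, Math 22, Reading 22, Science 23); the DataFrame and its ACT score columns are hypothetical.

```python
# Illustrative college-readiness restriction: keep only students who meet all four
# published ACT College Readiness Benchmarks. Column names are hypothetical.
import pandas as pd

ACT_BENCHMARKS = {"act_english": 18, "act_math": 22, "act_reading": 22, "act_science": 23}

def restrict_to_college_ready(df: pd.DataFrame) -> pd.DataFrame:
    """Return the subset of students meeting every ACT readiness benchmark."""
    meets_all = (df[list(ACT_BENCHMARKS)] >= pd.Series(ACT_BENCHMARKS)).all(axis=1)
    return df[meets_all]
```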
Yet none of these adjustments, or the others Hubbard runs, changes the overall tenor of the key finding: there is a statistically significant, positive relationship between high schools’ value added and college course grades. Attending a high school with value added one standard deviation above average raises grades by about 0.09 grade points relative to an otherwise identical student at an average high school, roughly one third of the difference between a B and a B+. Results broken out by subject area are similarly positive and statistically significant for both tested subjects (math and English language arts) and untested ones, the latter spanning a wide spectrum of courses such as psychology, business, and even welding.
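To make the “one third” comparison concrete, here is a quick back-of-the-envelope check, assuming the conventional grade-point values of 3.0 for a B and 3.3 for a B+:

```latex
\[
\underbrace{3.3 - 3.0}_{\text{B to B+ gap}} = 0.3,
\qquad
\frac{0.09}{0.3} = 0.30 \approx \tfrac{1}{3}.
\]
```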
Effects are larger for black students than for white students and slightly larger for poor students. They are also larger for students at schools that are low-scoring (based on their students’ average eighth-grade exam scores) but have high value added. In other words, attending an effective school, as measured by value-added gains, is especially important for disadvantaged students.
Hubbard concludes that his results “imply that schools with high value added are not earning those scores by teaching to the test or by reallocating resources toward tested subjects, but instead by preparing students effectively to perform well on the standardized test and beyond.” That’s mighty good news for the high-flying schools that invest copious blood, sweat, and tears into preparing their students not just for the here and now, but for the elsewhere and later. And it’s also good news for the testing-and-accountability movement, given that it shows that test score gains are related to other outcomes most everyone agrees are important.
SOURCE: Daniel Hubbard, “More Gains than Score Gains? High School Quality and College Success,” Working Paper (October 13, 2017).