State tests are an important annual check-in for parents, teachers, communities, and policymakers, as they provide an objective assessment of student achievement based on grade-level standards. Accompanying growth measures offer a different, though no less important, snapshot of student progress from year to year. Because state tests are standardized, they make it possible to track achievement and growth over time, and to compare student populations and schools. Such comparisons are often the best way to pinpoint problems (for instance, learning loss courtesy of a global pandemic), evaluate the effectiveness of specific policies (like third-grade reading retention), and ensure that all students—regardless of race, gender, disability, or geographic location—have access to quality schools and educational opportunities.
In short, the information that tests provide is crucial. But that doesn’t mean there isn’t room for improvement. In fact, there are plenty of ideas out there about how to make tests more engaging, how to gather better information, and how to make sure the process remains rigorous and fair. Not all of these ideas are a good fit for Ohio. But there is one that Ohio leaders should consider: computer adaptive testing.
As the name suggests, computer adaptive tests adapt to students during the assessment process via computer software. When taking traditional, “fixed form” tests—what Ohio currently uses for its state exams—the difficulty level is the same for all students. With adaptive tests, however, the difficulty of each question is determined by the student’s previous responses. Although all students are tested on their current grade-level standards, students who answer questions correctly receive harder items, while those who struggle receive easier ones. (To better understand how questions are selected for each student, check out this video from the Virginia Department of Education.) The goal is to provide students with questions that are neither too difficult nor too easy. By targeting questions to a student’s ability level, the assessment provides a much clearer picture of what each student knows.
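To make the selection mechanics concrete, here’s a minimal sketch in Python of how an adaptive engine might pick questions. Everything in it is illustrative: the tiny item bank, the simple up-and-down “staircase” ability update, and the simulated student are assumptions for the sake of the example, not the algorithm Ohio’s vendors actually use (real engines rely on item response theory and large, professionally calibrated item banks).

```python
import math
import random

# Hypothetical item bank on a logit difficulty scale. Real adaptive tests
# draw from large banks calibrated with item response theory; these nine
# difficulty values are made up for illustration.
ITEM_BANK = [
    {"id": i, "difficulty": d}
    for i, d in enumerate([-2.0, -1.5, -1.0, -0.5, 0.0, 0.5, 1.0, 1.5, 2.0])
]

def next_item(ability, answered_ids):
    """Pick the unanswered item closest in difficulty to the current
    ability estimate -- the "neither too hard nor too easy" rule."""
    remaining = [item for item in ITEM_BANK if item["id"] not in answered_ids]
    return min(remaining, key=lambda item: abs(item["difficulty"] - ability))

def run_adaptive_test(respond, num_items=5, step=0.5):
    """Administer a short adaptive test.

    `respond(item) -> bool` stands in for the student answering an item.
    A correct answer nudges the ability estimate up; an incorrect one
    nudges it down (a simple staircase -- real engines re-estimate
    ability statistically after every response).
    """
    ability = 0.0  # start every student at the middle of the scale
    answered = set()
    for _ in range(num_items):
        item = next_item(ability, answered)
        answered.add(item["id"])
        ability += step if respond(item) else -step
    return ability

if __name__ == "__main__":
    # Simulate a student whose true ability is +1.0: they usually get
    # easier items right and harder ones wrong (a Rasch-style model).
    true_ability = 1.0

    def simulated_student(item):
        return random.random() < 1 / (1 + math.exp(item["difficulty"] - true_ability))

    print(f"Estimated ability: {run_adaptive_test(simulated_student):+.1f}")
```

Even this toy version shows the key property: after a few responses, the questions a student sees cluster around their demonstrated ability rather than marching through a fixed sequence.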
Computer adaptive tests have several benefits. For students, taking a test that finds a happy medium between questions that are too challenging and ones that are too easy is a much more engaging process than traditional testing. Students who are advanced are less likely to get bored, those who struggle are less likely to get frustrated, and all students benefit from a more personalized opportunity to show what they know. Furthermore, because tailored questions can establish mastery quickly, the test could end up being shorter for some students. That’s not possible with fixed form assessments, which require every student to answer a set number of questions regardless of whether they have already demonstrated mastery.
Many Ohio students are also already familiar with adaptive assessments thanks to the tests their schools administer for diagnostic purposes. NWEA, for example, has been approved as an alternative testing vendor by the Ohio Department of Education and Workforce. Over 40 percent of Ohio students take MAP Growth, NWEA’s computer adaptive assessment that measures student performance in math, reading, and science. i-Ready, another state-approved assessment that’s widely used in Ohio schools, is adaptive as well. Ohio’s Alternate Assessment for Students with the Most Significant Cognitive Disabilities—a federally required state assessment—is computer adaptive. The SAT is now fully digital and adaptive. And the Armed Services Vocational Aptitude Battery (ASVAB) is computer adaptive, too.
For educators and parents, adaptive tests promise more actionable results. By providing a more precise assessment of where students excel and where they struggle, these tests can help teachers and parents tailor support and interventions to specific topics rather than broad, grade-level areas.
For policymakers, advocates, and researchers, adaptive tests could lead to better, more exact measures of student growth on state report cards. They could offer a more precise gauge of whether students learned a year’s worth of material in a year’s time, something current state assessments don’t do. If so, we’d have a clearer picture of which schools are doing excellent work and which ones might need more support. Transitioning to adaptive testing should also be relatively easy. ESSA, the federal education law, permits states to administer their annual statewide assessments as computer adaptive tests. And although Ohio would likely be required to update its ESSA plan to make the switch, state officials wouldn’t need to worry about federal approval. Many states already administer computer adaptive tests as their annual exams, including Virginia, Michigan, and Indiana.
Given all these benefits, it seems like a no-brainer for Ohio leaders to consider transitioning to computer adaptive testing. But the key word here is “consider.” Even the smallest change to state tests can cause controversy and strife if handled incorrectly, and this wouldn’t be a small change. Furthermore, although many students and teachers might be familiar with adaptive testing, many parents might not be. Failing to communicate clearly and preemptively address misconceptions could be a death knell. Timing is also an issue. With so many other policy challenges on their plates, state lawmakers need to proceed with caution.
To ensure that computer adaptive tests don’t become yet another promising idea that’s doomed by hasty adoption or lackluster implementation, state policymakers should call for a feasibility study. As part of this study, officials at the Department of Education and Workforce could examine the work of other states, evaluate different ways to bring adaptive testing to Ohio, gather feedback from key stakeholders, and recommend ideas that could head off potential problems. The results of this study would give leaders plenty of Ohio-specific feedback to incorporate into eventual policy and practice. Most importantly, starting with a feasibility study would ensure that Ohio has all its ducks in a row before making such a big change. Computer adaptive testing could be great for the Buckeye State—but only if we do it right.