Evaluating the Content and Quality of Next Generation Assessments examines previously unreleased items from three multi-state tests (ACT Aspire, PARCC, and Smarter Balanced) and one best-in-class state assessment, Massachusetts’ state exam (MCAS). The product of two years of work by the Thomas B. Fordham Institute, two rock-star principal investigators, and almost forty equally stellar reviewers, the report uses a new methodology designed to answer policymakers’ most pressing questions: Do these tests reflect strong content? Are they rigorous? What are their strengths and areas for improvement?
As our benchmark, we used the Council of Chief State School Officers’ Criteria for Procuring and Evaluating High-Quality Assessments. We evaluated the summative (end-of-year) assessments in the capstone grades for elementary and middle school (grades 5 and 8). (The Human Resources Research Organization evaluated high-school assessments.)
Here’s just a sampling of what we found:
- Overall, PARCC and Smarter Balanced assessments had the strongest matches to the CCSSO Criteria.
- ACT Aspire and MCAS both did well regarding the quality of their items and the depth of knowledge they assessed.
- Still, panelists found that ACT Aspire and MCAS did not adequately assess—or may not assess at all—some of the priority content reflected in the Common Core standards in both ELA/Literacy and mathematics.
Overall, programs received the following marks on content and depth across math and ELA.
Our reviewers spotted areas of strengths and improvement for all four programs:
- ACT Aspire’s combined set of ELA/Literacy tests (reading, writing, and English) requires close reading and adequately evaluates language skills. Its math test items are also generally high-quality and clear. In ELA/Literacy, reading items fall short in requiring students to cite specific textual information in support of a conclusion, generalization, or inference, and in requiring analysis of what has been read.
- MCAS’s ELA/Literacy tests require students to closely read high-quality texts, and both math and ELA assessments include a good variety of item types. While mathematical practices (such as modeling and making mathematical arguments) are required to solve items, MCAS does not specify their connections to the content standards.
- PARCC’s ELA/Literacy assessment includes appropriately complex texts, requires a range of cognitive demand, and includes a variety of item types. In math, the test is generally well-aligned to the major work of the grade. PARCC would better meet the criteria by increasing the focus on essential content at grade 5.
- Smarter Balanced’s ELA/Literacy tests assess the most important skills called for by the Common Core standards, and its assessments of writing and of research and inquiry are especially strong. In math, the test is also generally well-aligned to the major work of the grade. In ELA/Literacy, a greater emphasis on academic vocabulary would further strengthen Smarter Balanced relative to the criteria.
All four tests we evaluated boasted items of high technical quality, and the next generation assessments that were developed with the Common Core in mind have largely delivered on their promises. Yes, they have improvements to make, but they tend to reflect the content deemed essential in the Common Core standards and demand much from students cognitively. They are, in fact, the kind of tests that many teachers have asked state officials to build for years.
Now they have them.
Which test is your state using?
State Use of Next-Generation Assessments, 2016-17

| Assessment | States |
|---|---|
| PARCC (9 states) | CO, DC, IL, LA, MD, NJ, NM, NY, RI |
| Smarter Balanced (17 states) | CA, CT, DE, HI, ID, IA, MI, MT, NV, NH, NC, ND, OR, SD, VT, WA, WV |
| ACT Aspire (3 states) | AL, AR, SC |
| MCAS (1 state) | MA |
| Used PARCC or Smarter Balanced but dropped it in the last two years (7 states) | AR, ME, MS, MO, OH, WI, WY |
| Never used these assessments (15 states) | AK, AZ, FL, GA, IN, KS, KY, MN, NE, OK, PA, TN, TX, UT, VA |

Source: Education First Consulting, LLC