Executive Summary
This brief addresses the following ideas that have become conventional wisdom in some quarters:
Conventional wisdom (CW): “College admission exams are racist.”
No. The ACT and SAT are thoroughly vetted to weed out potential racially-, ethnically-, or class-biased questions.
CW: “College admissions exams limit students’ pathways into quality higher education.”
Yes, but only for a few students. In fact, a minority of American students attend colleges that require high entrance exam scores for admission.
CW: “Other parts of the college application promote equity better than entrance exams.”
Not necessarily. All components of a typical college application packet that researchers have studied, from letters of recommendation to personal essays, exhibit gaps between the averages of different student groups, suggesting that the gaps are driven by underlying differences in academic achievement and preparation rather than by any particular measure.
CW: “Admissions officers should just focus on grades.”
Wrong. Entrance exams provide information about schools and students that is distinct from grade point average.
CW: “The future of college admissions is ‘test optional.’”
Unclear. Because entrance exams are merely a tool, their usefulness will vary for different higher education stakeholders, and some stakeholders will probably continue to use exam scores for the foreseeable future, even if “test-optional” policies appear to be ascendant right now.
The Bottom Line
Recent years have seen many colleges and universities adopt “test-optional” admissions, but the evidence to date suggests that such policies will have, at most, small effects on the equity objectives that are often rhetorically tied to those policies. Other elements of the application packet exhibit gaps that are similar to those observed in exam scores, and evaluations of test-optional admissions policies show little effect on equity.
Policy Implications
1. Institutions should not decide whether to rely on college admissions exams based on concerns that they are racist or otherwise bad for equity.
2. More generally, because the best predictions of students’ program success will come from analyses that include a variety of measures, institutions should require and examine a range of relevant and predictive data for admissions. The key criterion for inclusion of a measure ought to be its value for predicting program success rather than its correlation with student background factors.
3. States should require and pay for all high school students to take the SAT or ACT, in order to help identify all students with high academic potential, especially those from underrepresented groups.
THINK AGAIN:
The following brief evaluates the extent to which the use of college entrance exam scores by higher education institutions contributes to differences in college admissions and completion observed across racial/ethnic and socioeconomic groups. It evaluates five statements that have become conventional wisdom among parts of the public or policymaking community: “College admissions exams are racist,” “College admissions exams limit students’ pathways into quality higher education,” “Other parts of college application packets promote equity better than entrance exams,” “Admissions officers should just focus on grades,” and “The future of college admissions is ‘test optional.’”
“College admissions exams are racist.” No.
In a landmark 2021 legal settlement, officials at the University of California (UC), the nation’s most prestigious state university system, agreed to eliminate the requirement that students submit college entrance exam scores to gain admission. When then-UC president Janet Napolitano initially proposed suspending the testing requirement in 2020, her statement argued that eliminating testing requirements would “enhance equity,” a claim that has become increasingly prevalent among opponents of college entrance exams in particular and standardized testing in general. In education, “inequity” typically refers to gaps in the outcomes experienced by different groups of students, but critics of college admissions exams have often gone even further, painting the SAT and ACT as simply racist. For example, the left-leaning groups that brought the UC lawsuit called the tests “discriminatory,” and their lawsuit cited then-UC regent Cecilia Estolano, who said of the SAT, “We all know it’s a racist test.” From lawsuits to deliberately adopted test-blind and test-optional admissions policies, the claim that the SAT and ACT put underrepresented minority students at a disadvantage is pervasive.
Gaps in average exam scores for students of different groups are, of course, no myth. The average ACT composite score is about 20, but the highest-scoring racial/ethnic group, Asian students, scores 24.9 on average, while Black students, the lowest-scoring group, have an average score of 16.3. For some, the existence of such a gap is alone sufficient to prove the tests are racist. Yet students of different groups do not all have the same family and school experiences. Black children, for example, are three times as likely as their Asian and White classmates to grow up in poverty. Childhood disparities in access to everything from health care to good schools and teachers mean that—whatever those disparities say about the roots and consequences of American inequality—it would be surprising if all groups of seventeen-year-olds exhibited equivalent levels of college readiness. As we will see below, all measures of academic achievement exhibit such gaps.
Far from being instruments of systemic racism, the SAT and ACT undergo extensive vetting to ensure that the exams do not discriminate against students because of their racial or ethnic background (or class or gender). The organizations that administer the exams vet them intensely, not only through diverse review panels that alert the test makers to potential cultural biases but also through statistical techniques that flag test questions answered differently by students from different backgrounds. The latter process, a form of differential item functioning analysis, alerts test makers to any question that two groups of students (e.g., students from high-income and low-income families) answer correctly at different rates after controlling for students’ overall scores. Officials at both testing companies implement multiple, overlapping systems designed to ensure that the exams are free of such biased questions before they can affect any student’s score.
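To make that statistical screening concrete, the sketch below illustrates one widely used approach, the Mantel-Haenszel differential item functioning statistic reviewed in the ETS literature cited in the references. It is a minimal illustration, not the testing companies’ actual procedure or code: the data, group labels, and flagging threshold are assumptions invented for the example.

```python
# Illustrative sketch of a Mantel-Haenszel DIF screen (not the testing
# companies' actual code). For one test question, students are split into a
# "reference" group and a "focal" group and then stratified by total score,
# so that the comparison controls for overall performance.

import math
from collections import defaultdict

def mantel_haenszel_dif(responses):
    """responses: iterable of (total_score, group, correct) tuples, where
    group is "reference" or "focal" and correct is 1 (right) or 0 (wrong)."""
    # Build a 2x2 table (group x right/wrong) for each total-score stratum.
    strata = defaultdict(lambda: [[0, 0], [0, 0]])
    for total_score, group, correct in responses:
        row = 0 if group == "reference" else 1
        col = 0 if correct else 1
        strata[total_score][row][col] += 1

    numerator, denominator = 0.0, 0.0
    for (a, b), (c, d) in strata.values():   # a,b: reference right/wrong; c,d: focal
        n = a + b + c + d
        if n == 0:
            continue
        numerator += a * d / n               # Mantel-Haenszel weighting
        denominator += b * c / n
    if numerator == 0 or denominator == 0:
        return None                          # too little data to estimate
    alpha_mh = numerator / denominator       # common odds ratio across strata
    # ETS reports DIF on the "delta" scale: MH D-DIF = -2.35 * ln(alpha_MH).
    return -2.35 * math.log(alpha_mh)

# Hypothetical usage with made-up responses to a single item.
item_responses = [
    (25, "reference", 1), (25, "focal", 1), (18, "reference", 0),
    (18, "focal", 1), (30, "reference", 1), (30, "focal", 0),
]
d = mantel_haenszel_dif(item_responses)
if d is not None:
    flagged = abs(d) >= 1.5                  # illustrative threshold only
    print(f"MH D-DIF = {d:.2f}; flag for committee review: {flagged}")
```

Real screening procedures, such as those reviewed in the ETS report cited in the references, layer significance tests, minimum sample-size rules, and human review on top of a statistic like this one before any item is removed.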
“College admissions exams limit students’ pathways into quality higher education.” Yes, but only for a few students.
For admissions exams to harm equity in higher education, students would have to lose access to college because of their exam scores. In fact, only a minority of students attend colleges where a high exam score is required for admission in the first place. Regardless of race or social class, most students attend institutions where either no score is required at all or only a very low score is needed to gain admission.
The admissions testing landscape has changed dramatically since the Covid-19 pandemic, and as of this writing, few colleges require testing. Yet even before Covid-19, it was rare for students to need good ACT or SAT scores to access higher education. More than one in three undergraduates in the U.S. attends an “open-access” two-year institution (i.e., a community college), which requires neither exam scores nor a good grade point average (GPA), and many four-year institutions do not require test scores either. Many less-selective four-year institutions are either open-access or very nearly so in practice, and a number of prestigious four-year institutions offer admission pathways that do not hinge on exam scores: high school GPA or class rank (e.g., the University of Texas), transfer after completing community college with an adequate GPA (e.g., the UC system), or long-standing test-optional policies (e.g., the University of Chicago).
This does not mean that admissions exam policies are irrelevant. Of course, some students will seek to attend highly selective colleges where good test scores are essential. Others will need to take the exams to qualify for scholarships or to enroll in selective programs within a less-selective institution, such as honors colleges or rigorous majors such as engineering. Yet even though exam scores are important to some students, students have long had access to most American institutions of higher education without ever taking a college entrance exam.
“Other parts of the college application promote equity better than entrance exams.” Not necessarily.
Most prospective students who attend colleges that are at least somewhat selective submit application packets that include such materials as the following:
- College admissions exam scores
- Personal essays
- Letters of recommendation
- Information about extracurricular activities
- Credit-by-examination scores (e.g., Advanced Placement [AP])
- High school transcripts, which themselves include the following:
  - Name and location of the high school attended
  - Courses taken
  - Grades
Critics of college entrance exams implicitly suggest that the other application components do not exhibit the same gaps. If those components do exhibit the same gaps, then there is nothing special about the exam scores, and it is not clear why excluding them would make admissions more equitable.
But are such gaps unique to entrance exams? No. In fact, what research exists on other elements of the application packet strongly suggests that they are responsible for “perpetuating inequities” in much the same way.
Consider a 2021 study that used software to analyze hundreds of thousands of student admissions essays from applications to the University of California system. The study, which was the first to use quantitative methods to analyze college admissions essays, found that the form and content of students’ personal essays were even more strongly correlated with student socioeconomic status than SAT scores were. (The researchers did not examine differences in essays by racial/ethnic group.)
Letters of recommendation exhibit gaps as well, although their magnitude appears smaller. A 2018 study identified disparities by race and gender for undergraduate letters of recommendation, although that study characterized the gaps as “small but statistically significant.” A 2020 study of recommendation letters for a graduate program also found differences depending on student background factors, including that the letters disproportionately described Black and Latino applicants as having “less agency” than other students. Such differences have led commentators to condemn the practice of requiring letters of recommendation in audacious terms, such as when a 2021 op-ed by a professor at the University of North Carolina deemed the letters “tools of oppression.”
Gaps in participation in extracurricular activities are also apparent. A 2018 study found that extracurricular participation in high school is strongly linked to race and even more strongly correlated with family income; the participation gap between more- and less-affluent students is about fifteen percentage points (40 percent versus 55 percent). More troublingly, the recent “Varsity Blues” scandal provided a particularly corrupt example of how selective schools can use policies favoring students who participate in extracurricular activities to admit applicants from wealthy and well-connected families; given their lack of transparency, “holistic” admissions policies open the door to many such forms of corruption.
The other data evaluated by admissions officers, from AP test scores to high school grades, all exhibit such gaps. Regarding AP, schools serving disproportionately low-income, rural, Black, and Hispanic students are less likely to offer advanced high school coursework such as AP and dual-enrollment courses. And even when students have access to these courses, the AP test scores themselves exhibit gaps. Although GPA is often touted as a better metric than test scores, it, too, is correlated with students’ race and socioeconomic status, as we will see below.
The magnitudes of the gaps vary across measures, and in some cases, as with admissions essays, they are probably even larger than those for test scores, although differing methodologies make the disparities difficult to compare across studies. A 2019 report by the Georgetown University Center on Education and the Workforce found that students selected solely on the basis of the SAT would, on average, hail from wealthier families and be more likely to be White or Asian than under the status quo. That might seem like damning evidence against college admissions exams, but because the status quo includes affirmative action policies, it remains unclear whether the SAT (or ACT) gaps are actually greater than those for other components of the application.
The fact that we observe similar kinds of gaps between student groups on every observable educational outcome suggests that these data simply reflect society’s broader inequities. Thus, patterns in SAT and ACT scores offer a window into a society that has not done enough to make up for its sordid racial history or continuing inequities, but blaming the exams for having gaps is a classic case of “shooting the messenger.” College entrance exams are no more “tools of oppression” than letters of recommendation or anything else in the application packet. Indeed, any meritocratic admissions process that uses academic data is likely to exhibit similar gaps.
“Admissions officers should just focus on grades.” Wrong.
Grades and GPA are often identified by equity advocates as a good alternative to test scores because they tend to be less correlated with socioeconomic status and race/ethnicity. Yet analyses reliably show that grades, like other components of the college application packet, are still correlated with student background factors, including race and family income. At most, replacing test scores with grades might mitigate the racial and socioeconomic disparities in higher education admissions; it would not eliminate them.
A 2007 study by researchers from UC Santa Barbara shows how grading standards vary across schools and finds that, within a given high school, grades correlate with socioeconomic status to a similar extent as test scores do. That variation in grading standards underscores another advantage of college admissions exams over grades: they’re a more objective yardstick. Grades and high school course taking (as well as other elements of the application packet) are valuable data points for admissions officers to consider, but many are difficult to compare across schools. Exam scores can even reveal school quality, which is why admissions officials—even those from “test-blind” or “test-optional” institutions—often use such data to help them rate the high schools of prospective students and to contextualize GPA.
This is why studies predicting postsecondary success have consistently shown that admissions exam scores provide additional information over and above what GPA can show. Yet grade inflation is squeezing high school GPAs into an ever-narrower range, suggesting they will eventually be less valuable to admissions officers as markers of achievement and, thus, predictors of college readiness. Although an earlier analysis showed that GPA was still a useful predictor of readiness fifteen years ago, studies since then have found continued high school grade inflation (amid other evidence of softening high school standards). Still, even if GPA continues to be a better predictor of college success than exam scores, the best predictions are typically derived from analyses that include a variety of relevant measures. This means that institutions should require and examine a range of relevant and predictive data for admissions. The key criterion for inclusion of a measure ought to be its value for predicting program success rather than its correlation with student background factors.
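As a concrete illustration of that incremental value, the sketch below uses simulated data (not real student records) to compare how much of the variation in first-year college GPA is explained by high school GPA alone versus by high school GPA plus an admissions exam score. Every quantity in it, from the GPA distribution to the weight placed on the exam, is an assumption chosen only to show the shape of such an analysis.

```python
# Illustrative sketch (simulated data, not real student records): compare how
# well first-year college GPA is predicted by high school GPA alone versus by
# high school GPA plus an admissions exam score. All quantities below are
# assumptions chosen for the illustration.

import numpy as np

rng = np.random.default_rng(0)
n = 5_000
hs_gpa = np.clip(rng.normal(3.3, 0.4, n), 0.0, 4.0)     # compressed, inflated GPAs
exam = np.clip(rng.normal(1050, 200, n), 400, 1600)     # SAT-like score scale
# Simulated "true" college outcome depends on both measures plus noise.
college_gpa = 0.5 * hs_gpa + 0.002 * exam + rng.normal(0, 0.4, n)

def r_squared(predictors, outcome):
    """Fit ordinary least squares and return R^2, the share of variance explained."""
    X = np.column_stack([np.ones(len(outcome)), predictors])  # add an intercept
    beta, *_ = np.linalg.lstsq(X, outcome, rcond=None)
    residuals = outcome - X @ beta
    return 1.0 - residuals.var() / outcome.var()

r2_gpa_only = r_squared(hs_gpa, college_gpa)
r2_gpa_plus_exam = r_squared(np.column_stack([hs_gpa, exam]), college_gpa)
print(f"R^2, high school GPA only:  {r2_gpa_only:.3f}")
print(f"R^2, GPA plus exam score:   {r2_gpa_plus_exam:.3f}")
# The difference between the two R^2 values is the exam's incremental contribution.
```

Published incremental-validity studies, such as the ACT research cited in the references, run the same kind of comparison on actual student records and richer outcomes such as degree completion.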
“The future of college admissions is ‘test optional.’” Unclear.
For some colleges, SAT and ACT scores may be expendable, and going test optional or test blind will not harm their selection processes. Yet it is clear that some institutions find the information these exams provide to be valuable. One example is MIT, which reinstated its pre-pandemic testing requirements in 2022. Its dean of admissions explained that “considering performance on the SAT/ACT, particularly the math section, substantially improves the predictive validity of our decisions with respect to subsequent student success at the Institute.” Moreover, he argued that considering admissions exam scores was good for equity as well, because “not having SATs/ACT scores to consider tends to raise socioeconomic barriers to demonstrating readiness for our education.” Other faculty and administrator groups, including the UC Faculty Senate task force (whose recommendation was ultimately ignored) and a group of sixty law school deans in a 2022 open letter, have challenged the trend toward ignoring admissions tests. Because higher education stakeholders face different challenges, we shouldn’t be surprised if admissions exams are not used uniformly across American colleges.
In fact, admissions officers have myriad data available to assess student readiness, even as GPA becomes a less reliable measure and admissions exam scores increasingly become optional. With all the other data available, standardized admissions exams may provide only marginal additional information for individual students. Such tests provide an objective measure of academic preparation, but they are one tool among many, and their results should not be treated as the one indispensable indicator of college readiness. Still, because many institutions are likely to use these scores for the foreseeable future, and because excellent students at mediocre schools might otherwise be overlooked, there is an obvious implication for state policymakers: states should require and pay for all high school students to take the SAT or ACT in order to help identify all students with high academic potential. Over the years, such state policies have proven to be a boon to equity.
Considering that the tests are closely vetted to remove biased content, irrelevant to many admissions decisions, and correlated with student background in much the same way as other elements of the college application packet, it shouldn’t be a surprise that going “test optional” is not the radical step toward equity that many assume. A 2015 study found that liberal arts colleges that implemented test-optional admissions policies did not become substantially more diverse. Instead, the main effect of the policies was to increase the “perceived selectivity” of such institutions. Switching to test-optional policies, rather than being a win for social justice, may thus be more a kind of virtue signaling by these elite institutions.
More institutions may choose to engage in this kind of virtue signaling, and test-optional policies may endure or spread. Or perhaps high school grade inflation will lessen the value of high school GPA, and more institutions will require entrance exam scores to validate student preparation. Or maybe both trends will occur for different types of institutions. Regardless, institutions should not decide whether to rely on college admissions exams based on concerns that they are racist or otherwise bad for equity. Whether the exams are required or not, the implications for equity are minimal.
References
“Admissions: Frequently Asked Questions.” University of Chicago. Accessed July 25, 2022. https://collegeadmissions.uchicago.edu/contact/frequently-asked-questions.
Akos, Patrick, and Jennifer Kretchmar. “Gender and Ethnic Bias in Letters of Recommendation: Considerations for School Counselors.” Professional School Counseling 20, no. 1 (2016): 1096–2409.
Alvero, AJ, Sonia Giebel, Ben Gebre-Medhin, Anthony Lising Antonio, Mitchell L Stevens, and Benjamin W Domingue. “Essay Content and Style Are Strongly Related to Household Income and SAT Scores: Evidence from 60,000 Undergraduate Applications.” Science Advances 7, no. 42 (2021): eabi9031.
Artiga, Samantha, Latoya Hill, and Kendal Orgera. “Health Coverage by Race and Ethnicity, 2010-2019.” Kaiser Family Foundation, n.d. https://www.kff.org/racial-equity-and-health-policy/issue-brief/health-….
Belasco, Andrew S, Kelly O Rosinger, and James C Hearn. “The Test-Optional Movement at America’s Selective Liberal Arts Colleges: A Boon for Equity or Something Else?” Educational Evaluation and Policy Analysis 37, no. 2 (2015): 206–23.
California State University. “CSU Trustees Vote to Amend Title 5 to Remove SAT and ACT Tests from Undergraduate Admissions,” March 22, 2022. https://www.calstate.edu/csu-system/news/Pages/trustees-vote-remove-SAT….
Carnevale, Anthony P, Jeff Strohl, Martin Van Der Werf, Michael C Quinn, and Kathryn Peltier Campbell. “SAT-Only Admission: How Would It Change College Campuses?” Georgetown University Center on Education and the Workforce, 2019.
Chemerinsky, Erwin, and Daniel Tokaji. “Ending Standardized Law School Tests Could Diminish Diversity.” Bloomberg Law, November 22, 2022. https://news.bloomberglaw.com/us-law-week/ending-standardized-law-schoo….
“Community College FAQs: Community College Enrollment and Completion.” Community College Research Center, Teachers College, Columbia University. Accessed July 26, 2022. https://ccrc.tc.columbia.edu/community-college-faqs.html.
Croft, Michelle, and Jonathan J. Beard. “Development and Evolution of the SAT and ACT.” In The History of Educational Measurement: Key Advancements in Theory, Policy, and Practice, edited by Brian E. Clauser and Michael B. Bunch. New York, NY: Routledge/Taylor & Francis Group, 2022.
Duncombe, Chris. “Unequal Opportunities: Fewer Resources, Worse Outcomes for Students in Schools with Concentrated Poverty.” Commonwealth Institute, October 2017. https://thecommonwealthinstitute.org/research/unequal-opportunities-few….
Dynarski, Mark. “Is the High School Graduation Rate Really Going Up?” Brookings Institution, May 3, 2018. https://www.brookings.edu/research/is-the-high-school-graduation-rate-r….
FairTest. “Record 1,835+ Schools Are Test Optional or Test Free for Admissions,” November 14, 2022. https://fairtest.org/record-1835-schools-are-test-optional-or-test-free….
Gershenson, Seth. “Grade Inflation in High Schools (2005-2016).” Thomas B. Fordham Institute, 2018.
Grimm, Lars J, Rebecca A Redmond, James C Campbell, and Ashleigh S Rosette. “Gender and Racial Bias in Radiology Residency Letters of Recommendation.” Journal of the American College of Radiology 17, no. 1 (2020): 64–71.
Hurwitz, Michael, and Jason Lee. “Grade Inflation and the Role of Standardized Testing.” In Measuring Success: Testing, Grades, and the Future of College Admissions, edited by J. Buckley, L. Letukas, and B. Wildavsky. Johns Hopkins University Press, 2018. https://books.google.com/books?id=LudFDwAAQBAJ.
Hurwitz, Michael, Jonathan Smith, Sunny Niu, and Jessica Howell. “The Maine Question: How Is 4-Year College Enrollment Affected by Mandatory College Entrance Exams?” Educational Evaluation and Policy Analysis 37, no. 1 (2015): 138–59.
Hyman, Joshua. “ACT for All: The Effect of Mandatory College Entrance Exams on Postsecondary Attainment and Choice.” Education Finance and Policy 12, no. 3 (2017): 281–311.
Kawika Smith, et al. v. The Regents of the University of California and Janet Napolitano, Respondents’ Opposition to Petition for Writ of Supersedeas or Other Stay Order; Memorandum of Points & Authorities (Court of Appeal of the State of California First Appellate District October 7, 2020).
Kendi, Ibram X. “Testimony in Support of the Working Group Recommendation to #SuspendTheTest.” Boston Coalition for Education Equity, October 21, 2020. https://www.bosedequity.org/blog/read-ibram-x-kendis-testimony-in-suppo….
Klasik, Daniel. “The ACT of Enrollment: The College Enrollment Effects of State-Required College Entrance Exam Testing.” Educational Researcher 42, no. 3 (2013): 151–60.
Kolluri, Suneal. “Advanced Placement: The Dual Challenge of Equal Access and Effectiveness.” Review of Educational Research 88, no. 5 (2018): 671–711.
Meier, Ann, Benjamin Swartz Hartmann, and Ryan Larson. “A Quarter Century of Participation in School-Based Extracurricular Activities: Inequalities by Race, Class, Gender and Age?” Journal of Youth and Adolescence 47, no. 6 (2018): 1299–316.
Moezzi, Melody. “This College Admissions Season, Let’s End the Odious, Racist Practice of Recommendations.” NBC News, March 20, 2021. https://www.nbcnews.com/think/opinion/college-admissions-season-let-s-e….
Napolitano, Janet. “College Entrance Exam Use in University of California Undergraduate Admissions,” May 21, 2020. https://regents.universityofcalifornia.edu/regmeet/may20/b4.pdf.
National Center for Education Statistics. “Indicator 4 Snapshot: Children Living in Poverty for Racial/Ethnic Subgroups,” February 2019. https://nces.ed.gov/programs/raceindicators/indicator_rads.asp.
Pattison, Evangeleen, Eric Grodsky, and Chandra Muller. “Is the Sky Falling? Grade Inflation and the Signaling Power of Grades.” Educational Researcher 42, no. 5 (2013): 259–65.
Price, Heather E. “Large-Scale Datasets and Social Justice: Measuring Inequality in Opportunities to Learn.” In Research Methods for Social Justice and Equity in Education, 203–15. Springer, 2019.
Radunzel, Justine, and Julie Noble. “Predicting Long-Term College Success through Degree Completion Using ACT [R] Composite Score, ACT Benchmarks, and High School Grade Point Average. ACT Research Report Series, 2012 (5).” ACT, Inc., 2012.
“Report of the UC Academic Council Standardized Testing Task Force.” Systemwide Academic Senate University of California, January 2020. https://senate.universityofcalifornia.edu/_files/committees/sttf/sttf-r….
Richmond, Emily. “‘Operation Varsity Blues’: The Real Story Isn’t the Admissions Scandal.” Education Writers Association, March 19, 2019. https://ewa.libsyn.com/operation-varsity-blues-the-real-story-isnt-the-….
Rothstein, Jesse. “SAT Scores, High Schools, and Collegiate Performance Predictions.” In Annual Meeting of the National Council on Measurement in Education, Montreal, Canada, 2005.
Sanchez, Edgar, and Krista Mattern. “When High School Grade Point Average and Test Scores Disagree.” In Measuring Success: Testing, Grades, and the Future of College Admissions, edited by J. Buckley, L. Letukas, and B. Wildavsky. Johns Hopkins University Press, 2018. https://books.google.com/books?id=LudFDwAAQBAJ.
Sanchez, Edgar, and Raeal Moore. “Grade Inflation Continues to Grow in the Past Decade.” The ACT, May 2022. https://www.act.org/content/act/en/research/pdfs/R2134-Grade-Inflation-….
Sass, Tim R, Jane Hannaway, Zeyu Xu, David N Figlio, and Li Feng. “Value Added of Teachers in High-Poverty Schools and Lower Poverty Schools.” Journal of Urban Economics 72, no. 2–3 (2012): 104–22.
Schmill, Stu. “We Are Reinstating Our SAT/ACT Requirement for Future Admissions Cycles.” Massachusetts Institute of Technology, March 28, 2022. https://mitadmissions.org/blogs/entry/we-are-reinstating-our-sat-act-re….
Schmill, Stu. “We Are Suspending Our SAT/ACT Requirement for the 2020–2021 Application Cycle.” Massachusetts Institute of Technology, July 13, 2020. https://mitadmissions.org/blogs/entry/we-are-suspending-our-sat-act-req….
The ACT. “Profile Report - National, Graduating Class 2021,” 2021. https://www.act.org/content/dam/act/unsecured/documents/2021/2021-Natio….
“Top 10 Percent Law.” UT News, August 1, 2022. https://news.utexas.edu/topics-in-the-news/top-10-percent-law/.
“Transfer Requirements: Basic Requirements.” University of California. Accessed July 28, 2022. https://admission.universityofcalifornia.edu/admission-requirements/tra….
Tyner, Adam, and Seth Gershenson. “Conceptualizing Grade Inflation.” Economics of Education Review 78 (2020): 102037.
Tyner, Adam, and Matthew Larsen. “End-of-Course Exams and Student Outcomes.” Thomas B. Fordham Institute, 2019.
Tyner, Adam, and Nicholas Munyan-Penney. “Gotta Give’Em Credit: State and District Variation in Credit Recovery Participation Rates.” Thomas B. Fordham Institute, 2018.
“Young Adult Educational and Employment Outcomes by Family Socioeconomic Status.” National Center for Education Statistics, May 2019. https://nces.ed.gov/programs/coe/indicator/tbe#5.
Zwick, Rebecca. “A Review of ETS Differential Item Functioning Assessment Procedures: Flagging Rules, Minimum Sample Size Requirements, and Criterion Refinement.” ETS Research Report Series 2012, no. 1 (2012): 1–30.
Zwick, Rebecca, and Jennifer Greif Green. “New Perspectives on the Correlation of Scholastic Assessment Test Scores, High School Grades, and Socioeconomic Factors.” Journal of Educational Measurement 44, no. 1 (Spring 2007): 1–23.
About the Report
This report was made possible through the generous support of the Thomas B. Fordham Foundation. We are grateful to Kevin Carey, vice president of education and knowledge management at New America, for his helpful feedback on a draft of the report. We also extend our gratitude to Pamela Tatz for copyediting. At Fordham, we would like to thank Adam Tyner for authoring the report; Amber Northern, David Griffith, Chester E. Finn, Jr., and Michael J. Petrilli for reviewing drafts; Victoria McDougald for her role in dissemination; Jeanette Luna and Stephanie Distler for developing the report’s cover art and coordinating report production; and Christian Eggers for research assistance.