The number of teacher aides in America continues to grow. They made up only 1.7 percent of U.S. school-based staff in 1970, but forty years later, in 2010, that proportion had reached nearly 12 percent. Yet we know precious little about their bearing on student performance. A new CALDER study attempts to fill that void, examining the impact of teaching assistants (TAs) on student outcomes in North Carolina. (The study also includes health care providers, but those data are less reliable, so this review does not cover them.)
Teaching assistants' duties vary by state, district, school, and even classroom. They include preparing classroom activities, handling clerical tasks, working with students in small groups, helping to assess student work, and managing student behavior.
North Carolina uses formulas to allocate positions, not dollars, to local districts, meaning that districts receive a certain number of slots for teachers, principals, and support personnel based on student enrollment. There is no incentive to hire a novice rather than a veteran teacher, for instance, because the state pays either according to its salary schedule. For teaching assistants, however, the state provides only a fixed dollar amount per student to each district, which limits the number of TA positions a district can fund with state appropriations. CALDER analysts include state-funded, but not district-funded, positions when analyzing school-level changes, ruling out any potential effects of district dollars; they are also able to control for a number of school, student, and teacher variables, which strengthens the credibility of the findings. They use school-level data from 2001 to 2012 for nearly 1,100 elementary schools.
On the descriptive front, analysts find that schools employ, on average, 5.2 teachers and 2.9 TAs per 100 students. As for impact, results indicate that one additional teaching assistant per 100 students increased reading scores by 0.009 standard deviations. That's small potatoes, but there are no other rigorous TA studies to compare it to. There is essentially no overall impact on math. When broken down by subgroup, however, the findings show larger effects for minority students than for white students in both reading and math. TAs also make small but statistically significant differences in curbing student absences and tardiness, though they have no impact on suspension rates. (A 10 percent increase in teachers, by contrast, reduces the student absentee rate by about 0.15 days per year, which represents a 3 percent decline relative to the average absentee rate; principals make a difference of similar magnitude.)
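As a quick back-of-the-envelope check (a sketch of my own, not a calculation from the study), the two ways the teacher effect is reported above imply a baseline absentee rate:

```python
# Sanity check on the absentee figures reported above.
# A 10 percent increase in teachers cuts absences by ~0.15 days per year,
# which the study describes as a 3 percent decline in the average rate.

reduction_days = 0.15    # absolute drop, days absent per student per year
reduction_share = 0.03   # the same drop as a share of the average rate

# The implied average absentee rate is the absolute drop divided by the share.
implied_average = reduction_days / reduction_share
print(f"Implied average absences: {implied_average:.1f} days per student per year")
# → Implied average absences: 5.0 days per student per year
```

In other words, the two figures are consistent with students missing about five days of school per year on average.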
Given the paucity of rigorous research on the topic, what we have here is obviously not the last word. But it does indicate that TAs can modestly improve academic performance in the early grades, particularly for minority students. What we don't know is whether another, cheaper intervention might be just as effective, or more so.
SOURCE: Charles Clotfelter, Steven Hemelt, and Helen Ladd, "Teaching Assistants and Nonteaching Staff: Do They Improve Student Outcomes?," CALDER (October 2016).