When it comes to using data to enact meaningful changes at the school level, few of us know where to begin. That’s why the University of Chicago Consortium on School Research (UChicago Consortium) just published a new report based on its own experiences translating research to the classroom. The paper, titled “Practice-Driven Data” and written by Eliza Moeller, Alex Seeskin, and Jenny Nagaoka, draws best practices from UChicago’s partnership with Chicago Public Schools (CPS) and the Network for College Success (NCS) to improve educational attainment for Chicago high schoolers. It is particularly timely given the Bill & Melinda Gates Foundation’s current effort to fund similar “school improvement networks” nationwide.
The paper lays out five lessons the partnership has learned since its formation in 2006. The examples the authors use and the recommendations they make are meant to be scalable, but they note that every district will face unique circumstances that may shape how it collects and analyzes its data.
1. Prepare: Build the capacity to facilitate hard conversations.
Schools can’t use data to guide practice until they have a framework for analyzing new information and implementing new strategies. The authors recommend developing two key roles: a data strategist and a charismatic team leader who “can bring data to life” for educators. Both individuals can help facilitate discussion among teachers and field questions about needed changes within a school. The UChicago Consortium also builds issue-specific teams within CPS schools, each focused on the data in one area, such as freshman success or instructional leadership.
2. Focus: Prioritize research-based indicators.
Once a data-collection infrastructure exists, the almost-limitless supply of new information is more likely to overwhelm educators than to help them. Selecting which data points are most valuable, and knowing how to display them effectively, helps teachers understand how to adapt their day-to-day practice.
Which data points should we prioritize? The authors suggest the most critical ones are predictive, clear and usable, regularly available, causally linked to outcomes, and actionable at the school or classroom level. For example, in the high schools the UChicago Consortium works with, daily attendance is a strong predictor of on-time graduation and is more actionable for teachers in real time than, say, eighth-grade test scores. That makes attendance the better data point to highlight.
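To make those criteria concrete, here is a minimal sketch, in Python with pandas, of the kind of check a school’s data strategist might run. The records and column names are hypothetical and are not drawn from the Consortium’s or CPS’s actual data or tools; the question is simply whether an attendance indicator separates on-time graduates from their peers.

```python
import pandas as pd

# Illustrative student records; the columns and values are hypothetical,
# not taken from the Consortium's or CPS's actual data systems.
students = pd.DataFrame({
    "attendance_rate":   [0.98, 0.95, 0.91, 0.88, 0.82, 0.75, 0.68, 0.60],
    "graduated_on_time": [1,    1,    1,    1,    1,    0,    0,    0],
})

# Band attendance into ranges a teacher can act on during the school year.
bands = pd.cut(
    students["attendance_rate"],
    bins=[0.0, 0.80, 0.90, 1.0],
    labels=["below 80%", "80-90%", "90% and up"],
)

# On-time graduation rate within each attendance band: a rough look at how
# well the indicator sorts students into different outcome groups.
print(students.groupby(bands, observed=False)["graduated_on_time"].mean())
```

A real analysis would use full cohorts of longitudinal records, but the shape of the question is the same: does this indicator, on its own, point to meaningfully different outcomes that a teacher can still influence?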
3. Make meaning: Develop shared ownership over the implications of research.
The authors here are concerned about a loss of context when new research-based initiatives reach classrooms too rapidly. They explain, “When indicators reach educators without the research as context, educators have a tendency to approach indicators from a perspective of complying with mandates, rather than useful data to solve an important problem.” Most of the meaning-making battle involves communication with teachers, including in-person meetings and visual data tools that highlight the “why” behind new requirements. Sometimes replicating research findings in one’s own school can reinforce the stakes of a new project.
4. Strategize: Use the right data at the right time.
Specialized teams can focus on different types and granularities of data, but the UChicago Consortium also encourages its schools to highlight certain data points at particular times during the school year. For example, they suggest reviewing student demographics and prior achievement in early September, so teachers learn the general background of their new students; school-level common assessment data in December; and seniors’ college application data in January and again in March. Short- and long-term data analyses create different and overlapping cycles of reflection and improvement within schools.
5. Disrupt: Identify and stop inequity.
The increased availability of data allows educators to identify achievement and behavior trends across demographic subgroups within a single school, much as it does at the national level. But the authors warn against disaggregation without direct and immediate examination of the practices that might produce such differing outcomes; otherwise, differences in graduation rates across races, for example, may only demoralize the communities that see themselves at the bottom of the graph. One practice the authors recommend is disaggregating data both by race and by prior achievement, so that teachers can see how their own practices directly affect students from different backgrounds (a sketch of what that looks like follows below).
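As a rough illustration of that double disaggregation, here is a short Python sketch using pandas; the records, column names, and categories are made up for the example, and the report itself does not prescribe any particular tooling.

```python
import pandas as pd

# Made-up student records with hypothetical columns, for illustration only.
students = pd.DataFrame({
    "race":              ["Black", "Black", "Latino", "Latino", "White", "White"],
    "prior_achievement": ["lower", "higher", "lower", "higher", "lower", "higher"],
    "on_track":          [0, 1, 1, 1, 0, 1],
})

# Disaggregate the on-track rate by race AND by prior achievement, so the
# comparison is among students who entered high school in similar places.
table = students.pivot_table(
    index="race",
    columns="prior_achievement",
    values="on_track",
    aggfunc="mean",
)
print(table)
```

Pairing the two dimensions keeps the conversation focused on what the school is doing for comparable students, rather than on a single demoralizing bar chart.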
***
Because the UChicago Consortium and NCS write this guide from their own experience translating research into practice at the school and district levels, it reads much more practically than many best-practices resources available to educators. Their frequent use of examples reminds the reader that every school district will be a little different, and that data really can shape our school reform efforts.
SOURCE: Moeller, Eliza, Alex Seeskin, and Jenny Nagaoka. “Practice-Driven Data: Lessons from Chicago’s Approach to Research, Data, and Practice in Education.” University of Chicago Consortium on School Research, October 2018.