When Governor DeWine announced that Ohio schools would remain closed for the rest of the 2019–20 school year, many students and parents immediately began to wonder what school will look like in the fall. Given the unpredictability of COVID-19, it’s impossible to know. But one thing’s for sure: When schools do reopen, teachers and students are going to have a lot of work ahead of them. Here are four areas where lawmakers, school administrators, and teachers will need to adjust in order to address the fallout of coronavirus closures.
Data on student growth and achievement
It’s no secret that the school closures brought about by the pandemic are likely to have a serious impact on student learning. Three recent analyses have predicted poor outcomes for reading and math achievement and growth, with particularly devastating impacts for low-income students and students with disabilities.
Just how large Ohio’s learning losses will be is not yet known. In late March, Governor DeWine signed legislation waiving state testing requirements for the 2019–20 school year. This was the right call, but it also means we have no idea how much kids learned this past year. When schools reopen, one of the first orders of business should be determining where students are academically. That could mean administering state tests when students return in August or September. But given that state tests have historically been used to gauge achievement and growth on a broad scale, and are not typically used to identify the immediate needs of individual students, it might be better to encourage districts to use interim and diagnostic assessments. Those exams often have a faster turnaround time, so teachers won’t have to wait for state testing results to begin addressing their students’ academic needs.
Testing is only one piece of the puzzle, though. As a recent piece by Laura Slover in The 74 points out, data aren’t useful without “analysis and subsequent adjustments.” Once schools have identified learning gaps and losses, the next step will be to use those data to plan for and adjust to student needs. “Data-driven instruction” can no longer be just a buzzword. Next year, schools need to make it a guiding principle.
School funding
Another obvious impact of the pandemic is the financial toll. As Fordham fellow Dale Chu recently wrote, schools across the nation are facing a “triple threat” of financial fallout. First, students who have fallen behind academically thanks to the closures will need extra help, which can cost extra money. Second, declining state revenues will result in historic budget cuts. And third, districts are likely to face rising costs in other areas, such as pensions.
Ohio is no exception. School finance in the Buckeye State is about to look very different than it did at the beginning of the year. The Cupp-Patterson plan to revise the state funding formula is probably off the table for at least the next few years. Funding that could be considered supplemental—such as student wellness funds, performance incentives, and reimbursement programs—could disappear under future budget proposals. Ohio elementary and secondary schools are set to receive hundreds of millions in federal funding, and future relief bills will hopefully provide more. But even with this infusion of cash, schools of all types are in for a rough few years.
Over the next few months, state lawmakers and district leaders would be wise to consider the recommendations of my colleague Aaron Churchill. They include targeting funds to help the state’s most disadvantaged students, restarting the school funding formula, and providing additional flexibility around how schools can spend funds.
Instructional time
School leaders and advocates will also need to rethink how long and how often kids attend school. Prior to pandemic-related closures, research indicated that increased instructional time could lessen or eliminate achievement gaps. Many low-performing schools—district and charter—opted for extended days and extended years as a means of providing extra instruction and remediation for students who were academically behind their peers.
Now, in the wake of widespread coronavirus closures, extended time may no longer be an intervention that’s reserved only for chronically underperforming schools. All schools will need to consider extending school days and years to make up for the learning losses brought about by COVID-19, even if those extensions only last for a year or two. Summer school may or may not be off the table for 2020—it all depends on a virus that’s shown itself to be very unpredictable—but districts should consider planning intensive summer school programs for 2021. Beefed-up afterschool or weekend programs are worthy of consideration. It may also be time to finally rethink grade-level progressions, and allow students to advance based on what they know rather than how long they’ve been sitting at a desk.
Obviously, there are pros, cons, and complications to each of these options. Not one of them is a silver bullet. But school leaders and advocates should be ready to have serious conversations about instructional time over the next few months to mitigate coronavirus learning loss.
Innovation and improvement
It seems odd to discuss innovation and school improvement when educators and administrators are still trying to adjust to unforeseen circumstances. But all this change is exposing cracks in the education system, and those cracks are presenting opportunities for growth. As schools reopen and educators start evaluating the best ways to get students back on track, it’s important to take advantage of the lessons we’ve learned during these unprecedented times.
For starters, it’s critical for Ohio to start gathering information from students, parents, and teachers about how distance learning implementation worked on the ground. This feedback will be instrumental in planning for future remote learning efforts, should schools need to close again in the fall due to the virus. Identifying best practices and working to refine them could also help address issues that schools have when they’re operating normally, such as how to prevent learning loss during snow days or student suspensions. Online platforms that schools found particularly helpful during the shutdowns could be leveraged into blended learning models or to provide remediation and enrichment for individual students who need it. And the gaps this crisis has revealed in access to Wi-Fi and internet-enabled devices shouldn’t cease to matter once schools are back in session. In our increasingly digital world, all students need internet access—which means expansion efforts must continue.
There’s no doubt that the next few years are going to be hard. Funding shortfalls and learning losses are steep mountains to climb, and it’s going to take a lot of work on the part of teachers, parents, and communities to help kids thrive in the midst of adversity. But if lawmakers and leaders focus on addressing the four issues outlined above, climbing those mountains will be more than possible.
As part of the gargantuan aid package recently passed by Congress, Ohio will soon receive $105 million through the Governor’s Emergency Education Relief Fund. The purpose of this pot of money is “to address student needs arising from the COVID-19 related disruption of the current academic year.”
The governor’s fund is unusual, as federal money typically flows through state education agencies and is then allocated to schools via statutory formulas (that is how the larger relief fund for K–12 education is expected to work). Instead, the governor’s office will have broad discretion over how these dollars are spent, meaning the DeWine administration will need to decide how to parcel out a sizeable chunk of change.
The administration must first be mindful of a few federal rules for this program. First, the dollars must be allocated to K–12 public schools, colleges and universities, or “other educational entities” such as preschools, afterschool providers, or technical training centers. Second, governors must award grants swiftly—within one year of receiving funds from the U.S. Department of Education (the disbursement should occur within the next two months). Third, federal guidance suggests, though doesn’t appear to require, that funding be targeted to areas that have been “most significantly impacted by coronavirus.”
How, then, should the governor spend these funds? While there are a number of worthy causes, two critical needs stand out: IT infrastructure and career supports for transitioning high school seniors. Given the broader need for technology, IT infrastructure would likely warrant the larger portion of the fund.
Ensure more students have computers and Wi-Fi access
A number of news stories indicate that schools—especially in big cities and rural areas—have struggled to transition to remote learning due to uneven internet access and computer shortages. This has not only hampered learning this spring but also jeopardizes summer school and even instruction this coming fall, should schools remain physically closed or need to adopt some type of “blended” learning model. One sensible use of these funds, encouraged by Secretary DeVos, is to support efforts that enable students to engage in digital learning.
To best leverage these somewhat limited funds, the governor should focus on two goals: 1) distributing dollars to the places with the greatest need, and 2) getting funds to schools as fast as possible. Likely the most efficient way to accomplish this is to award formula grants based on poverty rates (low-income students are less likely to own computers) and perhaps data on broadband access. While it may not be a perfect system, this would steer more aid to needier schools. And a formula grant would likely be a fairer method than a first-come, first-served grant—less needy districts may end up at the front of the line—and it would also move dollars much faster than an application-based program, which could take months to administer. If schools had to apply for funds, the most urgent need for remote learning—this summer and fall—may have come and gone by the time schools actually receive aid.
Governor DeWine could go a step further to truly maximize the impact of an IT-based grant. As my Gadfly colleague Chad Aldis observes, there is a troubling lack of hard data about at-home education during the pandemic. We don’t have basic information about how many students log on to access assignments or participate in any type of online learning. Nor do we have a clear idea of how many Ohio students have no home computer or Wi-Fi access. As a condition of receiving these funds, the governor could require districts and schools to report data that would cast much needed light on students’ remote learning experiences and better guide any further investment in IT.
It’s true that the modest size of the governor’s fund is unlikely to meet the technological needs of all Ohio students. But these dollars—supplemented by the larger federal support arriving soon in addition to existing school dollars—could help most students get online, a basic first step in successful distance learning. In Cleveland, for example, a $1 million grant could get Chromebooks into the hands of about 5,000 students. A $75,000 award for a small, rural district could help roughly 625 students have Internet access for three months. Should a district already have the necessary devices and online supports, it could spend these dollars on other priorities or it could take a pass on the funds (perhaps if reporting requirements were included).
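A quick back-of-the-envelope sketch shows the per-unit costs implied by these examples. The unit costs are inferred from the figures above, not drawn from any published price list:

```python
# Rough per-unit costs implied by the grant examples above.
# These are illustrative inferences, not actual vendor pricing.

chromebook_grant = 1_000_000        # illustrative Cleveland grant
chromebook_students = 5_000
cost_per_device = chromebook_grant / chromebook_students
print(f"~${cost_per_device:,.0f} per Chromebook")  # ~$200 per Chromebook

wifi_grant = 75_000                 # illustrative rural-district award
wifi_students = 625
wifi_months = 3
cost_per_student_month = wifi_grant / (wifi_students * wifi_months)
print(f"~${cost_per_student_month:,.0f} per student per month of access")  # ~$40
```

At roughly $200 per device and $40 per student-month of connectivity, even modest awards can reach meaningful numbers of students.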
Support career transitions for high school seniors
In the coming months, about 100,000 Ohio seniors will graduate high school. Roughly two in three will be heading to college (whether it’s on-campus is another story), meaning that another third will be pursuing employment. Through no fault of their own, thousands of newly minted high school graduates will face scarce job opportunities due to the economic damage wrought by the pandemic.
To address this challenge, the governor’s fund could support short-term, continuing-education programs that specifically target non-college-going students. It might take some moxie to design a program on the fly, but one possibility is to model it after TechCred, a relatively new state initiative that aims to “upskill” Ohio workers. That program reimburses businesses when their employees attain technology-focused certifications and credentials that can be earned within one year. But rather than reimbursing employers for training costs—they’re likely ineligible for governor’s funds—community colleges or technical centers could be the target. Whenever non-college-going graduates in the class of 2020 earn a credential approved under TechCred, the educational institution would receive funding. At $2,000 per credential—the reimbursement rate for TechCred—setting aside $10 million of the governor’s fund would support about 5,000 young adults.
During an economic downturn, supporting the continuing education of high school graduates would give them a boost that could help secure better entry-level jobs when hiring starts to pick up again. It might also do justice to some of this year’s seniors who had career-technical and workforce training opportunities scuttled due to school and business closures. The narrow scope of such a program—limited to the class of 2020—and the expectation that credentialing programs last no more than a year would mesh well with the temporary nature of the governor’s fund and the federal requirement that funds be awarded within the next twelve months. Lastly, if successful, Ohio could consider extending the program to future graduating classes (though state funds will be limited in the near future).
* * *
As with any funding stream, the governor’s fund won’t be a cure-all that fixes every educational problem, much less all the new challenges that schools face in light of the health crisis. But if used wisely, these dollars can address some of the most glaring issues that schools and students face. Spending these funds to help bridge the digital divide and to smooth career transitions for this year’s seniors would be smart uses of federal aid.
Editor’s Note: The Thomas B. Fordham Institute occasionally publishes guest commentaries on its blogs. The views expressed by guest authors do not necessarily reflect those of Fordham.
This week we are celebrating Teacher Appreciation Week. Teachers across Ohio and the country have been incredibly resilient in the throes of the COVID-19 pandemic. They have had to quickly transition to a remote learning model. They have been using their time away from their buildings to gather content and get training from every source they can find. We all need to celebrate these devoted, committed, and hardworking professionals.
School districts are doing their best to adapt to this new learning world. Although great effort is being made to reach their students, the remote learning being provided by districts is in most cases very different from the digital learning provided by a full-time online school. This shouldn’t be a surprise.
Ohio Connections Academy has been operating as a public K–12 virtual school since 2005, and many of our staff members have more than ten years of online teaching experience. We have come a long way since 2005 in terms of what we understand about high-quality digital teaching and learning. We have learned optimal ways to engage students and families in both synchronous and asynchronous instruction. Of course, we’re still seeking new and better ways to innovate, using feedback and data to inform and improve our practices. The learning platforms have become much more sophisticated, and our staff members continue to draw on their dedication to lifelong learning to study and incorporate practices that enhance the digital learning experience for students.
Our full-time online model runs in a Learning Management System (LMS). Teachers, parents, and students have 24/7 access to a dashboard in our LMS to view attendance, grades, and performance. Our students communicate with their teachers and classmates in a secure online system. The LMS guides students through structured lessons for all of their assigned courses. Our teachers receive rigorous training throughout their tenure here at Ohio Connections Academy, and they individualize the self-paced lessons so that all students can learn at their own pace. We address student learning in a variety of modalities, including the use of virtual manipulatives in math and virtual reality simulations in science courses.
Our staff follows strict guidelines for frequently contacting students and families. We also conduct synchronous learning sessions with students several times a week. Teachers provide one-on-one or small-group tutoring sessions in a live format where students can share a screen and show what they know in real time. There is no risk of “Zoom bombing” because our students learn in a protected, firewalled system. Support staff assist our teachers by, among other things, sharing best virtual teaching practices, reviewing data, and ensuring student engagement.
There has been investment in our platform and research to continually improve our practice. It has been a challenging environment for us to rise above the chatter surrounding e-schools in Ohio. I am hoping this will be an opportunity to elevate the conversation about digital learning and focus more on the policies and practices that work best for students.
Many teachers around the state are making Herculean efforts to reach their students, and for that, we’re all deeply thankful. But they also need the tools to succeed. To that end, we at Ohio Connections Academy are happy to provide advice and best practices such as these for districts, schools, and parents as they continue their journey in providing remote learning. In Ohio, we’re in this together.
Marie Hanna is the Superintendent of Ohio Connections Academy, a public online charter school that serves K–12 students from all over the state.
Editor’s Note: The Thomas B. Fordham Institute occasionally publishes guest commentaries on its blogs. The views expressed by guest authors do not necessarily reflect those of Fordham.
Author’s Note: The School Performance Institute’s Learning to Improve blog series typically discusses issues related to building the capacity to do improvement science in schools while working on an important problem of practice. However, in the coming weeks, we’ll be sharing how the United Schools Network, a nonprofit charter management organization based in Columbus, Ohio, is responding to and planning educational services for our students during the COVID-19 pandemic and school closures.
The transition to remote learning due to the coronavirus pandemic has significantly shifted how United Schools Network (USN) plans and delivers educational experiences to its students. We've outlined our remote learning system—and it is helpful to think of it as a system—in our Education Plan. As soon as that document was created and the new system outlined, we immediately started thinking about how to improve it.
A first step in understanding this new remote learning system is to visualize the processes that happen in a single classroom. This visualization process, combined with asking key questions about the process steps, can help reveal areas for improvement, as well as areas where the system could potentially break down. I’ve illustrated the process steps for an eighth grade math classroom, along with key questions, in the flow chart below.
Remote Learning Flowchart: 8th Grade Math
Mapping a system in this way serves a number of purposes. First, it is helpful for school teams to understand how each teacher operationalizes the Education Plan. In Steps 3 through 6, it is clear how the eighth grade math teacher is thinking about lesson design in the remote learning environment. Second, the complexity of the system becomes much clearer when the process steps are listed. You can visualize where the system may break down. Third, the flow chart highlights process steps in the system that may be vulnerable to access and equity issues for different student groups. For example, a breakdown could easily happen right at the start of the process if the information from Step 1 (device access) and Step 2 (internet access) isn't stored in the same location. Fourth, the flow chart allows teams to start thinking about the key questions from each process step and, in turn, how to improve the system.
Early data being collected at each USN school indicate that there is significant variation in terms of student engagement with remote learning. However, before we can work on improving student engagement levels, we have to fully understand each teacher’s process for designing and delivering instruction, as well as have clearly articulated operational definitions for key remote learning concepts. The four concepts we've keyed in on are lesson, feedback, grading, and engagement in a remote learning environment.
As soon as we have those four in place, we can then start measuring and working to improve engagement levels. This will be a critical next step given that Governor DeWine recently extended the school closure order in Ohio through the end of the school year. We’ll share those definitions in the next article in this series.
John A. Dues is the Managing Director of School Performance Institute and the Chief Learning Officer for United Schools Network, a nonprofit charter-management organization that supports four public charter schools in Columbus. Send feedback to [email protected].
The Accelerated Study in Associate Programs (ASAP) began in the City University of New York (CUNY) system with the intent to comprehensively support students to persist and complete community college within three years. In 2015, the program expanded to three community colleges in Ohio: Cincinnati State Technical and Community College, Cuyahoga Community College, and Lorain County Community College. The initial results, released in early 2019, were promising. Evaluators found that the program adapted easily to a new environment, was more cost-effective in its new home, and was even more effective at boosting student success.
The program utilized an array of tools to overcome barriers to student success like onerous remedial course requirements, part-time course-taking, financial burdens, and class-scheduling difficulties. These tools included mandatory advisory meetings, tuition assistance and other financial incentives (e.g., gas or grocery gift cards), scheduling preference, and incentives to increase full-time enrollment. Both barriers and supports were identified via interviews with current students. Program-eligible students had to be degree-seeking, willing to attend full time, enrolled in degree programs that could be completed in three years or less, and eligible for a Pell Grant. A lottery determined which students entered ASAP (806 participants) and which comprised the control group (695 non-participants). Students in the control group had access to the usual suite of services the colleges provided but not the more-intensive supports enumerated above.
At the end of three years, the positive findings continued. While full-time enrollment in both groups declined across each semester, the declines were far more modest for students in the program group. Nearly 50 percent of ASAP students were enrolled full time at the end of three years versus less than 20 percent of the control group. In terms of degree completion, 35 percent of the program group earned degrees (including certificates, associate degrees, and even a handful of bachelor’s degrees) by the end of three years, as compared to 19 percent of the control group. Meanwhile, 18 percent of the program group had transferred to a four-year institution (including some overlap with the degree-earners), as compared to 12 percent of the control group.
Approximately 75 percent of students entering the study were deemed to need remedial coursework, typical for students entering community college in Ohio and often a major stumbling block to persistence and completion. The program students, per the ASAP model, were encouraged via advising and incentives to take remedial courses early, but data show that this encouragement had no effect on developmental requirement completion in the form of course credits earned. Both program and control groups completed a minuscule number of remedial courses. However, the institutions offered two alternative—non-remedial—ways for developmental education requirements to be satisfied, including retaking placement tests for a higher score or passing college-level courses in the relevant subjects. The program group students were 12 percentage points more likely to have completed their developmental education requirements by the end of three years than were control group students. The question arises whether the control group students were adequately aware of the alternatives, let alone given guidance to access and complete them as the program students clearly were. It wouldn’t take a program as extensive as ASAP to rectify such a structural problem.
The new report covers a host of other program-related outcomes, including comparisons between the Ohio and CUNY versions in terms of staffing and execution. But the most important continues to be the cost-effectiveness of the effort. ASAP supports cost about $8,000 more per student over six semesters than the colleges’ business-as-usual services, but the program increased graduation rates so much that the cost per degree-earned was 22 percent lower for program group students than it was for the control group.
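To see how a costlier program can nonetheless lower the cost per degree, consider a rough sketch. The $8,000 increment and the graduation rates come from the report; the $18,000 baseline cost per student is an assumed figure chosen only to make the arithmetic concrete:

```python
# Illustrative cost-per-degree arithmetic. The $8,000 increment and the
# graduation rates are from the report; the baseline per-student cost
# over six semesters is an assumption for illustration only.
baseline_cost = 18_000                # assumed usual-services cost per student
asap_cost = baseline_cost + 8_000     # ASAP adds about $8,000 per student
control_grad_rate = 0.19
asap_grad_rate = 0.35

control_cost_per_degree = baseline_cost / control_grad_rate
asap_cost_per_degree = asap_cost / asap_grad_rate
savings = 1 - asap_cost_per_degree / control_cost_per_degree
print(round(control_cost_per_degree))     # ≈ 94,737 per degree, business as usual
print(round(asap_cost_per_degree))        # ≈ 74,286 per degree under ASAP
print(f"{savings:.0%} lower cost per degree")
```

Because ASAP nearly doubles the degree rate, dividing the higher per-student cost by a much larger share of graduates yields a per-degree cost roughly 22 percent lower, consistent with the report's figure.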
Sadly, the report concludes by noting that, despite its strong bang for the buck, ASAP will not continue at the Cincinnati and Cuyahoga County institutions because its funding from the Ohio Department of Higher Education and other sources ended with the three-year demonstration project. Lorain County Community College announced plans to not only continue ASAP with its own resources, but also to eventually expand it. However, budget crunches brought on by events that no one foresaw before March 2020 may yet curtail ASAP, and more besides, at Ohio’s higher education institutions.
SOURCE: Cynthia Miller, Camielle Headlam, Michelle Manno, and Dan Cullinan, “Increasing Community College Graduation Rates with a Proven Model: Three-Year Results from the Accelerated Study in Associate Programs (ASAP) Ohio Demonstration,” MDRC (January 2020).
One of the tougher accountability nuts to crack is how to gauge educational quality in early elementary grades. Federal education law does not require state exams until third grade, and most states choose not to administer end-of-year assessments in grades K–2. Despite the importance of these formative years in children’s lives, the absence of standardized testing makes measuring their academic growth difficult.
Some early-childhood education analysts have proposed that states rely on classroom observation or attendance data to evaluate quality. But a number of states, including Ohio, do require assessment when students enter kindergarten. The Ohio assessment, for example, includes content across four domains: social foundations, math, language and literacy, and physical well-being and motor development. A child’s teacher administers the assessment. The results have largely diagnostic aims—to inform instruction for the coming year—and are not widely used in accountability systems. Could states use these baseline data, combined with later state exam scores, to measure growth?
A team of analysts from Mathematica explores this possibility in a recent report, using data from Maryland, a state that administers a Kindergarten Readiness Assessment (KRA). They examine student-level data from the first cohort of children taking the KRA in 2014–15, along with their third grade test scores from 2017–18. Approximately 54,000 students had scores on both assessments and are thus included in the analysis. Another 26,000 students, however, are excluded because they were missing either KRA or third grade scores (due mainly to exits from or entrances into the school system after kindergarten).
The research team first demonstrates that academic growth can indeed be measured using KRA and third grade scores. Relying on the same methodology that the state uses for accountability in higher grades—known as “student growth percentiles”—they calculate K–3 growth results for Maryland elementary schools.
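As a rough illustration of the student-growth-percentile idea, the sketch below ranks each student's later score among peers who started from the same baseline score. Real SGP models use quantile regression across the full score distribution; the function and the demo data here are hypothetical:

```python
from collections import defaultdict

def growth_percentiles(records):
    """Toy student-growth-percentile calculation.

    records: list of (student_id, baseline_score, later_score).
    Each student's growth percentile is their later-score rank among
    students with the same baseline score. (Actual SGP implementations
    fit quantile regressions rather than exact-match groups.)
    """
    by_baseline = defaultdict(list)
    for sid, base, later in records:
        by_baseline[base].append((sid, later))

    percentiles = {}
    for base, group in by_baseline.items():
        laters = sorted(score for _, score in group)
        n = len(laters)
        for sid, later in group:
            # share of same-baseline peers scoring at or below this student
            rank = sum(1 for score in laters if score <= later)
            percentiles[sid] = round(100 * rank / n)
    return percentiles

# Three students start from the same baseline; the one with the highest
# later score earns the highest growth percentile.
demo = [("a", 260, 410), ("b", 260, 450), ("c", 260, 430), ("d", 270, 400)]
print(growth_percentiles(demo))  # {'a': 33, 'b': 100, 'c': 67, 'd': 100}
```

The key property, visible in the demo, is that growth is judged relative to academic peers: a student is compared only against others who started from the same point, not against the whole cohort.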
However, their analysis raises a few concerns about the validity of the results—the extent to which they reflect schools’ true contributions to student growth. First, due to the significant time between assessments, a number of students switched schools within the Maryland system. To address mobility, the analysts apportion responsibility for growth based on the amount of time spent in each school. This “shared accountability” is sensible, but without annual testing, uncertainty remains about which school actually contributed more to transfer students’ growth. Second, the researchers discover only a modest correlation between KRA and third-grade test scores. This, they suggest, indicates that the two assessments may be “measuring different aspects of academic ability.” In comparison to correlations in higher grades—e.g., third and sixth grade test scores—the KRA-third grade correlation is weaker, leading the authors to conclude that the K–3 growth results are “likely less valid” than those calculated in the higher grades.
Though imperfect, a K–3 growth measure may be better than flying nearly blind about educational quality. And a measure similar to what is used in this report could be superior to Ohio’s well-intended but rudimentary approach to measuring growth in the early grades. At the same time, policymakers should heed the report’s suggestions about implementing a K–3 growth measure: States should either place less weight on the results in an accountability system or report the growth data but not use them to inform ratings or consequences. Sound advice, given the limitations of such a measure.
SOURCE: Lisa Dragoset et al., “Measuring School Performance for Early Elementary Grades in Maryland,” REL Mid-Atlantic (2019).
A crisis—less organic but no less virulent than the coronavirus pandemic—has been raging through the United States for years. Between 1999 and 2016, the rate of drug-related mortality grew 225 percent, due mostly to opioid overdose deaths. And while the most direct negative effects of the opioid crisis, like those of the current pandemic, bypass school-aged children, indirect effects have been widespread and innumerable. Babies exposed to opioids in utero are often born with significant health problems and lingering cognitive impairments; children can lose one or more parents to addiction, jail, or death; families can be deprived of financial resources when all their money goes to drugs; and communities can face overwhelming drains on scarce public resources when fighting the drug scourge.
A new report takes a look at the connection between elementary education outcomes and the opioid crisis.
Authors Rajeev Darolia of the University of Kentucky and John Tyler of Brown University draw upon previous work that models how neighborhood contexts can impact the education outcomes of children. Here their context is the opioid crisis in the community, and the education impact is measured via the interaction between the child’s level of exposure to the crisis and the child’s vulnerability to any given level of exposure. Darolia and Tyler are quick to note that exposure occurs on a continuum from “the personal and traumatic to the less direct but potentially pervasive,” and that there are mitigating factors at varying levels in each individual child’s life—such as family support or strong schools—working to blunt the negatives. This is an important caveat to have in place when evaluating such research, as we have been warned recently about misuse of a common measure of childhood trauma. Additionally, the researchers consider only one educational outcome: a composite result of state test scores in third-grade math and English language arts. These are individual test score results aggregated at the county level over the study time period, sourced from the Stanford Educational Data Archive.
They first present a scatter plot of the unconditional relationship between test scores and drug-related overdose mortality rates by county. The dashed line is downward sloping and linear, indicating a negative relationship between the two, although there are many outliers in all four quadrants. Next up is a sort of “heat map” of counties across the U.S. where test scores are low and drug-related mortality is high. Mortality figures, broken down by deciles, indicate that counties in the highest (tenth) decile of mortality have third-grade test scores that are about one-tenth of a standard deviation lower than test scores in counties with the lowest mortality rates. While this is a small effect, it is big enough to register when looking at entire counties as individual data points. Darolia and Tyler also point to a notable jump in drug-related mortality from the ninth to the highest decile, which they surmise could indicate a stronger correlation between the most visible outcome of the opioid crisis—death rates—and the lowest test scores observed. They also observe a stronger correlation between opioid exposure and test scores in rural areas than in nonrural ones. To wit: Rural counties in the highest mortality decile show third-grade test scores almost two-tenths of a standard deviation lower than rural counties in the lowest mortality decile. This is about twice as large as the analogous gap among nonrural counties. The researchers suggest that this points to the mitigating role of available local resources such as community behavioral health services, high-performing schools, or plentiful jobs.
In the end, Darolia and Tyler can only present conditional correlations between the opioid crisis and elementary education outcomes that account for some, but potentially not all, confounding factors. They wisely point out that the nature and level of mitigating supports is an important unknown. There are numerous counties with high levels of drug-related mortality over the period where test scores remain high, and vice versa. Are there great community or individual supports buoying children through the worst of the crisis, supports which could be replicated elsewhere? Or perhaps there are areas where “drug-related mortality” is driven by meth or cocaine, still deadly but less all-encompassing in their effects. These outliers are perhaps more important than the obvious opioid hot spots about which we have already learned much.
Education researchers will have much to unpack after life returns to some semblance of normal following the coronavirus pandemic. But this is neither the first nor the last large-scale disruptor of our communities and the school-age children living in them. We must learn as much as possible from each disruption so as to minimize them in the future and to make sure that children’s educational trajectories are obstructed as little as possible despite the vicissitudes—man-made or otherwise—of the world.
SOURCE: Rajeev Darolia and John Tyler, “The opioid crisis and community-level spillovers onto children’s education,” Brookings Institution (April 2020).