- High-dosage tutoring can do more than help recover learning loss. It can build human connections. —Christian Science Monitor
- Making the American Rescue Plan’s child tax credit permanent would create an opportunity for bipartisanship—and for parents to invest in their children’s education. —Michael Gerson
- Why Majority Leader Schumer and AFT President Randi Weingarten supported almost $3 billion in relief aid for private schools. —New York Times
- A suburban Ohio superintendent, Tim Weber, managed to reopen schools and keep peace between competing community interests. —Wall Street Journal
- An Illinois district made gifted programs accessible to students of color. Can others do the same? —Hechinger Report
- “Why learning pods might outlast the pandemic.” —New Yorker
- “Lessons from the pandemic that can improve leading and teaching.” —Education Week
- A USC survey helps explain the racial divide in attitudes about reopening schools. —Center on Reinventing Public Education
- Despite requests for guidance from local officials, New York’s state government keeps punting on school pandemic protocols. —New York Post
- Five ways that public schooling should change in the wake of the pandemic. —Nina Rees
For more than two decades, report cards have offered Ohioans an annual check on the quality of public schools. They have strived to ensure that schools maintain high expectations for all students, to provide parents with a clear signal when standards are not being met, and to identify high-performing schools whose practices are worth emulating.
The current iteration of the report card uses intuitive letter grades—a reporting system that most Ohioans are accustomed to and support—and includes a user-friendly overall rating that summarizes school performance. In recent years, state policymakers have added new markers of success, including high school students’ post-secondary readiness and young children’s ability to read fluently.
While the current model can be improved, an ill-advised piece of House legislation (House Bill 200) goes in the wrong direction. It would gut the state report card—rendering it utterly meaningless—and cloak what’s happening in local schools. In short, the proposal would take Ohio back to the dark ages when student outcomes didn’t matter and results could be swept under the rug.
What does the bill do and why is it so bad? The bill:
- Uses incoherent rating labels. House Bill 200 replaces A–F letter grades with a six-category descriptive system for each report card measure. From highest to lowest, the labels are as follows: significantly exceeds expectations, exceeds expectations, meets expectations, making substantial progress toward expectations, making moderate progress toward expectations, and in need of support. These vague, jumbled labels will mean little to Ohio parents and citizens. What exactly does it mean to “exceed expectations” or “make progress” toward them? Can you guess which letter grades those terms might correspond to? Whose expectations are they anyway? Worse yet, some of the labels will be downright misleading. There will be schools deemed to be “making substantial progress toward expectations”—equivalent to a D (or C?) rating—whose performance declines in a certain report card component (e.g., their achievement levels fall from the prior year). Lastly, the “in need of support” label is a euphemism that doesn’t raise red flags for parents or communities.
- Eliminates the user-friendly overall rating. The House bill scraps the overall rating that combines the various dimensions of the report card into a bottom-line assessment of school quality. It’s akin to a final GPA that sums up student performance across various subjects or to a credit rating that communicates risk to potential lenders. Dropping the overall rating, however, deprives parents of a prominent, user-friendly summary. Even worse, the lack of an overall rating allows for “cherry picking” one or two component ratings that support a specific narrative—either good or bad. An overall rating prevents such efforts by evening out the strengths and weaknesses of a school and offers a more comprehensive picture of school quality.
- Dumps a crucial measure of post-secondary readiness. First appearing as a graded component in 2015, the prepared for success dimension of the report card offers insight into students’ readiness for higher education and rewarding careers. Readiness indicators include remediation-free scores on college entrance exams, industry credentials, and passing scores on AP or IB exams. Importantly, such indicators go above and beyond basic high school graduation requirements and state tests, thus offering a look at whether students are meeting bona fide readiness targets. Given the ongoing need to prepare more young people for higher education and technical careers, it’s odd to see House lawmakers seeking to scrap this report card component. While the design of the component could use some improvements and a fairer grading scale, it remains a vital piece of today’s report card because it encourages schools to challenge students to meet more than bare minimum standards.
- Suppresses the performance of disadvantaged students. The state’s gap closing component ensures that less advantaged students—i.e., those who are economically disadvantaged, in special education, and/or English learners—don’t fall through the cracks. House Bill 200 at least keeps the component, but makes two changes that would soften accountability for these vulnerable student groups. First, it continues a misguided either-or approach to achievement and growth. Whichever measure—either a subgroup’s performance index or value-added score—yields a higher rating is the one that applies. For example, a school’s economically disadvantaged students might post satisfactory value-added scores but still fail to make meaningful progress toward proficiency. Under HB 200, the latter would be completely ignored in favor of the higher score.[1] Second, HB 200 increases the minimum “n-size” that schools must have in a subgroup before they are held accountable for their results in gap closing. Under current policy, which was developed through the state’s ESSA planning process, schools must have at least fifteen students in a subgroup. HB 200 raises that to twenty. As a result of this change, thousands of needy students will vanish from the subgroup accountability system. (A simplified sketch of how these two changes would play out appears after this list.)
- Weakens accountability for early literacy. Research has shown that children struggling to read are more likely to face academic difficulties later in life. Over the past decade, Ohio policymakers have enacted a number of policies aimed at boosting early literacy, including the creation of a K–3 reading report card component. Unfortunately, HB 200 would weaken this measure, as well. First, it would rely on promotional rates used for the Third Grade Reading Guarantee. The dirty little secret about these rates is that they exclude students who aren’t subject to retention, include promotions via alternative non-state testing options, and are based on a lower bar than the proficiency standard. As a result of these policies, the statewide promotional rate was a whopping 95 percent in 2018–19 even though the third-grade ELA proficiency rate was 67 percent. Talk about hiding the ball. Also disturbing is HB 200’s proposal to exclude mobile students from promotional rates; only students who have been in the same district or school in Kindergarten through third grade would count. Do mobile students simply not matter? Taken together, these policy decisions will produce inflated literacy ratings, ease the pressure on schools to help children read, and sweep any deficiencies under the rug.
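To make the gap closing changes concrete, here is a minimal sketch, in Python, of how the either-or rule and the larger n-size would interact. The rating scale, function name, and example numbers are illustrative assumptions, not language from HB 200 or the department’s actual business rules.

```python
# A minimal sketch (not the state's actual formula) of the two HB 200 changes
# described above: the either-or rating rule and the higher minimum n-size.
# The 0-5 rating scale and the example numbers are illustrative assumptions.

def subgroup_gap_closing_rating(performance_index_rating, value_added_rating,
                                subgroup_size, min_n_size=20):
    """Return a rating for one subgroup, or None if the subgroup is too small
    to count toward gap closing at all."""
    # HB 200 raises the minimum subgroup size from 15 to 20, so groups of
    # 15-19 students that count today would drop out of the calculation.
    if subgroup_size < min_n_size:
        return None

    # The either-or rule: whichever measure yields the higher rating applies;
    # the weaker result (e.g., low achievement) is ignored entirely.
    return max(performance_index_rating, value_added_rating)


# A school whose economically disadvantaged students post strong growth (4)
# but weak achievement (1) is rated on growth alone...
print(subgroup_gap_closing_rating(1, 4, subgroup_size=30))  # -> 4
# ...and a seventeen-student subgroup simply vanishes from the system.
print(subgroup_gap_closing_rating(1, 4, subgroup_size=17))  # -> None
```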
Report cards are a balancing act. They must be fair to schools, offering an accurate and evenhanded assessment of academic performance. But they must also be fair to students—who deserve a report card system that challenges schools to meet their needs—and to parents and citizens who deserve clear and honest information about school quality. Ultimately, HB 200 veers too far in trying to meet the demands of school systems, which have an interest in a report card that churns out crowd-pleasing results—one that only captures “the great work being done in Ohio’s schools,” as the head of the state superintendents association recently put it. Such a system might avoid controversy, but it’s also a one-sided picture that ignores the interests of students, families, and taxpayers. As state lawmakers consider HB 200, they need to remember that report cards should be an honest assessment of performance, not simply a cheerleading exercise.
[1] HB 200 also takes a wrong either-or approach to value-added growth scores within the Progress component. The bill requires the use of either the three-year average score or the most current year score, whichever is higher.
- As we noted here in the Bites last Friday, there is a legislative effort clanking to life which would, if successful, pretty well gut Ohio’s school report cards. Fordham’s Aaron Churchill is quoted among the voices in the Dayton Daily News pointing out the many flaws. To wit: “This report card (proposal) is great if you want to make schools look good, and in a sense paper over challenges students are having.” (Dayton Daily News, 3/16/21) And indeed that seems to be the point. HB 200 got its first hearing yesterday in the House Primary and Secondary Education Committee, which means the bill’s sponsors got to state their case. Gongwer’s coverage of the hearing hits several ~~low~~ high points (including the assertion that somehow report card letter grades have been “the root cancer” leading to the deterioration of many Ohio schools) but unfortunately missed out on a key argument from the sponsors’ written testimony: “We believe that it should automatically be assumed that our schools are good enough that if an Ohio student receives a diploma, he is prepared for success.” I mean, if that’s what you believe…. (Gongwer Ohio, 3/16/21) Meanwhile, Fordham’s Chad Aldis was himself testifying in the Senate Primary and Secondary Education Committee on HB 67. That’s the bill that was going to eliminate state testing this year (until it couldn’t) but now seeks to further soften graduation requirements (if it can) due to the pandemic. (Gongwer Ohio, 3/16/21)
- Cincinnati City Schools this week reported a nearly four percent drop in enrollment from last school year to this one. What’s interesting is that, unlike some other districts which have raised similar alarms, Cincinnati seems pretty clear on where nearly every one of those 4,825-odd kids (!) has gone. And while 900 or so are pre-K and K students who might conceivably return after a redshirt year, the other 4K could just as conceivably be quite happy to stay where they are next year. (WVXU-FM, Cincinnati, 3/16/21)
- Cleveland Metropolitan School District is mentioned in this detailed national piece on schools’ Covid responses, under a section header titled “silver linings”, which looks at lessons learned for the future. While CMSD officials may have provided some evidence that they will attempt to improve teaching and learning for students next year (at the earliest), I am reasonably certain that the available evidence shows CMSD exemplifying not that subhead but the main headline of the piece: “Schools Squandered Virtual Learning”. (EdNext, 3/16/21) Meanwhile, a math teacher in bougie Beachwood City Schools says that she is three to four weeks behind in her teaching. Thus, state testing should be cancelled to give her time to catch up. Yep. I’m sure that’ll do it. (WOSU-FM, Columbus, 3/17/21) Meanwhile, Dayton City Schools gave us a look at their similarly-blinkered 2021-22 school year calendar. It is called a “back-to-normal” schedule—in that it is the same length and has the same start/stop dates as a traditional year—but it seems like it could also be called an “amnesia calendar”. No additional summer school or academic camps are planned beyond the usual. Like nothing ever happened. (Dayton Daily News, 3/17/21) Additionally, Dayton City Schools seems to have decided that the proper response to a failed school turnaround effort (they cancelled the remainder of a prior $425,000 contract back in April due to “lack of return on investment”) is to throw more money at the problem—indeed to septuple down on the plan. The elected school board awarded a $3.2 million contract to a different third-party entity to help boost test scores in six district buildings which have registered “pretty significant numbers of F’s” over the years. The full contract is for three years, but it has at least one get-out clause along the way. Phew! (Dayton Daily News, 3/17/21)
- Meanwhile, this rather long retrospective (prom) of the harm supposedly wrought upon Toledo-area students (prom!) during the Covid year (prom!!) seems inordinately focused (prom!!!) on non-academic stuff (prom!!!!). Especially for kids in Toledo City School District. Did I mention that the district’s prom appears to be back on this year? (Toledo Blade, 3/16/21) In other Toledo-area education news (choice), the Northwest Ohio Scholarship Fund (choice!) is holding its annual fund-raiser (choice!!) to help boost the number of private school scholarships (choice!!!) it can give out to eligible and interested families (choice!!!!). Can’t think why school choice might be on folks’ minds these days. Can you? (13ABC News, Toledo, 3/16/21)
Did you know you can have every edition of Gadfly Bites sent directly to your Inbox? Subscribe by clicking here.
NOTE: On March 16, 2021, the Ohio Senate’s Primary and Secondary Education Committee heard testimony on HB 67, a bill which would, among other provisions, make changes to the state’s graduation requirements in response to the coronavirus pandemic. Fordham’s Vice President for Ohio Policy provided interested party testimony on the bill. These are his written remarks.
Thank you, Chair Brenner, Vice Chair Blessing, Ranking Member Fedor, and Senate Primary and Secondary Education Committee members for giving me the opportunity today to provide interested party testimony on House Bill 67.
My name is Chad Aldis, and I am the Vice President for Ohio Policy at the Thomas B. Fordham Institute. The Fordham Institute is an education-focused nonprofit that conducts research, analysis, and policy advocacy with offices in Columbus, Dayton, and Washington, D.C.
The last year has been incredibly challenging, as we’ve learned to live with Covid-19 and its many repercussions. As you know, education has been one of the areas affected. Last March, students and teachers across the state were thrust into remote learning. Although a significant number of schools have reopened for in-person learning, the extended use of remote and hybrid models greatly reduced the number of student-teacher interactions.
House Bill 67 started off as an effort to suspend state assessments this school year. Its sponsors were forced to pivot when the Biden administration—echoing an earlier pronouncement from the Trump administration—announced that the U.S. Department of Education would not be granting waivers from the annual testing requirement of the Every Student Succeeds Act (ESSA). We support this decision and believe it’s more important than ever to know precisely where students are and to ensure that resources are directed to those communities most negatively impacted by the pandemic.
HB 67, as it now stands, seeks to minimize any potential negative impacts as a result of state assessments this year and to address a few logistical issues to make the testing process a little smoother. We support the move to extend the testing window to later in the school year and to push the timeframe for reporting results back a month. These are smart, commonsense adjustments.
Things get trickier when it comes to graduation requirements. We support the language in HB 67 providing additional flexibility to this year’s junior and senior classes. The classes of 2021 and 2022 are still by default subject to Ohio’s previous graduation standards, which required students to earn 18 points across a series of seven end-of-course (EOC) exams. Given those stringent requirements, extending the flexibility granted last year to these students is prudent.
However, HB 67 goes too far in allowing course grades to count for EOC credit for sophomores and younger students. The graduation requirements for those classes were just modified by this body in 2019. Students in those classes are required to pass only two EOC exams—Algebra I and English II. To be clear, passing doesn’t even require a proficient score, only “competency,” which falls in the “basic” range on the state assessment. If this year’s students struggled in these core classes, it’s important that they receive the extra supports they need to be successful and improve their performance. These courses are important markers for college and career success after high school and shouldn’t simply be waived.
Finally, while we aren’t opposed to eliminating the U.S. History EOC exam this year, we’d urge you to give the issue careful consideration. First, the U.S. History EOC exam can only help students. If they do well on it and their government EOC, they earn a citizenship seal—part of the new graduation requirements for the class of 2023 and beyond. There are no penalties or negative repercussions if a student performs poorly. Second, having an EOC exam on U.S. History is a statement of intent. It indicates that the state has prioritized the subject and thinks it’s important. These days an argument could be made that we need more emphasis on U.S. History—not less.
Thank you for the opportunity to provide testimony.
NOTE: The Thomas B. Fordham Institute occasionally publishes guest commentaries on its blogs. The views expressed by guest authors do not necessarily reflect those of Fordham.
As Ohio policymakers continue to debate House Bill 1, legislation to reform the way the state funds schools, it is important that they continue to work to ensure the bill achieves its stated intent: to create a fairer and more equitable funding model for all of Ohio’s students. There is no doubt this is a gigantic undertaking. Unfortunately, as currently written, this legislation falls short of the mark in terms of funding Ohio’s seven public independent STEM schools.
The value and importance of STEM education cannot be overstated. Members of the 127th General Assembly had the foresight to include a provision in the biennium budget that authorized the creation of public independent STEM schools, tasking them with developing and using innovative and transformative instructional methods with an emphasis on Science, Technology, Engineering, and Math (STEM). The schools were created, in part, to be research and development labs for education in Ohio, and to be incubators of STEM talent across the state to meet Ohio’s evolving workforce demand. Independent STEM schools have risen to the challenge.
Since that time, technology has become even more engrained in our everyday lives. Ohio’s public independent STEM schools actively engage students in the learning process and teach students how to learn. Each classroom is essentially a learning lab.
As the name implies, independent STEM schools operate outside of school districts. This not only allows the schools to be more agile with the programs they offer, but also lets them accept students from a larger geographical region. While each school is unique, they all collaborate with local businesses, governments, and academic institutions to ensure the curriculum meets the needs of today’s workforce. Lessons at these schools focus on skill mastery and problem-based learning, again to actively engage students in their education. Additionally, independent STEM schools expose students to career possibilities and prepare them for the industries of the future.
The successes of these schools are evident in many ways. Ohio’s independent STEM schools post a cumulative 100 percent graduation rate and have high rates of students who pursue STEM fields post-graduation. Demand far exceeds our capacity. Educators, businesses, and elected officials across Ohio and the country are interested in learning how to replicate our programs. Students, parents, and alumni testify to the difference their schools have made. Moreover, Ohio’s independent STEM schools take an innovative approach to education, and their successes don’t just benefit the students and families that attend them. We’re always happy to share best practices and provide training to educators across the state and nation. The ultimate goal is to see all students succeed.
While providing quality and innovative STEM education is still important to preparing students for the emerging workforce, independent STEM schools need equitable and sustainable funding to build on these successes. Unfortunately, House Bill 1, as written, does not accomplish that for Ohio’s independent STEM schools. Since the inception of independent STEM schools, they have had the lowest per-pupil foundation funding of any educational model, including traditional public, career-technical, and charter schools. In fiscal year 2019, the average per-pupil expenditure of $12,473 in traditional public schools was dramatically higher than the average expenditure of only $7,927 per student for independent STEM schools. Contrary to the goal of creating equitable funding for all students, House Bill 1 establishes a framework for school funding that would perpetuate inadequate funding for independent STEM schools. In fact, HB 1 proposes to fund public charter schools and independent STEM schools at 90 percent of what their traditional school counterparts receive. This does not create equity, but continues to underfund innovative, high-quality, high-demand public school choice. Under the proposed funding model, independent STEM schools are projected to see a slight increase. But by building a reduced funding percentage into the framework, independent public STEM schools will never be fully funded, and will likely see the gap continue to widen.
Furthermore, independent STEM schools, by designation, emphasize personalized learning and real-life experiences and exposure to the workforce and emerging careers. Schools provide pathways in a variety of innovative fields like engineering, health, agriculture, and more. Each classroom is essentially a lab, and our teachers have deep knowledge and experience in these areas. The unique mission and fundamental requirements of independent STEM schools should be funded differently. The proposed funding increase in this bill falls short of the additional funding independent STEM schools need to support their robust career-exploratory programs designed to inspire and prepare future STEM professionals.
Over the years, Ohio’s independent STEM schools have used every process at their disposal to tap existing funds and reduce costs. They have leveraged community partnerships and built the curricula and programs to qualify for weighted funding through career-technical education. This has all been done while protecting and preserving the educational model. Regardless of how fiscally conservative they are, independent STEM schools are facing fiscal cliffs, and both current funding and that proposed in House Bill 1 are inadequate to allow them to continue to fulfill their mission as education incubators.
As the General Assembly continues to debate HB 1, members should ensure an equitable and sustainable funding model for independent STEM schools. This will allow these schools to continue fulfilling the mission they were tasked with in 2007. Innovation will be key to rebuilding Ohio’s economy after Covid-19. Independent STEM schools will be an integral part of that solution.
Meka N. Pace is the president of the Ohio Alliance of Independent STEM Schools.
Concerns over the increased potential for cheating are front and center in debates over testing while students are learning remotely. A new report from a group of researchers at Rensselaer Polytechnic Institute (RPI) in New York details their cheat-resistant online exam protocol, an innovation that could fill an immediate need and pave the way for the future of testing.
Methods of online proctoring exist, but they are often expensive, riddled with privacy concerns, and draconian, forcing students to, for example, keep microphones on and remain in frame for an hour. They also signal to students, perhaps unintentionally, that adults don’t trust their honesty. Text-recognition software can discreetly detect plagiarism, but it’s useless on multiple-choice or calculation questions and younger students’ written work. Using a huge bank of test items to randomly deliver different questions to different students could also limit remote cheating opportunities, but it requires an extraordinary amount of work for educators and goes against best educational practices.
The RPI team sought to address the drawbacks of each of these models by creating a simple, cost-effective, and privacy-preserving solution that would help educators administer a valid remote assessment with minimal effort. The key component of their model, called a distanced online test (DOT), is timing. Rather than having all students start the DOT at the same time, the test is broken into sections that are given to different groups of students at staggered times. Those at the lowest mastery levels of the content—as determined by midterm scores, current GPAs, SAT scores, or other class grades received prior to the DOT—start the test first. Once that lowest-mastery group has completed the first section, they move on to the next—with no option to return to previous sections—while the next group up in mastery starts the first section. And so on.
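As a rough illustration of that staggered timing, here is a minimal sketch of how a DOT-style schedule could be generated, assuming students have already been ranked by prior mastery. The cohort count, section length, and function names below are illustrative assumptions, not the RPI team’s implementation.

```python
# A minimal sketch of DOT-style staggered scheduling, assuming students are
# already ranked by prior mastery (midterm scores, GPA, etc.). Cohort count,
# section length, and names are illustrative, not the RPI implementation.
from datetime import datetime, timedelta

def build_dot_schedule(students_by_mastery, n_cohorts, n_sections,
                       section_minutes, exam_start):
    """Return {student: [start time of each section]}, lowest mastery first."""
    # Split the mastery-ordered roster into cohorts.
    cohort_size = -(-len(students_by_mastery) // n_cohorts)  # ceiling division
    cohorts = [students_by_mastery[i:i + cohort_size]
               for i in range(0, len(students_by_mastery), cohort_size)]

    section_len = timedelta(minutes=section_minutes)
    schedule = {}
    for cohort_index, cohort in enumerate(cohorts):
        # Each cohort starts section 1 one section-length after the previous
        # (lower-mastery) cohort, so no two cohorts ever work on the same
        # section at the same time.
        first_start = exam_start + cohort_index * section_len
        for student in cohort:
            schedule[student] = [first_start + s * section_len
                                 for s in range(n_sections)]
    return schedule

# Example mirroring the pilot's shape: two cohorts, two sections.
# (Section length is assumed; the report doesn't specify it.)
roster = ["dana", "ben", "ava", "chris"]  # ordered lowest to highest mastery
times = build_dot_schedule(roster, n_cohorts=2, n_sections=2,
                           section_minutes=40,
                           exam_start=datetime(2021, 5, 10, 9, 0))
```

Note that the schedule alone doesn’t enforce the no-returning-to-closed-sections rule; in practice, the testing platform would also need to lock each section once its window ends.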
Without live proctoring, the main cheating concerns are internet searches for answers and collusion with others. Previous research into online testing found that nearly 80 percent of cheating events occurred via collusion, 42 percent via copying from the internet, and 21 percent fell into both categories. Statistical evidence suggested that collusion would be strongly suppressed by the DOT model’s staggered start. With no ability to return to closed sections, students wishing to collude would have to do so in real time. But the higher-mastery students—from whom help would most likely be solicited—would not be working on the same set of questions. Internet copying, meanwhile, could be addressed through question construction and a slightly larger question pool.
The main benefit promised by the DOT method was simplicity. No additional equipment required, no random question generators needed, and no violations of student privacy. While more test questions were required in the optimal DOT method—so that the test sections received by each cohort would not be exactly the same in content—the RPI team determined that a maximum pool just 1.5 times larger than the number of total test items would do the trick, especially if questions were mainly of a type that “require intellectual efforts [rather] than factual recalls.”
The RPI team honed their model and then tested it as a fully-remote, non-proctored final exam in a class where the midterm exam had been given fully in person earlier in the semester. Seventy-eight students took both exams, each of which consisted of forty graded items. All were multiple-choice questions. The DOT final was broken into two sections of twenty questions each. There were two mastery cohorts for the DOT final, and thus two starting times. The results of the midterm served as a control to which the DOT exam results were compared. Both exams produced the typical bell-shaped curve of a normal distribution of scores, and analysis showed that both distributions had the same mean value, demonstrating that the two very different testing methods evaluated the same population consistently. Their analysis also found random patterns of incorrect answer matches between any given pair of students—a traditional means of testing for evidence of cheating—and an approximately equal distribution of correct answers between the two test sections. This latter test was DOT-specific and attempted to account for the fact that more collusion was to be expected in one half of the exam versus the other. Given these findings, the RPI team determined that the DOT reduced the possible point gain due to collusion to less than 0.09 percent. Post-exam surveys indicated that students generally approved of the DOT structure, found the question difficulty reasonable, and considered the format easy to use.
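As a rough illustration of the wrong-answer-match check mentioned above, the sketch below counts, for every pair of students, how often they chose the same incorrect option. The data layout and flagging threshold are assumptions for illustration; the RPI team’s actual statistical analysis is more sophisticated.

```python
# Rough sketch of a pairwise wrong-answer-match check of the kind described
# above. The answer format, answer key, and flagging threshold are assumed;
# the RPI team's actual statistical test is more involved.
from itertools import combinations

def matching_wrong_answers(answers_a, answers_b, key):
    """Count items on which two students chose the same wrong option."""
    return sum(1 for a, b, k in zip(answers_a, answers_b, key)
               if a == b and a != k)

def flag_suspicious_pairs(all_answers, key, threshold=8):
    """Return student pairs whose shared-wrong-answer count exceeds threshold.

    all_answers: {student_id: list of chosen options, one per item}.
    A random pattern of matches (low counts across all pairs) is what the
    DOT analysis found, i.e., no evidence of collusion.
    """
    flagged = []
    for (s1, a1), (s2, a2) in combinations(all_answers.items(), 2):
        shared = matching_wrong_answers(a1, a2, key)
        if shared > threshold:
            flagged.append((s1, s2, shared))
    return flagged
```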
The RPI team concluded that their DOT model not only met the criteria of an easy, cost-effective, non-proctored remote testing platform, but also that, when students knew they could not collaborate with others, they were more motivated to actually study the material to achieve correct answers themselves. These are all important aspects of good online exams, but it cannot be overlooked that the tested version of DOT was developed for college students, where the possibility of expulsion for cheating is a real concern, and that it included only multiple-choice questions and covered just two mastery cohorts. Whether this approach to online testing will work at scale in K–12 education is hard to know. But RPI’s simple strategy to stagger testing times might just be a way to lessen the potential for cheating.
SOURCE: Mengzhou Li et al., “Optimized collusion prevention for online exams during social distancing,” npj Science of Learning (March 2021).
- Bougie Wyoming City Schools in suburban Cincinnati hit the big time with this lengthy feature in the Wall Street Journal. Its stated intent is to show a savvy supe who has been able to navigate what might politely be termed people’s personal ish and to successfully run a district that has been mostly open for full-time instruction since August. Sadly, there’s way too much focus on the ish for my stomach to take. Although I did like the photo of the high schooler with her own portable partition. That sounds like an interesting innovation. (Wall Street Journal, 3/14/21)
- Meanwhile, one less-than-bougie district and two decidedly un-bougie local charter schools in the Canton area are just now returning students to their classrooms after a year of fully-remote learning. (Canton Repository, 3/15/21)
- Metro Toledo’s public transit agency, TARTA, is ending its contract with Toledo City Schools to transport local students to charter, private, and district schools. They cite ~~pending legislation~~ “regulatory and compliance issues” for the change. The district now has until August (maybe not that long if certain legislation passes before then) to figure out an alternative. The plan under discussion at the moment involves transporting “mostly the private and charter school kids” via a third-party private entity (which may be based out-of-state) called Trinity Transportation, while the district will transport “mostly” its own kids and also add “bus driver” to its list of high school career pathways. I can’t think of a single thing that could go wrong here, can you? (13 ABC News, Toledo, 3/12/21)
- Speaking of third party providers, here’s a look at one central Ohio school district’s proposed recovery/remediation plan currently being debated by its administration and elected school board. Anyone else feeling underwhelmed by this besides me? I was under the impression, having read all of the things over the past twelve months, that asynchronous remote learning (with a once-per-week Zoom office hour chaser) was the problem. Not the solution. (ThisWeek News, 3/11/21)
Did you know you can have every edition of Gadfly Bites sent directly to your Inbox? Subscribe by clicking here.
A long year
KIPP Columbus leader Hannah Powell is among the educator voices heard in this piece looking at all that has happened in the year since Ohio schools were ordered closed due to the growing pandemic. KIPP has only recently moved from a fully-remote to a hybrid education model, but more than half of school families opted to remain fully-remote.
Transportation fix?
As more Ohio school districts return students to in-class learning, especially in urban areas, old and intractable difficulties with transporting students attending schools of choice are resurfacing. The Fordham Institute’s Jessica Poiner recently took a look at transportation provisions in the state budget bill that could, if enacted, go a long way toward resolving these longstanding barriers to school choice.
Enrollment boost
Data from Clark and Champaign County public schools show that two charters bucked the trend of declining enrollment over the last year. The leader of one of those schools, Cliff Park High School in Springfield, said that most of what they have been doing to support and attract students is not new. “The staff has worked very hard to create a safe and welcoming environment that draws in students,” he told the Springfield News-Sun, “and causes them to talk to friends and family.”
The same work in a new way
Students at Toledo School for the Arts are still as creative and hard-working as ever, even though disruptions wrought by the pandemic have required them to find new ways to present their work. Here’s a great look at how they made the mantra that “the show must go on” a reality this year.
Positive pandemic pivot
Here is a great look at how Miami University’s teacher training program forged a partnership with Ohio Connections Academy last year to allow its seniors to do their student teaching in the fully-online school so they would not lose an hour of that precious training time with students. An inspiring model in a number of ways.
Fellowship
Learn to Earn Dayton’s Anytown fellowship program aims to help high school students find their voices, speak up, and advocate for change against systemic ills such as racism, sexism, and classism as they have experienced them. The program is being piloted at DECA High School and one other school this year.
- Today marks one year since Governor DeWine announced schools would close in response to the oncoming coronavirus pandemic and lots of media outlets are jumping on the “one year later” bandwagon with some doom, a whole lot of gloom, a shot at lessons learned, and a ton of enlightening anecdotes. Perhaps it is the soreness in my arm and some mild brain fog talking (you know what I mean), but I personally am not finding this “education has changed so much in a year” narrative all that convincing at the moment. (Columbus Dispatch, 3/12/21)
- I ask you: what really has changed since the first school closures were announced as “an extended spring break” last March? As the piece above indicates to me, very little is different: Some kids have access to a much better education than others, just like it was back in pre-rona times. In this one, a national piece with a Columbus anecdote, I was reminded that some kids have access to gifted services, and some don’t. Just like before. (AP, Via ABC News, 3/12/21)
- School choice is still seen as evil by many, many folks. Maybe more evil than before? (The 74, 3/10/21)
- Some folks are still squawking about Academic Distress Commissions like angry chickens. (Chronicle-Telegram, 3/12/21)
- And no one is more surprised than me that we’re still talking about the Toledo Weed Guy and his college scholarships for Scott High School grads…let alone talking about it with such glowing positivity. (Toledo Blade, 3/10/21)
- But it does seem that there are a couple of actual changes worth at least a mention. For one, that true Covid-era innovation (of the decidedly downmarket variety) the whipsaw—with all its surefire parent-angering qualities—looks to be staying with us for a while even after the land is aflow with ~~milk and honey~~ Merck and Janssen. (NBC4i, Columbus, 3/11/21)
- And while some version of the college enrollment crash was playing out long before anyone ever heard of SARS-CoV-2, a year’s worth of the pandemic polka sure seems to have helped make the case that the state should bail out higher ed right now. (ABC6 News, Columbus, 3/11/21)
Did you know you can have every edition of Gadfly Bites sent directly to your Inbox? Subscribe by clicking here.
The Fordham Institute has published a two-part piece by Checker Finn on giving “power to the people,” as well as a response from M. Karega Rausch, President and CEO of the National Association of Charter School Authorizers. In Checker’s essays, he discusses “today’s push for ‘community control’ on the part of many education reformers and philanthropists.” He expresses some misgivings about this approach. His second piece in particular focused on what lessons could be learned for K–12 reformers and donors from two past community “empowerment ventures”—the War on Poverty and its community action program, and New York City’s late-1960s move to decentralize the control of its public schools, particularly in Ocean Hill-Brownsville. I want to offer some further details and reflections on the War on Poverty’s community action program and raise some questions that need to be answered when considering today’s push by reformers and donors in community empowerment efforts.
A chief part of Lyndon Johnson’s War on Poverty was the Economic Opportunity Act of 1964. It created Community Action Agencies (CAA) with anti-poverty programs that required the “maximum feasible participation” of local residents. In 1969, Daniel Patrick Moynihan (later a U.S. Senator from New York) wrote a book on how these agencies worked, entitled Maximum Feasible Misunderstanding.
Moynihan didn’t think they were a total failure. But he pointed to their architects’ many differing views of what CAA should do, explaining that the law creating CAA incorporated all those views, leading to maximum feasible misunderstanding.
His caution is worth remembering as one attempts to implement community-led empowerment approaches to local change—another way to describe community action. In a 1966 article prior to the book, Moynihan explained the problem with CAA (emphasis added):
[T]he enterprise has been flawed from the beginning by radical differences of understanding within the federal government as to what the program is all about.... Washington gave...cities...an assignment they could not possibly carry out. [This] goes to the definition, meaning, and intent of the Program. What are they supposed to do? Are they to make trouble—or prevent trouble? Create small controversies in order to avoid large conflicts—or engender as much conflict as they can? Hire the poor, involve the poor, or be dominated by the poor? Improve race relations or enhance racial pride? What is it that Washington wanted?
He suggested there were at least four different definitions of community action at work, which I summarize in the following table.
| Model/approach | Guiding principles | Program purpose |
| --- | --- | --- |
| 1. Budget bureau | Efficiency and coordination | Programs should help individuals solve their problems. How, for example, do they relate to each other? Do they save resources? What’s their impact/ROI? |
| 2. Community organizing (e.g., the Alinsky model) | Conflict and disruption | Programs should help individuals acquire power. This occurs when communities mobilize and organize themselves to induce conflict with those who have power. |
| 3. Peace Corps | Provide services | Programs should help individuals develop their capacities and capabilities so they improve their lives. They need an outside assist to begin that work. |
| 4. Task force | Political effectiveness | Programs should help different individuals/constituencies—e.g., politicians, the “needy,” other stakeholders—solve their problems by making sure each gets “part of the pie.” |
Moynihan called the CAA approach “pluralism run riot” and suggested the major lesson to be learned:
If there is a lesson here, it is somewhat as follows: government [editorial comment: you can substitute the word philanthropy] intervention in social processes is risky, uncertain—and necessary. It requires enthusiasm, but also intellect, and above all it needs an appreciation of how difficult it is to change things and people. Persons responsible for such programs who do not insist on clarity and candor in the definition of objectives, and the means for obtaining them...do not much serve the public interest.
So serving the public interest requires clarity of purpose: What end is one aiming to achieve? Or in today’s parlance: What does success look like? And then about means to that end: What approach(es) is/are best suited to reaching success?
So the moral of the story for reformers and donors alike should be to remember Murphy’s Law and its several variations. According to Murphy (who is thought to be British mathematician Augustus De Morgan): Whatever can go wrong, will go wrong. From Peter Drucker: If one thing goes wrong, everything else will, and at the same time. And finally, Arthur Bloch wrote a little book called Murphy’s Law and Other Reasons Why Things Go Wrong. One of his maxims: If there is the possibility of several things going wrong, the option that will cause the most damage will be the one to go wrong.
Here’s how Checker put it:
The surest takeaway from this brief history lesson is the reminder that the law of unintended consequences has never been repealed. And nothing within the power of even the most ardent reformer or wealthiest donor can repeal it.
Amen.