An unhappy episode in Montgomery County, Maryland (where I live), reminds us that the quest for accountability options other than standardized assessments can open the door to new forms of chicanery.
First, a bit of background. Maryland’s high school graduation requirements include statewide end-of-course exams in science, algebra, English, and government. Students may substitute passing scores on AP or IB exams in several of those subjects, but it’s accurate to say that the Old Line State has stuck with mandatory EOCs even as a number of jurisdictions have backed away from them. Still, there are kids who have trouble passing those exams, and Maryland has long offered a work-around known as the Bridge Plan. Students who twice fail the EOC in a subject can undertake an individual project in that subject, and if they successfully complete it, their exam failure won’t preclude them from graduating. Ten or eleven percent of Maryland diplomas are typically achieved with the help of the Bridge Plan, and in some parts of the state it has become a major highway to the graduation stage: Almost a quarter of the diplomas in Prince George’s County and close to two-fifths of those in the city of Baltimore are Bridge-dependent. Which is to say, sizable fractions of the girls and boys in those jurisdictions would not be graduating from high school if they were actually required to pass the state EOCs.
The Bridge Plan has been controversial for years now and was much debated—and deplored—during my time on the State Board of Education and the Kirwan Commission, the obvious issue being whether kids who graduate with its help are, in effect, getting diplomas they don’t truly deserve because they haven’t actually met the state’s none-too-demanding academic standards in core subjects. Legislators are currently weighing the Kirwan recommendations, and if those get adopted in full, the Bridge Plan will eventually be history. Much fine-tuning will need to be done by state education officials, however, before new standards and metrics for “college and career readiness” are set for the long haul, and there is certain to be continued pressure to allow some sort of workaround.
Meanwhile, however, myriad incentives are at work to ensure that as many kids as possible get their diplomas, and Maryland’s twenty-four local jurisdictions currently have the Bridge Plan available to assist with that quest.
The state has taken steps to build rigor and reliability into the ways that localities deploy the Bridge Plan, but at day’s end it’s the educators in one’s school district—and usually in one’s own school—who evaluate a student’s project to determine whether he or she deserves credit. (The state is unable or unwilling to disclose data on how many of those who actually complete their projects “fail” them.) Because of pressure to get kids across the graduation stage—and to see that the high school’s graduation rate looks good for ESSA purposes—there is ample reason to suspect that the criteria and rubrics by which Bridge projects get evaluated are rather elastic.
The recent episode in Montgomery County, however, points to something worse than elasticity. There, a feisty, veteran, whistle-blowing high school social studies teacher named Brian Donlon reported that he saw students planning their Bridge projects in government with the help of a worksheet that someone had filled out for them in advance. He viewed it as cheating. “If a student was taking an actual test and [someone] gave the student some guidance toward the answer, I think this is exactly the same thing,” he told a Washington Post reporter. “It’s a highly improper level of assistance.” He reported his concern to district officials, but, seeing no evidence that they were doing anything about it, he contacted the state. State officials looked into the matter in what may have been a cursory way and reported back to Donlon that the district had adequately handled it. The district insists that only one pupil—an ELL student—was helped, and that the help given was within bounds. Donlon disagreed and wrote back to the state, saying this wasn’t an isolated instance but something more like a pattern. He also testified before the State Board, noting that the state’s own guidelines say that Bridge monitors “should not complete any portion of a student’s project,” yet declaring that he had seen exactly that done in his school.
How this particular episode gets resolved remains to be seen, but the issue it poses is self-evident, worrying, and, I think, unavoidable: Educators under pressure to get diplomas to kids and to boost their schools’ and districts’ graduation rates are obviously tempted to help in every way they can. But at some point “helping” crosses a blurry line and becomes “cheating.”
The temptation is obvious in exam-based accountability systems, where we can point to documented instances of school officials leaking test questions, changing test responses, and in other ways falsifying information. But exam-based systems have well-established protocols and security provisions that are nearly always implemented. Test booklets are kept under lock and key (or online test questions appear on the screen only when the correct test taker is sitting there). Students’ identities are validated. Cellphones and cheat sheets are barred from the testing room. Proctors hover. The completed test booklets are gathered up and locked away or sealed and shipped for scoring—or the online equivalent is similarly secured.
None of those measures is foolproof, and clever adults with powerful incentives to manipulate the results will continue to seek ways to do so. Yet the security protocols are well known and widely adhered to. (Moreover, those doing the scoring have ways of detecting cheating, and periodically one reads of scores being invalidated and exams retaken because something was fishy the first time around.)
When we seek alternatives to the proctored and monitored exam form of high-stakes accountability, however, the challenges multiply. Nearly always, those alternatives—whether classroom work, teacher-administered exams, student projects, performances, portfolios, you name it—are judged subjectively, almost always by adults who know the kids’ identities and academic track records, and most of the time by adults who also have reasons to seek student success, whether because they care about a kid passing and graduating, or they’re being hassled by parents or principal, or they know that the school’s passing or graduation rate is on the line. And how on earth can anyone actually monitor the criteria by which that alternative work gets judged or what kinds of assistance students are given while they do it? (It’s not just educators: Parents and paid outside tutors and counselors may also get involved in “helping.” Remember “Varsity Blues”?) Few intentionally cheat, but there’s no foolproof way to monitor where the lines get drawn, or by whom.
Sure, there are ways to mitigate the risk. Teachers from other schools can evaluate kids’ projects or portfolios. Student names can be “blinded” from evaluators. External reviewers can double-check the work or at least audit a sample of it. At day’s end, however, we’re still wrestling with the same challenges of incentives, familiarity, and “wanting to help.” Nobody likes to see kids struggle, flounder, or fail.
I, too, worry about over-testing and placing excessive pressure on test scores. I understand that, even under the best of circumstances, tests—including the very best kinds, such as those the AP and IB programs have developed—can yield only limited information about what students have learned and what they know and can do. I understand that there’s much more we want to know about kids and schools than test scores can tell us. But I also understand this: Accountability systems are meaningless (or worse) if they lack integrity, and when we rest them on subjective judgments by adults who face many pressures and incentives, we heighten the risk of mischief. A secure, well-proctored assessment may not yield an optimal accountability system, but at least the data will have integrity. Project-based alternatives such as Maryland’s Bridge Plan carry far greater risks of finagling.