I was surprised by how strongly Rick Kahlenberg attacked the new Ed Sector report on interdistrict public school choice, since the study really has nothing new to offer us other than some really neat maps. Instead of keeping things in proportion, we get Kahlenberg waxing poetic about the basic standards of "Social Science 101" and methodological nitty-gritty in an excessively long-winded diatribe. Of course, Dillon actually spends quite a bit of time explaining her choices (try page four, which Kahlenberg calls "the fine print in a sidebar" even though it's actually an entire page, or the Appendix). I've extracted his two main points from the overblown rhetoric: Dillon makes two inappropriate assumptions, which cause her findings to be unnecessarily pessimistic, and she's giving fodder to choice's opponents.
Kahlenberg doesn't like Dillon's first assumption, a 20-minute driving time as her outside radius for finding high-performing receiving schools, because it's too confining. I could spend time arguing with this, but I'll let Dillon explain it herself (from page four, naturally): "We chose a 20 minute driving distance to represent the time most students spend commuting to school; according to data from the 2001 National Household Travel Survey, the average commute to school is 18 minutes." Oh look, the data is from 2001, before gas prices skyrocketed; I'd say that Dillon may have overestimated how far parents are willing to drive at $4.50 a gallon. Districts all over the country are even cutting back on or eliminating bus routes because of fuel costs.
His second critique is even worse: he condemns Dillon's assumption that schools can increase their enrollment by 10%. His reasoning (as he explains to Education Week) is nothing but childish: I don't know how we should cap school enrollment, so Dillon and Ed Sector must not know either. On his blog, he elaborates with rhetorical gusto:
Social Science 101 suggests that when a variable (in this case, school capacity) is unknown, researchers don't simply assume the validity of an arbitrary figure. Instead, a careful social scientist would consider the known variable (empirical findings about the number of schools within a reasonable driving distance), forthrightly admit that the capacity variable is unknown, and then calculate the impact under various assumptions about increases in available space (e.g., 10% capacity, 20% capacity, 30% capacity, etc.).
Yes, and why don't we crunch the numbers at 11%, 14.7%, and 64.8998345% of capacity too? I would even call Dillon's estimate liberal, seeing as many schools have neither the capacity nor the willingness to expand. But according to Kahlenberg's "Social Science 101," we should throw out the window all studies that necessarily restrain themselves for the sake of cogent data analysis. The truth is that the only reason he takes issue with Dillon's methodology is that the study doesn't find what he wants it to find. At the end he whines that "it provides timid politicians with yet another excuse" to fight school choice programs. Dillon's study adds nothing to the choice debate besides some cool-looking maps; a long-winded, whiny post posing as a defense of statistical purity is not going to change that.