Oct 17, 2017
Sometimes an unintentional and innocent error helps us see a bigger problem. This year, due to a coding error, all of the students at Deal Middle School were classified as “economically disadvantaged.” Test scores are notoriously correlated with family income, with students from lower-income homes scoring below those from more affluent homes. (The goal of education, of course, is to change that—but, at their best, schools bring about such changes over time, not in a single year.)
So, when DC officials reported the test scores of the city’s disadvantaged students this summer and included all of Deal’s students in that category, the scores were, not surprisingly, higher than they would otherwise have been. Likewise, when the coding error was discovered and the scores corrected, the overall proportion of disadvantaged DC students who had reached proficiency dropped. The Washington Post reported this on October 5th, and the Office of the State Superintendent of Education (OSSE—DC’s state education agency) and DCPS have made corrections on their websites.
But here’s what’s most interesting to me. In a very large number of DC schools, all students, regardless of their income, are regularly and routinely classified as “economically disadvantaged” for the purposes of reporting test scores. In other words: what is a coding error at one school is the accepted, standard practice at other schools. That’s because if 40 percent or more of a school’s students are eligible for free or reduced-price lunch based on their family’s income, the entire school can opt to provide a free lunch to all students. This option, known as “community eligibility,” makes total sense; it lowers the administrative costs, bureaucracy, and stigma of running a system in which some kids in a school get a free or reduced-price lunch and others don’t. The problem is with the test score reporting: when all of these students are categorized as “economically disadvantaged,” reports about the scores become misleading. My SBOE colleagues, Joe Weedon and Jack Jacobson, explained this in this Greater Greater Washington post.
As I said to City Paper when asked about the coding error, the ways we report scores “are misleading in multiple ways. When students who are not disadvantaged get coded as disadvantaged, as appears to be the case in many schools, the scores of disadvantaged students will seem higher than they genuinely are.”
This is one important way in which the reporting of test score data ends up being misleading. For other ways, see this Washington Post op-ed that I wrote last year about the test score problems at Wilson High School.