Scores and Rumors of Scores

Michael Regnier

“It ain’t what people don’t know that hurts them. It’s what they know that ain’t so.” Whoever said that didn’t know the charter school debate.* Especially after the release of state test scores, we actually have both problems.

Consider the sticky issue of student enrollment patterns. In order to draw conclusions about a school’s impact on achievement, we’d like to have a clear picture of the students it serves. What are their needs when they arrive? How do they fare over time? Which students leave the school over time, and which, if any, arrive?

Bloggers such as Owen Davis have felt free to draw conclusions from the basic data available. But as the Charter Center’s white paper on school data points out, some of the key information we need for this debate is not collected and publicized in any systematic way.

For example, we know charter schools are more likely to de-classify students from special education (SpEd) and more likely to move students from English language learner (ELL) status into English proficiency, but we can’t tell to what extent a student subgroup is shrinking because students are leaving or because labels are coming off. We also know that many charter schools regularly retain students in grade, but again we can’t easily distinguish between grade retention and student attrition.**

What we don’t know, in other words, is a lot. We need better data. But ignoring the limitations of what we have, as Davis does, is no help.

On the other hand, there are some things charter school critics know that just ain’t so. Blogger Gary Rubinstein is convinced that charter schools dropped more than other public schools on the state Math and English Language Arts (ELA) tests after New York shifted to the Common Core State Standards.

Rubinstein’s test score analysis doesn’t actually include all charter schools – we have them listed here (.xls) – and by “scores” he means proficiency rates. As his own sympathetic commenters point out, though, this is inappropriate; when the proficiency line has been re-drawn, it’s more informative to compare changes in actual scores.
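
To see why, consider a minimal simulation with made-up numbers (the real 2013 tests were also re-scaled, which this sketch ignores for simplicity): even a tiny decline in actual scores can look like a collapse in proficiency rates once the cut score moves.

```python
# Minimal sketch with hypothetical numbers: a re-drawn proficiency line
# can swing proficiency rates wildly even when actual scores barely move.
import numpy as np

rng = np.random.default_rng(0)

scores_2012 = rng.normal(loc=675, scale=25, size=500)  # one school's scores
scores_2013 = scores_2012 - 2                          # tiny real decline

old_cut, new_cut = 650, 680  # proficiency line re-drawn upward

rate_2012 = (scores_2012 >= old_cut).mean()
rate_2013 = (scores_2013 >= new_cut).mean()

print(f"Mean score change: {scores_2013.mean() - scores_2012.mean():+.1f}")
print(f"Proficiency rates: {rate_2012:.0%} -> {rate_2013:.0%}")
# A 2-point dip in mean scores shows up as a drop of roughly 45 points in
# the proficiency rate, which is why rate comparisons mislead here.
```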

To see how much charter and district schools’ scores dropped, we looked at the year-to-year differences in mean scale scores for every charter and district public school in New York City that tested in both 2012 and 2013. Since scale scores aren’t built to be compared across grade levels, we made a separate comparison for each grade and subject: 12 comparison points in all (6 grade levels x 2 subjects).
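
For readers who want to reproduce the comparison, here is a rough sketch of the computation in pandas. The file name, column names, and school-type labels are hypothetical stand-ins (the Charter Center’s actual district figures came from CSD-level reports); the logic is simply a test-taker-weighted mean score per group, differenced across years.

```python
import pandas as pd

# Hypothetical input: one row per school/grade/subject/year, with columns
# school_id, school_type ('charter' or 'district'), grade, subject,
# year (2012 or 2013), mean_scale_score, n_tested
df = pd.read_csv("nyc_scale_scores.csv")  # placeholder file name

# Keep only schools with results in both years for a given grade/subject.
wide = df.pivot_table(
    index=["school_id", "school_type", "grade", "subject"],
    columns="year",
    values=["mean_scale_score", "n_tested"],
).dropna()

# Test-taker-weighted mean score in each year, then the year-to-year change.
def weighted_change(g):
    m12 = ((g[("mean_scale_score", 2012)] * g[("n_tested", 2012)]).sum()
           / g[("n_tested", 2012)].sum())
    m13 = ((g[("mean_scale_score", 2013)] * g[("n_tested", 2013)]).sum()
           / g[("n_tested", 2013)].sum())
    return round(m13 - m12)

changes = (wide.groupby(["school_type", "grade", "subject"])
               .apply(weighted_change)
               .unstack("school_type"))

# Negative values mean the charter drop was smaller than the district drop.
changes["difference"] = changes["district"] - changes["charter"]
print(changes)
```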

The comparison table is below. Of the 12 subject/grade combinations, charter school drops were smaller in six cases, larger in four cases, and equivalent to district school drops in two cases. It’s basically a wash, but if anything charter school scores dropped less with the change in standards.

Interestingly, Rubinstein only highlights his comparison for seventh-grade math. Was he just saving time? Or cherry-picking one example to make charter schools look bad? I don’t know. But the “fact” that charter school test scores dropped more… ain’t so.

| Grade | Subject | NYC Charter Schools, Change in Mean Scale Score, 2012 to 2013 | NYC District Schools, Change in Mean Scale Score, 2012 to 2013 | Difference (district minus charter; negative means smaller charter drop) |
|-------|---------|------:|------:|------:|
| 3 | ELA | -361 | -364 | -3 |
| 3 | Math | -381 | -387 | -6 |
| 4 | ELA | -374 | -374 | 0 |
| 4 | Math | -389 | -389 | 0 |
| 5 | ELA | -370 | -369 | 1 |
| 5 | Math | -387 | -388 | -1 |
| 6 | ELA | -369 | -366 | 3 |
| 6 | Math | -383 | -382 | 1 |
| 7 | ELA | -365 | -367 | -2 |
| 7 | Math | -382 | -379 | 3 |
| 8 | ELA | -359 | -361 | -2 |
| 8 | Math | -378 | -380 | -2 |

Due to rounding, the two year-to-year columns may not exactly match the charter-district difference column. District scale scores are from CSD-level reports, weighted by number of test takers.

*Fittingly, this quote seems to have come from humorist Josh Billings, but many people “know” it was Will Rogers or Mark Twain.

**For data sources and an extensive discussion of these issues, see our 2012 report on the State of the NYC Charter School Sector, starting on p. 24.