Never trust an education success story

Education success stories are arrant garbage.

Jay Mathews prides himself on using data in his education stories. The only problem is, he usually doesn’t understand the data, or how to evaluate it critically. Not that Jay is any different from any other reporter; in fact, he’s better than most–which, given that he’s the most influential education reporter in the country, is no small thing. (Disclosure: I know Jay slightly, and when he’s reporting a story, as opposed to analyzing data, he is very good at getting the details right. He’s also an amazingly nice guy.) So when Jay, or any other reporter, starts praising someone for closing the achievement gap, start with the premise that it’s complete crap.

Jay’s laudatory blog item on Robert G. Smith, head of Arlington County Schools, “Stunningly reasonable achievement gap approach,” is a case in point.

As is often the case, Jay’s focus is puzzling. He’s not praising Smith for having dramatically closed the achievement gap in his district, even though Jay clearly thinks Smith has, in fact, closed it. (He hasn’t, but more on that in a minute.) No, Jay is pleased with Smith for being “reasonable” about his insistence on closing the achievement gap, for wanting only improvement toward a goal instead of the 100% proficiency the feds demand.

Smith closed the achievement gap, but all Jay notices is how reasonable he is?

But of course, Smith didn’t really close the gap, or at least any gap worth caring about.

From 1998 to 2009, the portion of black students passing Virginia Standards of Learning tests in Arlington rose from 37 to 77 percent. For Hispanic students, the jump was from 47 to 84 percent. The gap between non-Hispanic white and black passing rates dropped from 45 percentage points to 19. Between Hispanics and non-Hispanic whites, the gap shrank from 35 points to 12.

Ah, passing rates. How many students passed a particular bar?

So if you give students a one-question test: 2 + 2 = ______, and everyone answers 4, then everyone’s passing rate is 100%. You’ve closed the achievement gap! Huzzah!

Or suppose that one year, on some arbitrary test, whites had a mean score of 85 points with a 95% passing rate, while blacks had a mean score of 63 with a 36% passing rate. The next year, whites had a mean score of 94 points, still with a 95% passing rate, while blacks had a mean score of 62, with a 58% passing rate. The passing-rate gap closed. The average-score gap got larger. Now, this is still big news if “passing” means, for example, clearing the high school graduation hurdle. But on a state test, you have to look closer.
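
Purely to put that arithmetic in one place, here’s a minimal sketch in Python. The score lists and the cut score of 60 are invented–not real Arlington or Virginia numbers, and not the exact percentages above–but they show the same effect: the passing-rate gap shrinks while the mean-score gap grows.

```python
# A minimal sketch with invented numbers: pass rates vs. mean scores.
# CUT is an arbitrary hypothetical passing threshold, not Virginia's.

CUT = 60

def summarize(scores):
    mean = sum(scores) / len(scores)
    pass_rate = sum(s >= CUT for s in scores) / len(scores)
    return mean, pass_rate

# Year 1: big gaps in both means and pass rates.
group_a_y1 = [75, 82, 85, 88, 95]   # mean 85, all pass
group_b_y1 = [45, 52, 58, 70, 90]   # mean 63, 2 of 5 pass

# Year 2: group B's near-misses inch just over the cut score,
# while group A pulls further ahead on raw points.
group_a_y2 = [88, 92, 94, 96, 100]  # mean 94, all pass
group_b_y2 = [38, 61, 62, 64, 85]   # mean 62, 4 of 5 pass

for label, a, b in [("Year 1", group_a_y1, group_b_y1),
                    ("Year 2", group_a_y2, group_b_y2)]:
    mean_a, pass_a = summarize(a)
    mean_b, pass_b = summarize(b)
    print(f"{label}: pass-rate gap {pass_a - pass_b:.0%}, "
          f"mean-score gap {mean_a - mean_b:.1f} points")

# Year 1: pass-rate gap 60%, mean-score gap 22.0 points
# Year 2: pass-rate gap 20%, mean-score gap 32.0 points
```

The pass-rate gap drops from 60 points to 20 while the mean-score gap widens from 22 to 32–exactly the pattern a pass-rate headline hides.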

So what does “passing” mean in the Virginia Standards of Learning tests? Arlington County’s Report Card shows that students can be rated “Advanced”, “Proficient”, or “Fail”. And no mention of average scores.

A state test that only sorts passing students into Advanced or Proficient? A state test that declares 77% of blacks, 84% of Hispanics and 96% of whites in Arlington County proficient–and they’re only a little bit ahead of the state average? Huzzah for Virginia! Why is everyone fussed about Finland when Virginia stands as such a shining example?

Or, we could just say the cut scores are a tad low. Let’s do that, instead.
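
And here’s the flip side, again with invented scores and invented cut points rather than Virginia’s actual Standards of Learning thresholds: the very same distribution of scores looks like failure or triumph depending entirely on where the bar is set.

```python
# Minimal sketch: one invented score distribution, three hypothetical cut scores.
scores = [35, 42, 48, 51, 55, 58, 62, 66, 71, 79, 84, 90]

for cut in (40, 55, 70):
    rate = sum(s >= cut for s in scores) / len(scores)
    print(f"cut score {cut}: {rate:.0%} 'proficient'")

# cut score 40: 92% 'proficient'
# cut score 55: 67% 'proficient'
# cut score 70: 33% 'proficient'
```

Nothing about the students changes between those three lines; only the bar moves.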

I’m not a fan of NAEP’s tests; they place far too much emphasis on writing, and frankly, I don’t trust their method of sampling. How hard is it to test everyone? A lot easier if they weren’t so determined to make the students write to excess. But the NAEP is extremely useful as a benchmark, because it ranks the states by an absolute standard. Here are the rankings of state proficiency standards against the NAEP (I’m going to try WordPress’s slideshow here, see how it works):


Source: Mapping State Proficiency Standards Onto the NAEP Scales: Variation and Change in State Standards for Reading and Mathematics, 2005–2009

Virginia’s “proficient” standard is below basic on NAEP’s scale–and lower than that of all but a few states. In the cellar.

So Smith’s big achievement is increasing the pass rate on a state test with a very low cut score–and we have no idea whether the actual average-score gap widened or narrowed.

That’s pretty much what to expect in your usual education miracle story. They’re all lies.

Now, I’m not the first, second, or thousandth person to have pointed this out–in fact, most bloggers get far more obsessive about the data than I do, building Excel spreadsheets and whizbang images. The NAEP benchmark report is old news–and of course, reporters faithfully reported that state proficiency standards were extremely low. And then they all went right back to repeating the latest myth.

That’s the real question. Why are reporters, consultants, and politicians still spewing this swill, happily repeating useless fuzzy data when they know it’s a lie?

One response to “Never trust an education success story”

  • surfer

    I think the statistics stuff with restriction of range and the IQ arguments is not needed and maybe confusing to Jay. The “killer analysis” is when you show that the VA data changed in almost exactly the same way as Arlington’s. Somewhat related: it’s strange that he doesn’t describe what special program was used to get what he thought were differential results.
