Value Added–None

So the NY Times breathlessly informs us of a new study linking good math and reading teachers in elementary and middle school to all sorts of improved outcomes in students' lives. Just look at the data:

Teachers who raise test scores in these subjects and grades improve each student's income by $750/year, make it 1% more likely that each one will attend college, and lower the girls' pregnancy rate by about 0.6%.

Pause.

Okay. Doesn’t that make it seem as if there isn’t all that much difference between teachers who raise scores and teachers who lower scores?

Color me unimpressed.

I found many things to quibble about. First, the bit about teenage pregnancies:

We first identify all women who claim a dependent when filing their taxes at any point before the end of the sample in tax year 2010. … We refer to this outcome as having a "teenage birth," but note that this outcome differs from a direct measure of teenage birth in three ways. First, it does not capture teenage births to individuals who never file a tax return before 2010. Second, the mother must herself claim the child as a dependent at some point during the sample years. If the child is claimed as a dependent by the grandmother for all years of our sample, we would never identify the child. (Cite from study, page 19)

In other words, they aren't checking for teenage pregnancies, but for teen mothers who managed to get it together enough to find a job and file tax returns. Yeah, that's a big group. Because teenage moms are no more or less likely than other students to become productive workers who file tax returns. We shouldn't wonder, perhaps, whether teen moms are disproportionately found in the 10% of the population that had no tax returns and thus weren't included in the study.

And then, the researchers seem to be in an awful hurry to fire teachers.

“The message is to fire people sooner rather than later,” Professor Friedman said.

Professor Chetty acknowledged, “Of course there are going to be mistakes — teachers who get fired who do not deserve to get fired.” But he said that using value-added scores would lead to fewer mistakes, not more.

But in fact, their study says nothing about what will happen if teachers who lower scores are fired. They, of all people, should know that.

This paragraph, from the study, just amuses me no end:

Consider a teacher whose true VA is 1 SD above the median who is contemplating leaving a school. With an annual discount rate of 5%, the parents of a classroom of average size should be willing to pool resources and pay this teacher approximately $130,000 ($4,600 per parent) in order to stay and teach their children during the next school year. In other words, families would earn an average annual rate of return of 5% if they invested $4,600 to give their child a teacher whose VA is 1 SD higher. Our empirical analysis of teacher entry and exit directly shows that retaining such a high-VA teacher would improve students’ achievement and long-term outcomes. Because the impacts of teachers are roughly proportional to income, high income households should be willing to pay more for better teachers. Parents with an annual household income of $100,000 should be willing to pay $8,000 per year for a teacher whose true VA is 1 SD higher.

Before I get to the amusing part (okay, it's not really funny in a haha way), consider that the data comes from a large urban school district. 71% of the students are black or Hispanic, 76% are low income. So even if we assume no other problems with the assumptions and gaps, these findings apply primarily to low income black and Hispanic students and, to a lesser extent, to middle and high income students who go to public school in a high-poverty urban district—not, I'm thinking, an extremely representative sample. We have no idea if we can apply these findings to suburban middle class kids, working class kids, or even poor white kids (who routinely outscore middle class blacks on many state tests, the SAT, and most NAEP report cards). How many high income families live in an urban school district that's 76% low income? How representative would they be of high income families in the suburbs? I'm not convinced the researchers can conclusively argue that their results hold for all populations.

Really, though, this whole line of thinking just makes me laugh. Do the researchers really think that high income families are thinking of their own kids when these studies hit the news? Or are they just using this absurd investment analogy to reinforce how strongly they feel about their data?
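For what it's worth, here's a quick back-of-the-envelope check of the arithmetic behind the investment analogy, using only the figures in the quoted paragraph. The class size of roughly 28 is just what falls out of dividing $130,000 by $4,600, and treating the earnings gain as a level perpetuity is my simplification for illustration, not the study's actual earnings model:

    # Back-of-the-envelope check of the quoted "investment" figures.
    # Assumptions: class size is implied by the quoted totals; the 5% "rate of
    # return" is read as a level perpetuity, purely for illustration.
    per_parent_payment = 4_600    # dollars per parent, from the quoted paragraph
    classroom_total = 130_000     # dollars per classroom, from the quoted paragraph
    discount_rate = 0.05          # 5% annual rate, from the quoted paragraph

    implied_class_size = classroom_total / per_parent_payment
    print(f"Implied class size: {implied_class_size:.1f}")        # ~28.3 students

    # At a 5% return, a $4,600 payment is equivalent to an earnings gain of
    # about $230 per year per student, if that gain lasted forever.
    equivalent_annual_gain = per_parent_payment * discount_rate
    print(f"Equivalent annual earnings gain: ${equivalent_annual_gain:.0f}")  # $230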

In "Achievement Gap Mania," the best education article of 2011, Rick Hess writes:

First, achievement-gap mania has signaled to the vast majority of American parents that school reform isn’t about their kids. They are now expected to support efforts to close the achievement gap simply because it’s “the right thing to do,” regardless of the implications for their own children’s education. In fact, given that only about one household in five even contains school-age children — and given that two-thirds of families with children do not live in underserved urban neighborhoods, or do not send their kids to public schools, or otherwise do not stand to benefit from the gap-closing agenda — the result is a tiny potential constituency for achievement-gap reform, made up of perhaps 6% or 7% of American households.

Because middle-class parents and suburbanites have no personal stake in the gap-closing enterprise, reforms are tolerated rather than embraced. The most recent annual Gallup poll on attitudes toward schooling reported that just 20% of respondents said “improving the nation’s lowest-performing schools” was the most important of the nation’s education challenges. Indeed, while just 18% of the public gave American schools overall an A or a B, a sizable majority thought their own elementary and middle schools deserved those high grades. The implication is that most Americans, even those with school-age children, currently see education reform as time and money spent on other people’s children.
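Hess's "perhaps 6% or 7%" figure is just the product of the two fractions he cites. A quick check, treating his "about one in five" and "two-thirds" as exact numbers (which they aren't):

    # Rough check of Hess's "6% or 7%" constituency estimate.
    households_with_school_age_kids = 1 / 5   # "about one household in five"
    share_standing_to_benefit = 1 / 3         # the third not excluded by his list
    print(households_with_school_age_kids * share_standing_to_benefit)  # ~0.067, i.e. 6-7%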

Note to researchers: Most people reading your report are only thinking about how taxpayers can get improved results for the vast quantities of money we spend on low income students—or, at least, spend less money for the same results. They are not thinking about how this research affects their own kids because they know full well that it doesn’t.

Incidentally, I am pro-testing. But all the value-added testing research I've seen has been utterly pointless; it ignores the reality of teaching low-ability kids and testing them with assessments far beyond their ability level.

And ultimately, I’m not convinced that the difference between most teachers matters all that much. On this point, at least, the new study confirms my beliefs.
