# KIPP Mathematica Study and Bragging Rights

After snarling at di Carlo earlier, I can generally agree with his "on the one hand, on the other hand" analysis of the Mathematica KIPP study. But I have some quibbles:

> They show meaningfully large relative gains in all major subjects and on multiple assessments, as well as in other types of outcomes, such as student and parent satisfaction (as is often the case, longer-term outcomes remain an open question).

Understand, I’m talking here not as a researcher (which I’m not), but as a regular, reasonably well-informed person. Compared to our goals, the gains aren’t that large:

> For the full matching sample of 41 KIPP schools, the average impact three years after enrollment is 0.36 standard deviations in math, which is equivalent to moving the KIPP students in our sample from the 44th percentile to the 58th percentile (Figure IV.1). Another way of interpreting these impact estimates is to compare KIPP effect sizes to national norms regarding the amount of student academic growth that takes place during middle school (Bloom et al. 2008). Expressed this way, our impacts suggest that on average, KIPP middle schools produce approximately 11 months of extra learning growth in math after three years. For comparison, in study districts there is a gap of 0.90 standard deviations between the average math test scores of black students and white students; students eligible for reduced-price school meals have math scores that are an average of 0.77 standard deviations lower than other students.
>
> The average impact of KIPP after three years in reading (0.21 standard deviations) is somewhat smaller than that for math—equivalent to moving the KIPP students in our sample from the 46th to the 55th percentile. This is consistent with a variety of other studies that have found reading scores to be more difficult to move than math scores. In other words, the size of the math impact produced by KIPP schools after three years is equivalent to about 40 percent of the local black-white test score gap and 47 percent of the local achievement gap between higher and lower income students.

(from the study, emphasis mine).
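If you want to check the study's arithmetic yourself, the standard-deviation-to-percentile conversion is easy to reproduce, assuming normally distributed test scores. A quick sketch (the function name here is mine, not the study's):

```python
from statistics import NormalDist


def percentile_after_gain(start_pct: float, gain_sd: float) -> float:
    """New percentile after a gain of gain_sd standard deviations,
    assuming test scores are normally distributed."""
    nd = NormalDist()  # standard normal
    z = nd.inv_cdf(start_pct / 100.0)  # z-score at the starting percentile
    return nd.cdf(z + gain_sd) * 100.0


# Math: a 0.36 SD gain starting from the 44th percentile
print(round(percentile_after_gain(44, 0.36)))  # 58, as the study reports

# The math impact as a fraction of the local gaps the study cites
print(round(0.36 / 0.90 * 100))  # ~40% of the black-white gap
print(round(0.36 / 0.77 * 100))  # ~47% of the income gap
```

The reading numbers (0.21 SD from the 46th percentile) come out around the 54th–55th percentile the same way.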

See, that’s what passes for awesome, stop-the-presses news in educational miracle stories. Not “After three years, KIPP kids have closed the achievement gap.” We aren’t getting “In high school, KIPP students are getting average scores of 1800 on the SAT.” We aren’t getting “KIPP students are taking and passing 4–5 AP tests” or “KIPP kids are reading at high school level when they enter high school.”

Those would be “large gains.” What we get (and again, I agree this is better than anyone else has managed) is a slightly narrowed gap.

I am not dismissing these results. I’m just sayin’…..well, wait. I said it earlier:

> IF you take low ability kids (of any race or income) and IF you select for motivation in the parents, at least, and IF you remove the misbehaving or otherwise highly dysfunctional kids who don’t share their parents’ motivation, and IF you enforce strict behavioral indoctrination in middle class mores and IF you give them hundreds of hours more education a year and IF they are in middle school and IF they are simply being asked to catch up with the material that middle to high ability kids learned fairly effortlessly—that is, elementary reading and math skills…
>
> …then they will have slightly better test scores than similarly motivated low ability kids stuck in classes with the misbehavers and highly dysfunctional kids and fewer hours of seat time and less behavioral indoctrination into middle class mores, but their underlying abilities will still be weak and just as far behind their higher ability peers as they were before KIPP.

> We can and should discuss the possibility of unmeasured factors such as peer effects, but it seems unlikely that these factors would come close to explaining away the estimated impacts.

Um. What? I looked di Carlo up—no, never been a teacher. Okay. Because the KIPP improvements look exactly like what low ability kids could do if problem children weren’t allowed to daily obliterate classroom learning environments. The improvements don’t look like better teaching, better curriculum, or higher expectations. They aren’t miracles. They are the results of motivated, low ability kids with caring, committed parents working hard in a rigidly disciplined environment with few distractions.

And, as di Carlo notes, it’s the discipline and the longer school day, not the higher expectations or culture, that made the difference. That, too, sounds a lot like peer effects.

I guess some people think that KIPP’s approach works with the problem kids and turns them into hard workers? Yeah, I’m laughing at that idea.

> To over-generalize a bit, critics sometimes seem unwilling to acknowledge that KIPP’s results are real no matter how well-documented they might be, whereas some proponents are quick to use KIPP to proclaim a triumph for the charter movement, one that can justify the expansion of charter sectors nationwide.

I know of no KIPP critic who denies the results are real. They all make much the same point—a point that the Mathematica study observes as well:

> Students at three quarters of KIPP schools and parents at about half are required to make participation commitments before students enroll, as reported by principals. The “Choice and Commitment” pillar emphasizes that students and parents have a choice to enroll in a KIPP school and that everyone at the school (leaders, teachers, students and parents) make a commitment to do their part to achieve success. After the admissions lottery determines which students are to be offered admission (if applicable), one way KIPP schools implement this principle is by asking parents and students to sign commitment agreements during a home visit conducted by school staff. Almost half of KIPP principals (48 percent) report that their schools have such participation requirements for parents, and principals at more than three-quarters of schools (76 percent) report that students must sign a responsibilities agreement. Principals at KIPP schools in the matched comparison analysis are significantly more likely to report these participation requirements for parents (61 percent) than principals at all KIPP schools.

(emphasis mine)

To KIPP critics, any reluctance to concede that this commitment requirement is all, or at least most, of the ballgame is just asinine, and insulting to boot. A big difference between KIPP critics and KIPP supporters is the degree to which each group believes that public schools could make the same gains if they could restrict their population to parents and kids willing to accept home visits and sign commitment agreements. KIPP proponents think that principals with just 2.5 years’ experience, and brand-new teachers who can be fired whenever those principals say so, contribute to the gains. KIPP critics demur.

> It’s very difficult to put a number on this, but it’s safe to say that this model is not a good fit for a very large proportion of students, regardless of background.

Don’t pussyfoot, di Carlo, spell it out: it’s safe to say this model is not a good fit for white or Asian kids, regardless of background. In fact, no research I know of examines KIPP’s impact on white or Asian kids of any income level, because white or Asian parents are unlikely to ever find KIPP attractive.

It also isn’t a good fit for many low achieving black or Hispanic students, of course. But it’s worth remembering that, apart from Jay Mathews when he’s in super-booster mode, no one seriously argues that the KIPP model is acceptable for more than a fraction of a percent of white or Asian students.

Ultimately, though, di Carlo continues to skate past the larger issue. KIPP proponents claim that KIPP gets its results from superior teaching and management—and, not incidentally, they use those claims and the gains to attack public schools. KIPP’s critics argue that the results are due to skimming the kids with motivated parents, attrition as the discipline problems return to public schools, fewer special ed or ELL kids, and KIPP’s freedom to ignore constitutional requirements that public schools have to abide by. (I would add one more caveat: I am highly skeptical that the KIPP middle school kids are doing well in high school, or we’d hear of it. But that’s just me.)

On these points, the Mathematica study offers more support to the critics than the proponents.

#### 4 responses to “KIPP Mathematica Study and Bragging Rights”

• Anthony

Please correct me if I’m reading this wrong, but it looks like the KIPP students began at the 45th percentile in both math and reading, within their district (Figure ES.1). That sounds like middle-ability, no? Or are these really low-scoring districts? Do you have a sense of what those scores would mean in a national context?

I also don’t understand this: “The typical KIPP student scored at the 45th percentile within the district in reading and math prior to entering KIPP, an achievement level significantly lower than the average in their own elementary schools.” They’re taking mainly students who score about average for their districts, but very low for their schools? That makes it sound like entrants are coming from schools which are very high-scoring, relative to the districts they’re in. I would have guessed that the audience for KIPP would be kids who were in low-scoring schools, but were testing relatively well compared to their classmates.

• educationrealist

I can’t remember where I read a great takedown of the fact that KIPP is NOT dealing with the same kids. But yes, it looks as if KIPP schools are being set up in suburban districts, doesn’t it? It’d be nice to get clarity on that.

• Anthony

“In fact, no research I know of examines KIPP’s impact on white or Asian kids of any income level, because white or Asian parents are unlikely to ever find KIPP attractive.”

Why are white or Asian parents unlikely to find KIPP attractive?