Tag Archives: math wars

Reform Math: An Isolationist’s View

For my sins, I periodically peruse the Method Math teacher blogs. I call them Method Math teachers because, much like those self-important thespians in the Actors Studio can’t just act, these guys can’t just teach. Not for them the order of a structured curriculum; no, they want “meaningful math”. They don’t want their kids to do well unless it’s the right kind of doing well. Do they love math? Do they have the proper respect and curiosity for math? What’s the student’s motivation?

They are correctly described as reform math teachers. In math, “reform” refers to the “progressive” side of the debate, in which math is not so much a field of study as it is an ideological value system. Discovery and complex instruction are the guiding lights of their lesson planning. However, since they are teachers, and most teachers don’t really care about education policy in any coherent sense, many teachers who embrace the tenets may not be aware of the ideological underpinnings of their chosen Method. They Like or Don’t Like, without much sense of anything beyond their classroom. (And in that, they are like most teachers.)

Reform math is all about social justice, enabling blacks, Hispanics, and girls to “feel successful” about learning math. Actually being successful at learning math is a whole different thing; certainly these demographic categories are successful with their teachers, but when it comes to outside assessments, not so much—which is why reformers don’t much care for standardized tests. But in the classroom, constructivist and discovery-based lessons can accept multiple methods, which means no one method is wrong. And explaining! Explaining is vital. “Explaining your process” is the way that the “procedurally competent” kids (only in reform math is this a bad thing) can be flunked or at least marked down for not explaining their work, while other kids can find “other ways to be smart”. Convenient for grading, this value system allows teachers to dream up all sorts of ways for top kids to fail with the right answer, while tolerating all sorts of other ways for low ability kids to succeed with the wrong one.

(Ironically, these same people who focus on the importance of explaining the “why” are always insisting that teachers reduce the literacy demand of word problems, for kids who can’t read. That’s because the explaining aspect is meant to assist white girls weak on math but strong on literacy, whereas literacy reduction is all about making the problem set up easier for blacks and Hispanics to interpret.)

Reform math practitioners enthuse about this “open-ended discourse”, which avoids calculations and algorithms and, you know, answers. At least definitively right answers. Which teachers don’t give. Teacher explanations = failure. Hence Dan Meyer, the Lee Strasberg of the math blogosphere, famous for his TED talk, has a blog that proudly bears the label “less helpful”.

Open-ended discourse requires curiosity and ability, which some might deem a feature, but the knowledgeable understand is a bug. Inquiry teaching deliberately eschews algorithms or process or anything resembling a structured approach (while allowing that “blind memorization” might occasionally be useful). Few reform teachers understand the underlying rationale for this method, which lies in the hope that open-ended problems will narrow the achievement gap—not by improving achievement of the lower half, but by narrowing all achievement into a much thinner band. Hence the importance of grading down the top students, and slowing (well, they call it deepening) instruction to be sure that no one is pulling too far ahead.

Ed schools ferociously pretend that all but a few racist fuddy-duddies teach using constructivist methods, but out in the real world, reform math is mostly fringe. The greatest penetration is at the suburban elementary school level, which has a teaching population disproportionately comprised of cheery young women who care more about their students’ interpersonal skills than intellectual development (a feature, not a bug). Complex instruction requires students to “share ideas and knowledge” and the strongest students are responsible for the weakest students’ learning, entrancing elementary school teachers with the delusion that math lessons can enhance social justice. Besides, elementary school teachers aren’t terribly strong at math to begin with, so a method that de-emphasizes algorithms reinforces their own preferences.

Since elementary school teachers rarely have the math chops to develop their own lessons, most reform curriculum development is found in middle school, where kids don’t stay long enough for the parents to complain and the teachers are knowledgeable—and teaching the subjects most attended to by reformers (pre-algebra, algebra, and geometry). So there’s a big support group and lots of material to build on.

Few high school math teachers embrace reform; those who are committed to the Method don’t have a long shelf-life. Most give it up after a few years; the rest show up at grad school, where they can pretend that constructivism and complex instruction are valid, proven methods. They get a PhD and demand conformity from prospective teachers in ed school, successfully selling their dogma to a few eager apostles. These converts, alas, ultimately abandon the method or return to grad school, where the cycle begins again. Thus Dan Meyer is no longer teaching math but getting his PhD at Stanford with Jo Boaler, Queen Mother of Reform Math. (Understand, however, that reformers do not practice what they preach in ed school. There aren’t multiple ways and many right answers when training new teachers. Heavens, no.)

To the extent reform math survives for any length of time, it does so in white, suburban elementary schools, although not without a struggle. Elementary teachers’ support is counterbalanced by well-educated parents who generally despise it. Parental protests have killed reform math programs at all levels for decades throughout the country. Districts have to balance happy teachers with howlingly angry parents. The high school battles ended over a decade ago, but elementary school parents have to deal with teachers who actually like the program.

But reform math wars are mostly a tale of suburban woes, as parents push back on well-meaning districts hoping to close the achievement gap of their bottom 10-20% by depressing their top performers. It stresses the parents out, but the kids will catch up. For all that reform math propagandists want to change the world for black and Hispanic kids, the techniques are abandoned quickly in high poverty, low ability schools, particularly at the high school level. The story goes like this: a complex instruction curriculum is introduced with great fanfare, math teachers complain, the complainers who can’t be fired are transferred, cue the fawning news coverage with much noise about “equity” and “access”, a few beaming parents who barely speak English talking about their children’s newfound love of math, clips of young black teens and Hispanic girls talking about how they like this math sooooo much better than “just being told what to do”…and then the dismal state test scores come rolling in, and all the canny zealots who once exhorted the grunts to be guides on the side are now publicly championing sage on the stage. Back comes explicit direct instruction and the cycle begins again.

The Jo Boaler brouhaha contains one such example, as James Milgram points out:

Indeed, a high official in the district where Railside is located called and updated me on the situation there in May, 2010. One of that person’s remarks is especially relevant. It was stated that as bad as [our work] indicated the situation was at Railside, the school district’s internal data actually showed it was even worse. Consequently, they had to step in and change the math curriculum at Railside to a more traditional approach. Changing the curriculum seems to have had some effect. This year (2012) there was a very large (27 point) increase in Railside’s API score and an even larger (28 point) increase for socioeconomically disadvantaged students, where the target had been 7 points in each case.

Railside High is San Lorenzo High School, in California. As Milgram says, its 2010 scores are dismal, while 2012 scores are improved. Not substantially—ain’t no getting around basic cognitive ability coupled with absurdly unrealistic expectations. But improved.

So I began this post to explain the tiny twitter tempest I began last week, and I’m not there yet. And to get there would take the post into specifics when thus far it’s been general. Sigh. For some reason, I’m writing very slowly this summer. But I didn’t have any clear description of reform math that I could link to in order to explain reform math as I see it, which is not quite as most critics see it.

If the Method teachers out there in the blogosphere do read this, they may confuse me for a traditionalist and, uh, no. My ed school is committed to complex instruction and inquiry-based learning, and I am very fond of my ed school. It’s not fond of me, of course, but then who is?

At that ed school, my all-discovery, all-inquiry, all-complex-instruction master teacher provided me with the best learning experience of my life, adapting effortlessly to my strengths and skepticisms to give me fantastic advice that I still hark back to today. I am, to put it mildly, Not Easy to Teach. That I got six months of valuable education counts for a lot. Thanks to that teacher’s willingness to focus on goals, not methods, I learned to do the same. I can find a lot of good in reform objectives, and steal interesting concepts in their lessons. I might think reform methods are awful but, like progressive educators in general, reformers are thinking about how to teach math, which as it happens is a subject much on my mind.

I am not and never will be a member of the Method group. I am Switzerland, or the US between world wars. Ignore the fact that my first year out I put my students in rows for three weeks until I couldn’t stand it anymore and put them in groups. Second year out I lasted 10 days. Third year out and beyond, I gave in to the inevitable and just put the kids in groups from the start. I use manipulatives, introduce units not with facts but with activities that illustrate facts, minimize my use of algorithms, and always remind students that I’ll take a good estimate in lieu of a calculation (and give most of the credit). Anyone evaluating my teaching practice would conclude I have much more in common with the Method crew than I have with traditionalists.

Still, they are profoundly wrong, and their nonsense grates on me much more than the many ways in which the traditionalists err. That’s what led to my tweet, which yes, I still haven’t explained. But I’m ready to start explaining now, so that’s a step up.

Edited later to add:

A couple of points. First, I very much welcome comments on this because it will help me determine whether I’m getting the right ideas across. I know I can be tough on commenters who misinterpret me, much as I try not to be, but I will really try not to be if you want to complain about something you’ve misinterpreted in this particular post.

Second, I paint in broad brushes. Keep that in mind!

I’m going to try hard to get the second part of this up faster than normal, for me. I’m hoping for a couple days, but if I fail, know that I tried.


Jo Boaler’s Railside Study: The Schools, Identified. (Kind of.)

A brief, illustrative Jo Boaler anecdote by Dan Meyer, currently one of her doctoral students:

I was talking to Jo Boaler last night (name drop!) and she admitted she didn’t really get the whole blogging thing.

I laughed. Some background:

Jo Boaler, a Stanford professor, conducted a longitudinal study of three schools that’s widely known as the Railside paper. She presented the results to a standing-room-only crowd at the 2008 annual meeting of the National Council of Teachers of Mathematics, convincing almost everyone that “Railside” High School, a Title I, predominantly Hispanic high school, outperformed two other majority-white, more affluent schools in math thanks to the faculty’s dedication to problem-based integrated math, group work, and heterogeneous classes.

“Reform” math advocates, progressives whose commitment to heterogeneous classes has almost entirely derailed the rigor of advanced math classes at all but the most homogenous schools, counted this paper as victory and validation.

Three “traditionalists” were highly skeptical of Boaler’s findings and decided to go digging into the details: James Milgram, math professor at Stanford University; Wayne Bishop of CSU LA; and Paul Clopton, a statistician. They evaluated Boaler’s tests, the primary means by which Boaler demonstrated Railside’s apparently superior performance, and found them seriously wanting. They identified the schools, compared the various metrics (SAT scores, remediation rates), and demonstrated how Railside’s weak performance called Boaler’s conclusions into question. Their resulting paper, “A close examination of Jo Boaler’s Railside Report”, was accepted for publication in Education Next—and then Boaler moved to England. At that point, they decided not to publish the paper. All three men were heavily involved in math education and didn’t want to burn too many bridges with educators, who often lionize Boaler. Milgram posted the paper instead on his Stanford ftp site. Google took care of the rest.

The skeptics’ paper has stuck to Boaler like toilet paper on a stiletto heel; she’s written a long complaint about the three men’s “abusive” determination to get more information from her. From an Inside Higher Ed report on her complaint:

[S]he said she was prompted to speak out after thinking about the fallout from an experience this year when Irish educational authorities brought her in to consult on math education. When she wrote an op-ed in The Irish Times, a commenter suggested that her ideas be treated with “great skepticism” because they had been challenged by prominent professors, including one at her own university. Again, the evidence offered was a link to the Stanford URL of the Milgram/Bishop essay.

“This guy Milgram has this on a webpage. He has it on a Stanford site. They have a campaign that everywhere I publish, somebody puts up a link to that saying ‘she makes up data,’ ” Boaler said. “They are stopping me from being able to do my job.”

Boaler is upset because ordinary, everyday people aren’t merely taking her assertions at face value, but are instead challenging her authority with a link to a paper that, in her view, they shouldn’t even be able to read. So you can see why I laughed. This is a woman with absolutely no idea how the web works. “It’s not even peer-reviewed!!!” That people might find the ideas convincing and well-documented, with or without peer review, isn’t an idea she’s really wrestled with yet.

Identifying the Schools
As I mentioned a while back, I had a strong reaction four years ago when reading an earlier work by Jo Boaler. A few months later, while still in ed school, I perused her Railside paper, which struck me as equally, er, not credible, a product of wishful deception, maybe? Or maybe just wishful thinking. I googled around to see if I was the only doubter and found the Milgram/Bishop/Clopton paper.

Railside High School

The article indicated that the three schools were identifiable. So I just googled algebra “bay area” boaler and in the first 2-3 pages I found this report on San Lorenzo High School:

San Lorenzo’s relationship with Stanford was based on their participation in a longitudinal study conducted by Professor Jo Boaler and her colleagues at the university. ….According to the CAPP liaison to the project, Weisberg, the researchers also found that SLHS math teachers rated high for their constructivist approach to teaching when compared to teachers at the other two high schools in their study.

Praised for their constructivist approach? In five minutes, I’d not only identified one of the schools. I’d identified the big Kahuna–Railside, the star of Boaler’s report, the school whose dedication to complex instruction, problem-based integrated math, and heterogeneous classes had propelled the Stanford professor to fame and glory. Bow to my greatness.

Happily, Boaler’s paper included CST scores for 2003, so I could match them up (as did MBC in their followup paper):

[Image: CST scores from Boaler’s paper]

I could easily confirm that San Lorenzo High School CST scores for freshmen match exactly to Railside’s:
[Image: San Lorenzo High School 2003 CST scores]

(You can confirm here; it’s in Alameda County. The Algebra column for freshmen only. See? 1% 15% 33% 36% 15%. 188 students.)

San Lorenzo is a California East Bay suburb, so I’m not sure why Boaler would describe Railside as “an urban school”. California has any number of high-poverty, Title I suburban schools.

One down, two to go. But the original MBC paper didn’t specify how the men identified the schools, and google gave too many possibilities for the other two study participants. Besides, I had other things to do, like find a teaching job, so I put away childish things.

Greendale High School

Then four years later, Jo Boaler complains and, in his response, James Milgram explains how they identified the schools:

We took the data above from Table 5, and one of us…checked the entire publicly available 2003 California STAR data-base, looking for schools for which any column was identical to one of the columns in Table 5. In each case we found that there was one and only one school that had that data.

Hey. I could do that. I had Access (the database), even. Which you need, because the CST file is too big for Excel.

Using this method, I identified the other two schools.
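The column-matching idea Milgram describes can be sketched in a few lines of Python. Everything here is hypothetical (the real 2003 STAR research files use different field names and layouts, and you would load the full file rather than an inline sample), but the matching logic is the same: treat each school’s students-tested count plus its five performance-band percentages as a signature, and look for exact matches.

```python
import csv
from io import StringIO

# Hypothetical layout: school code, students tested, then the five CST
# performance-band percentages. Only a sketch of the matching idea; the
# real STAR files are structured differently.
SAMPLE = """school_code,tested,far_below,below,basic,proficient,advanced
1234567,188,1,15,33,36,15
7654321,125,0,6,27,55,12
1111111,96,4,20,40,30,6
"""

def find_matches(csv_text, target):
    """Return school codes whose (tested, band percentages) equal target."""
    matches = []
    for row in csv.DictReader(StringIO(csv_text)):
        profile = tuple(int(row[k]) for k in
                        ("tested", "far_below", "below", "basic",
                         "proficient", "advanced"))
        if profile == target:
            matches.append(row["school_code"])
    return matches

# The Greendale search described below: 125 tested, then 0, 6, 27, 55, 12.
print(find_matches(SAMPLE, (125, 0, 6, 27, 55, 12)))  # ['7654321']
```

The point of the exercise is uniqueness: as Milgram notes, for each column in Table 5 there was one and only one school in the state with that exact data.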

I downloaded the 2003 data to a text file and imported it into Access. I know MySQL’s interface but had never used Access before. I feel sure there’s an easier way than the path I took, which was to treat poor Access like Excel: go to the TestResults table, highlight the “total students tested” column, and search for 125, looking to the right for 0, 6, 27, 55, 12. It sounded something like this:

ClicknoClicknoClicknoClicknoClicknoClicknoClicknoClicknoClickno ClicknoClicknoClicknoClickWAITcrap that was it!go back! What, there’s no Reverse?Christ?Where was it?crapcrapcrapscrollbackscrollbackClicknoClickn…yes! There it is!

What, you don’t see it? Click to enlarge:

[Image: Access table view showing Greendale’s 2003 row]

I found Greendale!! Whoohoo!

All I had to do was tab to the left a bit and look up the school’s identifying number. Then I went to the form in Access to look up the school and tada!

[Image: Greendale 2003 CST scores]

According to Jo Boaler, “Greendale High School is situated in a coastal community, with very little ethnic or cultural diversity (almost all students are white).”

Well, she’s half right. Greendale is definitely mostly white, but it’s in the mountains, not in the excessively wealthy mountains, in the much much much Greater Bay Area. Well, really, it’s juuuuust outside the much much much Greater Bay Area. Very pretty place. If you look at it on Google Maps, you would barely see blue, way off to the left.

It is not coastal.

Hilltop High School
Back to Access and clicknoclicknoclickgobackack!clicknoclickstop!tableft and there! I have Hilltop.
[Image: Access table view showing Hilltop’s 2003 row]

Here are the CST scores to match:
[Image: Hilltop 2003 CST scores]

Boaler on Hilltop: “Hilltop High School is situated in a more rural setting, and approximately half of the students are Latino and half white.”

Demographics, right. Location, wrong. Boaler has just described Greendale’s location, not Hilltop’s. Find Hilltop’s town on a map and the blue is just to the left. One would describe Hilltop as “coastal”.

So Boaler flipped the school descriptions, but not the demographics? Was that on purpose, or an error?

I feel pretty confident, therefore, that in Boaler’s report:

  • Railside High School is San Lorenzo High School, in San Lorenzo. Title I school, mostly Hispanic.
  • Greendale High School is located in one of the mountain chains surrounding the Bay Area. Rural community, economically diverse, mostly white.
  • Hilltop High School is in a coastal community just outside the Bay Area, half Hispanic, half white. Greendale and Hilltop are not neighbors, but much closer to each other than either is to the edges of the Bay Area, much less San Lorenzo.

I originally planned to reveal the names of all three schools. I used publicly available data and Boaler’s own study to identify them. The schools have nothing to be embarrassed about. They participated in a study to help further knowledge about effective math instruction. How is that a bad thing? Their scores are already available on a government website. Boaler isn’t directly critical of any school. No downside is immediately apparent, at least to me.

But still. In San Lorenzo High School’s case, their participation is easily searchable, so I identified the school. But the other two schools take quite a bit of work to find in Google, and the principals might not want to wake up and find their schools in a blog, even if the news wasn’t bad. This way, they can have some warning—again, with the understanding that this is publicly available data. Using Access is the cleanest way to find them, but at the end of this post I will give some other info to help interested people identify them.

So What Does This Mean?

Well, let’s assume that I didn’t miss schools with identical CST scores (I checked every entry, but who knows, I might have clicked too fast) and that these are, in fact, the schools in the study.
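That assumption (no two schools sharing an identical score profile) is itself checkable, and far less tedious in code than by clicking. A hypothetical sketch, using made-up school codes (the first two profiles are the ones quoted above for San Lorenzo and Greendale):

```python
from collections import Counter

# Hypothetical school-code -> score-profile map; a real check would build
# one profile per school from the full 2003 STAR file.
profiles = {
    "1234567": (188, 1, 15, 33, 36, 15),   # San Lorenzo-style profile
    "7654321": (125, 0, 6, 27, 55, 12),    # Greendale-style profile
    "1111111": (96, 4, 20, 40, 30, 6),
}

# Count how many schools share each profile; any count above 1 means
# the profile is ambiguous and can't identify a school on its own.
counts = Counter(profiles.values())
duplicated = [profile for profile, n in counts.items() if n > 1]
print(duplicated)  # [] means every profile maps to exactly one school
```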

With just a bit of effort, interested parties can now review the Milgram/Bishop/Clopton report and confirm its claims about the overall math performance of the three schools. I’ve spot checked a lot of it, and I haven’t found any errors yet.

I’m not terribly detail-oriented, yet I saw two huge issues.

First, the 2003 CST data I matched up? Boaler provides this data as an external validator, showing how well the Railside kids did compared to the other two groups, thanks to the superior instruction of reform math. As is evident from the screen prints of the actual CST data, Boaler is using freshman 2003 data. But in Table 6, reproduced here:

[Image: Table 6, Year 3 data from Boaler’s paper]

Boaler provides Year 3 data and clearly indicates that the students are juniors in 2003. The freshman algebra scores are not from her cohort. So why is she using this data as evidence of how great the program was? Shouldn’t she be using Algebra II data?

I went back two years to see what algebra scores were like, and discovered San Lorenzo High School (Railside) had fewer than ten freshmen taking algebra—in fact, the school had no math subject-specific scores at all. The other two schools did have freshman algebra classes. So what, exactly, was Boaler comparing?

Milgram et al. cover all of this in greater detail, and they also cover the other big red neon warning I see: if San Lorenzo High, which didn’t track, put all of its freshmen in algebra, while Greendale and Hilltop put their mid- to lower-ability students in Algebra while the top freshmen took Geometry and Algebra II, then Boaler should not assert that San Lorenzo High is outperforming the other two schools based on freshman Algebra scores.

Of course, since she’s using the scores from the wrong cohort, she didn’t really demonstrate that the studied cohort from San Lorenzo HS outperformed the other two schools in the CST to begin with.

Why bother?

Like most mathematicians, MBC are vehemently opposed to reform math. Both Milgram and Bishop spend a lot of time working with parents or districts that are trying to get rid of reform curricula. In his rebuttal, Professor Milgram says,

Indeed, a high ranking official from the U.S. Department of Education asked me to evaluate the claims of [the Railside study] in early 2005 because she was concerned that if those claims were correct U.S. ED should begin to reconsider much if not all of what they were doing in mathematics education. This was the original reason we initiated the study, not some need to persecute Jo Boaler as she claims.

However, given both men’s determination to oppose reform math, and their willingness to work with parent groups organizing against reform math, Boaler believes, as Milgram says, that the paper was an attempt to discredit reform math, as opposed to an honest academic inquiry.

I have no opinion on that, but then I spend a lot of time on the Internet. MBC all seem pretty mild to me.

I’m not a traditionalist. I’ve written many times in this blog that for a pro-tracking, pro-testing, discovery-averse teacher, I am stupendously squishy. Milgram, Bishop, Clopton, and Professor Wu would undoubtedly disapprove of my teaching methods. My kids sit in groups, I use a lot of manipulatives, I don’t lecture much or give notes, and I use lots of graphic organizers. To the extent I have a specialty, it lies in coddling low ability, low incentive kids through math classes whilst convincing them to learn something, and what they learn isn’t even close to the rigorous topics that real mathematicians want to see in math class. (Some lesson examples: real life coordinate geometry, modeling linear equations, triangle discovery, factoring trinomials, teaching trig and right triangles.) Nonetheless, I firmly believe that discovery, problem-based math, and complex instruction are ineffective with low to mid ability kids and think tracking or ability grouping is essential. So I’m not really tied to either camp in the math wars.

Besides, the math wars have largely been resolved. Lectures won’t work for low ability kids, but neither does discovery. High ability kids need fewer lectures, fewer algorithms, more open-ended problems, more challenges. Traditionalists have a lot of energy around reform math, but I think they could dial it back. For the most part, reform has lost in schools, particularly high schools.

Since Boaler will, if she acknowledges this post at all, complain about my motives, let me say that I am not a Boaler fan, but my disapproval is based purely on her opinions as revealed through her work: the Amber Hill/Phoenix Park paper, the Railside paper, and yeah, her recent bleat struck me as a big ol’ self-pity fest. But I’m not actively seeking to hurt her reputation, and while my tone is (cough) skeptical, I’m perfectly happy to learn that all of these questions I raise involve perfectly normal research decisions for academics.

However, I am constantly surprised at the unquestioning acceptance of educational research, particularly quantitative research.

Remember, this is a hugely significant paper in the math wars. Boaler is the hero who went out and “proved” that reform math gets better results. Suppose it’s academically acceptable for Boaler to assert that San Lorenzo High School algebra students outperformed the algebra students from two more affluent schools, based on the test results of students not in her study cohort. Would it nonetheless be important for education journalists to point out that the San Lorenzo students included the best students in the school, while the Greendale and Hilltop schools’ best students were in more advanced classes? And that a component of her success metric relied on scores of students who were two years behind her cohort?

To the extent I have an objective, there it is. Educational researchers may, in fact, engage in entirely acceptable behavior that nonetheless hides information highly relevant to the non-academic trying to use the research to figure out educational best practices.

Who’s responsible for bringing that information to light?

****************************************************************************

Identifying the schools

Ironically, when I was originally searching for the schools four years ago, I came across a link that identified Greendale. I just didn’t realize it for reasons that will be clearer once you find the link. Since MBC discuss the Greendale parents’ demand for a “traditional” program, and the school’s reluctant compliance, I tried to use that history to figure it out, googling (exactly): “interactive mathematics program” california high schools traditional. In the first couple pages, I found a link written by one of the MBC authors that references that parental demand as well. There are several schools mentioned in the paper, but only one of them is rural.

I’d also found a link with the Hilltop school in my initial search but had dismissed it, thinking the schools would all be in the Bay Area. But since MBC mentions that the school district forced Hilltop to cancel, I’d googled “interactive mathematics program” california district canceling. That will bring up, in the first two or three pages, a blog post from a once fairly well-known education specialty blogger (since gone inactive) on the school. This battle went on for some time, and the New York Times covered it earlier, but I won’t give the query for that.

A couple other clues: Many of Jo Boaler’s doctoral students posted in support of her complaint. An early supporter, who has a well-regarded math blog, taught at Greendale High School, although after the years of Jo Boaler’s study. That is probably not a coincidence. Jo Boaler thanks teachers in the paragraph in which she also mentions the schools that participated in her study. Maybe check out those teachers and see where they teach (or taught).


Boaler’s Bias (or BS)

I began this piece a week ago intending to opine on the Boaler letter. However, I realized I have to confess a strong bias: I read Boaler in ed school and nearly vomited all over my reader. And that will take a whole post.

Experiencing School Mathematics: Traditional and Reform Approaches to Teaching and Their Impact on Student Learning

Boaler, a Brit who has held math education academic positions in England as well as at Stanford, performed a three-year study of two English schools, matched up in demographics and test scores. Phoenix Park believed in progressive, student-centered instruction, whereas Amber Hill taught a traditionalist method—more than traditionalist, they taught math by rote and drill, which is by no means required for teacher-centered instruction.

Boaler was ostensibly investigating the two instruction methods, but the fix was clearly in. Despite Boaler’s constant assurances that the Amber Hill teachers were dedicated and caring, the school presents as an Orwellian fantasy:

One of the first things I noticed when I began my research was the apparent respectability of the school. Walking into the reception area on my arrival, I was struck by the tranquility of the arena. The reception was separated from the rest of the school by a set of heavy double doors. The floors were carpeted in a somber gray; a number of easy chairs had been placed by the secretary’s window and a small tray of flowers sat above them. …Amber Hill was unusually orderly and controlled. Students generally did as they were told, their behavior governed by numerous enforced rules and a general school ethos that induced obedience and conformity. All students were required to wear a school uniform, which the vast majority of students wore exactly as the regulations required. The annual school report that teachers sent home to parents required the teachers to give the students a grade on their “co-operation” and their “wearing of school uniform.” The head clearly wanted to present the school as academic and respectable, and he was successful in this aim at least in terms of the general facade. Visitors walking around the corridors would see unusually quiet and calm classrooms, with students sitting in rows or small groups usually watching the board. When students were unhappy in lessons, they tended to withdraw instead of being disruptive. The corridors were mainly quiet, and at break times the students walked in an orderly fashion between lessons. The students’ lives at Amber Hill were, in many ways, structured, disciplined, and controlled

(page 13)

Phoenix Park, on the other hand:

…had an attractive campus feel. The atmosphere was unusually calm—described in a newspaper article on the school as peaceful. Students walked slowly around the school, and there was a noticeable absence of students running, screaming, or shouting. This was not because of school rules; it seemed to be a product of the school’s overall ambiance. I mentioned this to one of the mathematics teachers one day and she agreed, saying that she did not think she had ever heard anybody shout—teacher or student. She added that this was particularly evident at break times in the hall: “The students are all so orderly, but no-one ever tells them to be.”…. Students were taught all subjects in mixed-ability groups. Phoenix Park students did not wear school uniforms. Most students wore fashionable but inexpensive clothes such as jeans, with trainers or boots, and shirts or t-shirts worn loosely outside. A central part of the school’s approach involved the development of independence among students. The students were encouraged to act responsibly—not because of school rules, but because they could see a reason to act in this way.

(emphasis mine) (page 18)

And yet, while the Amber Hill students were well-behaved little automatons, the Phoenix Park kids—the ones who simply behaved well by choice and idealism, not some lower-class aspiration to respectability—ran amok:

In the 100 or so lessons I observed at Phoenix Park, I would typically see approximately one third of students wandering around the room chatting about non-work issues and generally not attending to the project they had been given. In some lessons, and for some parts of lessons, the numbers off task would be greater than this. Some students remained off task for long periods of time, sometimes all of the lessons; other students drifted on and off task at various points in the lessons. In a small quantitative assessment of time on task, I stood at the back of lessons and counted the number of students who appeared to be working 10 minutes into the lesson, halfway through the lesson, and 10 minutes before the end of the lesson. Over 11 lessons, with approximately 28 students in each, 69%, 64%, and 58% of students were on task, respectively [the corresponding numbers at Amber Hill were in the 90%s].
….
More important than either of these factors, however, is that the freedom the students experienced seemed to relate directly to the relaxed and non-disciplinarian nature of the three teachers and the school as a whole. Most of the time, the teachers did not seem to notice when students stopped working unless they became very disruptive. All three teachers seemed concerned to help and support students and, consequently, spent almost all of their time helping students who wanted help, leaving the others to their own devices.

(page 64, 65)

But far from criticizing the school for abysmal classroom management, Boaler blames the students.

However, this freedom was also the reason the third group of students hated the approach. Approximately one fifth of the cohort thought that mathematics was too open, and they did not want to be left to make their own decisions about their work. They complained that they were often left on their own not knowing what to do, and they wanted more help and structure from their teachers. The students felt that the school’s approach placed too great a demand on them—they did not want to use their own ideas or structure their own work, and they said that they would have preferred to work from books. What for some students meant freedom and opportunity, for others meant insecurity and hard work. There were approximately five students in each class who disliked and resisted the open nature of their work. These students were mainly boys and were often disruptive— not only in mathematics, but across the school. (page 68)

In every mathematics lesson I observed at Phoenix Park, between three and six students would do little work and spend much of their time disrupting others. I now try to describe the motivation of these 20 or so students, who represented a small but interesting group. The students who did little work in class were mainly boys, and they related their lack of motivation to the openness of the mathematical approach and, more specifically, the fact that they were often left to work out what they had to do on their own. …Many of the Phoenix Park students talked about the difficulty they experienced when they first started at the school working on open projects that required them to think for themselves. But most of the students gradually adapted to this demand, whereas the disruptive students continued to resist it.

In Years 9 and 10, I interviewed six of the most disruptive and badly behaved students in the year group: five boys and one girl. They explained their misbehavior during lessons in terms of the lack of structure or direction they were given and, related to this, the need for more teacher help. These students had been given the same starting points as everybody else, but for some reason seemed unwilling to think of ways to work on the activities without the teacher telling them what to do. This was a necessary requirement with the Phoenix Park approach because it was impossible for all of the students to be supported by the teacher when they needed to make decisions. The students who did not work in lessons were no less able than other students; they did not come from the same middle school and they were socioeconomically diverse. In questionnaires, the students did not respond differently from other students, even on questions designed to assess learning style preferences. The only aspect that seemed to unite the students was their behavior and the fact that most of them were boys. The reasons that some students acted in this way and others did not were obviously complex and due to a number of interrelated factors. Martin Collins [one of the Phoenix Park teachers] believed that more of the boys experienced difficulty with the approach because they were less mature and less willing to take responsibility for their own learning than the girls. The idea that the boys were badly behaved because of immaturity was also partly validated by the improvement in the boys’ behavior as they got older.

(page 73) (emphasis mine)

Meanwhile, the Amber Hill girls were miserable:

All of the Amber Hill girls interviewed in Years 9 and 10 expressed a strong preference for their coursework lessons and the individualized booklet approach, which they followed in Years 6 and 7, as against their textbook work. The girls gave clear reasons why these two approaches were more appropriate ways of learning mathematics for them; all of these reasons were linked to their desire to understand mathematics. In conversations and interviews, students expressed a concern for their lack of understanding of the mathematics they encountered in class. This was particularly acute for the girls not because they understood less than the boys, but because they appeared to be less willing to relinquish their desire for understanding…..Just as frequently, I observed girls looking lost and confused, struggling to understand their work or giving up all together. On the whole, the boys were content if they attained correct answers. The girls would also attain correct answers, but they wanted more. The different responses of the girls and boys to group work related to the opportunity it gave them to think about topics in depth and increase their understanding through discussion. This was not perceived as a great advantage to the boys probably because their aim was not to understand, but to get through work quickly. These different responses were also evident in response to the students’ preferences for working at their own pace. In chapter 6, I showed that an overwhelming desire for both girls and boys at Amber Hill was to work at their own pace. This desire united the sexes, but the reasons boys and girls gave for their preferences were generally different. The boys said they enjoyed individualized work that could be completed at their own pace because it allowed them to tear ahead and complete as many books as possible….The girls again explained their preference for working at their own pace in terms of an increased access to understanding. 
The girls at Amber Hill consistently demonstrated that they believed in the importance of an open, reflective style of learning, and that they did not value a competitive approach or one in which there was one teacher-determined answer. Unfortunately for them, the approach they thought would enhance their understanding was not attainable in their mathematics classrooms except for 3 weeks of each year.

(page 139)

(all emphasis mine)

So in each school, there were students who really hated the teaching method used. But Boaler blames the complex-instruction haters at Phoenix Park (of course, it’s just a coincidence they are mostly male), for their immaturity and disruption, because they didn’t like the open-ended discovery method she so vehemently approves of. Meanwhile, she not only sympathizes with the Amber Hill girls, poor dears, who didn’t like the procedure-oriented teaching method at their school, but continually slams the Amber Hill boys who do enjoy it because those competitive, goal-driven little twerps aren’t interested in learning math but just doing more problems than their pals.

It was at this point I threw my reader across the room.

Moreover, reading between the lines of Boaler’s screed shows clearly that both schools are doing what I would consider an utterly crap job of teaching math. Boaler also mentions Phoenix Park is the low achiever in its affluent school district, and both schools have dismal test scores (which, let me be clear, could be true even if both schools were doing an outstanding job in math instruction).

Indeed, Boaler’s entire thesis—that the “reform” approach leads to better test scores—is poorly supported by her own data. Boaler received special permission to evaluate the students’ individual GCSE scores. She coded problems as either “procedural” or “conceptual”.

Amber Hill, of the dull, grey school and the dreary uniforms, actually outscored Phoenix Park, the progressive’s paradise, on procedural questions. While Phoenix Park outscored Amber Hill on conceptual problems, it wasn’t by all that much.

Like any dedicated ideologue, Boaler misses the monster lede apparent in these representations: Phoenix Park’s score range is nearly double Amber Hill’s, suggesting that discovery-based math helps high ability kids, while procedural math helps low ability students. Low ability students lost out at Phoenix Park, because they couldn’t cope with the open-ended, unstructured approach. Boaler didn’t give a damn about those kids, because they were boys. Meanwhile, high ability kids do better with an open-ended approach, gaining a better understanding of math concepts.

This finding has been well-documented in subsequent research—at least, the research done by academics who aren’t hacks bent on turning math education into a group project. I wrote about this earlier.

Here, too, is a takedown of some of the specifics in her research. You can read the whole thing, but here are the primary points in direct quotes:

  • “Also these scores are very similar. A notable difference is that rather a lot of students at Amber Hill fail, whereas more students at Phoenix Park get the very low grades E,F,G. Boaler sees this as a positive thing about Phoenix Park. A possible explanation (which Boaler does not give) has to do with the fact that the GCSE is actually not one exam, but three exams….. it is perfectly conceivable that at Amber Hill many students aimed higher than they could achieve and failed. Note that it is essential for further education to receive at least a C, so that participating in the basic exam is virtually useless. The figures show that nonetheless at Phoenix Park at least 43.5 percent of the students (the Fs and Gs) participated in this exam and by doing this gave up their chance at higher education without even trying.”
  • “This indicates that, compared to the nation, the students at Phoenix Park did worse on the GCSE than they did on the NFER. So Phoenix Park seems not to have done its students a lot of good. The same is of course true for Amber Hill, which performed very similarly to Phoenix Park. I also took a look on the internet at typical average scores of schools on the GCSE. It seems that Phoenix Park and Amber Hill are just about the schools with the worst GCSE scores in the UK. I cannot help but think that Amber Hill was specifically chosen for this fact.”
  • “Boaler doesn’t say anything about the GCSE scores of Amber Hill at the moment that she decided to include this school in her study, but there is no reason to believe that it was markedly different from the above mentioned scores for Amber Hill. If that is the case, then Boaler seems to have been stacking the deck in favor of Phoenix Park and its discovery learning approach to mathematics teaching.”
  • “Boaler also doesn’t mention that the grades for the GCSE at both schools are lower than one would expect given the NFER scores. She seems determined to interpret everything in favor of Phoenix Park.”

If you’ve read anything about the Boaler/Milgram/Bishop debate, some of these Boaler critiques may sound a tad familiar. But don’t get them confused. This is a different study. Which means Boaler has pulled this nonsense twice.

It was reading horror shows like Boaler that made me loathe progressive educators. It took me a while to acknowledge that they weren’t all dishonest hacks bent on distorting reality. Not all progressives are determined to create an ideological force field that repels any sane discussion of the genuine advantages and disadvantages of different educational approaches, or any honest acknowledgement that student cognitive ability—which appears unevenly distributed by both race *and* gender, at least as we measure it—is a factor in determining the best approach for a given student population. And ultimately, I find myself slightly more sympathetic to progressives than reformers because at least progressives (and here I include Boaler) actually know about teaching, even if they often do it with blinders on.

So getting all this out of my system means I’m not writing—yet—about Boaler/Milgram/Bishop. But then, I imagine my opinion’s pretty clear, isn’t it?

Ironically, I know people who know Boaler, and assure me she’s quite nice. But then, she’s British. It’s probably the accent.