
## Assessing Math Understanding: Max, Homer, and Wesley

This is only tangentially a “math zombies” post, but I did come up with the idea because of the conversation.

I agree with Garelick and Beals that asking kids to “explain math” is most often a waste of time. Templates and diagrams and “flow maps” aren’t going to cut it, either. Assessing understanding is a complicated process that requires several different solution methods and an interpretive dance. Plus a poster or three. No, not really.

As I mentioned earlier, I don’t usually ask kids to “explain their answer” because too many kids confuse “I wrote some words” with “I explained”. I grade their responses in the spirit given, a few points for effort. “Explain your answer” test questions are sometimes handy to see if top students are just going through the motions, or how much of my efforts have sunk through to the students. But I don’t rely on them much and apart from top students, don’t care much if the kids can’t articulate their thinking.

It’s still important to determine whether kids actually understand the math, and not just because some kids know only the algorithm. Other kids struggle with the algorithm but understand the concepts. Still others don’t understand the algorithm because they don’t grok the concepts. Finally, many kids get overwhelmed or can’t be bothered to work out the problem, but will indicate their understanding if they can just read and respond to true/false items.

If you are thinking “Good lord, you fail the kids who can’t be bothered or get overwhelmed by the algorithms!” then you do not understand the vast range of abilities many high school teachers face, and you don’t normally read this blog. These are easily remediable shortcomings. I’m not going to cover that ground again.

So how to ascertain understanding without the deadening “explain your answer” or the often insufficient “show your work”?

My task became much easier once I turned to multiple answer assessments. I can design questions that test algorithm knowledge, including interim steps, while also ascertaining conceptual knowledge.

I captured some student test results to illustrate, choosing two students for direct comparison, and one student for additional range. None of these students are my strongest. One of the comparison students, Max, would be doing much better if he were taught by Mr. Singh, a pure lecture & set teacher; the other, Homer, would be struggling to pass. The third, Wesley, would have quit attending class long ago with most other teachers.

To start: a pure factoring problem. The first is Max, the second Homer.

Both students got full credit for the factoring and for identifying all the correct responses. Max at first appears to be the superior math student; his work is neat, precise, efficient. He doesn’t need any factoring aids, doing it all in his head. Homer’s work is sloppier; he makes full use of my trinomial factoring technique. He factored out the 3 much lower on the page (out of sight), and only after I pointed out he’d have an easier time doing that first.

Now two questions that test conceptual knowledge:

Max guessed entirely on the “product of two lines” question, and has no idea how to convert a quadratic in vertex form to standard or factored form. Yet he could expand the square in his head, which is why he knew that c=-8. He was unable to relate the questions to the needed algorithms.

Homer aced it. In that same big, slightly childish handwriting, he used the (h,k) parameters to determine the vertex. Then he carefully expanded the vertex form to standard form, which he factored. This after he correctly identified the fact that two lines always multiply to form a quadratic, no matter the orientation.
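Homer’s conversion can be sketched with a hypothetical quadratic (the actual test item isn’t reproduced here; I’ve picked one consistent with the c=-8 detail above):

```latex
\begin{aligned}
y &= (x - 1)^2 - 9      && \text{vertex form: } (h,k) = (1,-9)\\
  &= x^2 - 2x + 1 - 9   && \text{expand the square}\\
  &= x^2 - 2x - 8       && \text{standard form: } a=1,\; b=-2,\; c=-8\\
  &= (x - 4)(x + 2)     && \text{factored form: roots } 4 \text{ and } -2
\end{aligned}
```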

Here’s more of Homer’s work, although I can’t find (or didn’t take a picture of) Max’s test.

This question tests students’ understanding of the parameters of three forms of the quadratic: standard, vertex, factored. I graded this generously. Students got full credit if they correctly identified just one quadratic by parameter, even if they missed or misidentified another. Kids don’t intuitively think of shapes by their parameter attributes, so I wanted to reward any right answers. Full credit for this question was 18 points. A few kids scored 22 points; another ten scored between 15 and 18. A third got ten or fewer points.

Homer did pretty well. He was clearly guessing at times, but he was logical and consistent in his approach. Max got six points. He got (a) wrong, got (b), (c), and (d) correct, then left the rest blank. It wasn’t a time issue; I pointed out the empty responses during the test, offering some common elements as a hint. He still left them blank.

On the same test, I returned to an earlier topic, linear inequalities. I give them a graph with several “true” points. Their task: identify the inequalities that would include all of these solutions.

(Ack: I just realized I flipped the order when building this image. Homer’s is the first.)

Note the typo that you can see both kids have corrected. (My test typos are fewer each year, but they still happen.) I just told them to fix it; the kids had to figure out whether the “fix” made the boundary true or false. (This question was designed to test their understanding of linear concepts–that is, I didn’t want them plugging in points, but rather visualizing or drawing the boundary lines.)

Both Max and Homer aced the question, applying previous knowledge to an unfamiliar question. Max converted the standard form equation to linear form, while Homer just graphed the lines he wasn’t sure of. Homer also went through the effort of testing regions as “true”, as I teach them, while Max just visualized them (and probably would have made a mistake had I been more aggressive on testing regions).
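Homer’s region-testing habit amounts to plugging every known-true point into each candidate inequality. A minimal sketch of the idea, with made-up points and inequalities since the actual test isn’t reproduced here:

```python
# Hypothetical solution points and candidate inequalities (not from the actual test).
true_points = [(0, 0), (1, 2), (-2, 1)]

candidates = {
    "y <= x + 3":  lambda x, y: y <= x + 3,
    "y >= 2x - 1": lambda x, y: y >= 2 * x - 1,
    "y < x - 1":   lambda x, y: y < x - 1,
}

# A candidate is a valid answer only if every known-true point satisfies it.
for label, holds in candidates.items():
    ok = all(holds(x, y) for (x, y) in true_points)
    print(label, "includes all points" if ok else "excludes a point")
```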

Here I threw in something they should have learned in a previous year, but that we hadn’t covered in class:

Most students were confused or uncertain; I told them that when in doubt, given a point….and they all chorused “PLUG IT IN.”

This was all Max needed to work the problem correctly. Homer, who had been trying to solve for y, then started plugging it in, but not as fluently as Max. He has a health problem forcing him to leave slightly early for lunch, so he didn’t finish. For the next four days, I reminded students in class that they could come in after school or during lunch to finish their tests, if they needed time. Homer didn’t bother.

So despite the fact that Homer had much stronger conceptual understanding of quadratics than Max, and roughly equal fluency in both lines and quadratics, he only got a C+ to Max’s C because Homer doesn’t really care about his grade so long as he’s passing.

Arrgghhh.

I called in both boys for a brief chat.

For Max, I reiterated my concern that he’s not doing as well as he could be. He constantly stares off into space, not paying attention to class discussions. Then he finishes work, often very early, often not using the method discussed in class. It’s fine; he’s not required to use my method, but the fact that he has another method means he has an outside tutor and that he’s tuning me out because “he knows this already”. He rips through practice sheets if he’s familiar with the method; otherwise he zones out, trying to fake it when I stop by. I told him he’s absolutely got the ability to get an A in class, but at this point, he’s at a B and dropping.

Max asked for extra credit. He knew the answer, because he asks me almost weekly. I told him that if he wanted to spend more time improving his grade, he should pay attention in class and ask questions, particularly on tests.

We’ve had this conversation before. He hasn’t changed his behavior. I suspect he’s just going to take his B and hope he gets a different teacher next year who’ll make the tutor worth the trouble. At least he’s not trying to force a failing grade to get to summer school for an easy A.

Homer got yelled at. I expressed (snarled) my disappointment that he wouldn’t make the effort to be excellent, when he was so clearly capable of more. What was he doing that was so important he couldn’t take 20 minutes or so away to finish a test, given the gift of extra time? Homer stood looking a bit abashed. Next test, he came in during lunch to complete his work. And got an A.

Max got a B- on the same test, with no change in behavior.

I haven’t included any of the top students’ work because it’s rather boring; revelations only come with error patterns. But here, in a later test, is an actual “weak student”, who I shall dub Wesley.

Wesley had been forced into Algebra 2, against his wishes, since it took him five attempts to pass Algebra I and Geometry. He was furious and determined to fail. I told him all he had to do was work and I’d pass him. Didn’t help. I insisted he work. He’d often demand to get a referral instead. Finally, his mother emailed about his grade and I passed on our conversations. I don’t know how, but she convinced him to at least pick up a pencil. And, to Wesley’s astonishment, he actually did start to understand the material. Not all of it, not always.

This systems of equations question (on which many students did poorly) was also previous material. But look at Wesley! He creates a table! Just like I told him to do! It’s almost as if he listened to me!

He originally got the first equation as 20x + 2y = 210 (using table values); when I stopped by and saw his table, I reminded him to use it to find the slope–or, he could remember the tacos and burritos problem, which spurred his memory. You can’t really see the rest of the questions, but he did not get all the selections correct. He circled two correctly, but missed two, including one asking about the slope, which he could have found using his table. He also graphed a parabola almost correctly, above (you can see he’s marked the vertex point but then ignored it for the y-intercept).
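Finding the slope from a table, as I reminded Wesley to do, is just change in y over change in x between rows. A sketch with hypothetical table values, chosen only to be consistent with the 20x + 2y = 210 equation he wrote:

```python
# Hypothetical (x, y) rows consistent with 20x + 2y = 210, i.e. y = 105 - 10x.
table = [(0, 105), (2, 85), (4, 65)]

# Slope from any two rows: change in y over change in x.
(x0, y0), (x1, y1) = table[0], table[1]
slope = (y1 - y0) / (x1 - x0)   # (85 - 105) / (2 - 0) = -10.0

# The y-intercept is the y-value where x = 0, readable straight off the table.
intercept = y0 - slope * x0     # 105.0
```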

He got a 69, a stupendous grade and effort, and actually grinned with amazement when I handed it back.

Clearly, I’m much better at motivating underachieving boys than I am “math zombies”. Unsurprising, since motivating the former is my peculiar expertise going back to my earliest days in test prep, and I’ve only recently had to contend with the latter. However, I’ve successfully reached out and intervened with similar students using this approach, so it’s not a complete failure. I will continue to work on my approach.

None of the boys have anything approaching a coherent, unified understanding of the math involved. In order to give them all credit for what they know and can do, while still challenging my strongest students, I have to test the subject from every angle. Assessing all students, scoring the range of abilities accurately, is difficult work.

As you can see, the challenges I face have little to do with Asperger’s kids who can’t explain what they think or frustrated parents dealing with number lines or boxes of 10. Nor is it anything solved by lectures or complex instruction. My task is complicated. But hell, it’s fun.

## Understanding Math, and the Zombie Problem

I have been mulling this piece on the evils of explanations for a while. There are many ways to approach this issue, and I highly recommend the extended discussion at Dan Meyer’s blog, as it pits experience-based teachers (mostly reform-biased) against the traditionalists, who are primarily not teachers.

What struck me suddenly, as I was engaged in commenting, was the Atlantic’s clever juxtaposition.

All the buzz, all the sturm und drang about Common Core and overprocessed math has involved elementary school. The cute “show your thinking” pictures are from 8-year-olds and first graders. Louis CK breaks our hearts with his third grader’s pain. The image in the Atlantic article has cute little pudgy second grade arms—with just the suggestion of race, maybe black, maybe Hispanic, probably male—writing a whole paragraph on math. The image evokes protective feelings, outrage over the iniquities of modern math instruction, as a probably male student desperately struggles to obey meaningless demands from a probably female teacher who probably doesn’t understand math beyond an elementary level anyway. Hence another underprivileged child’s potential crushed, early and permanently, by the white matriarchal power structure unwilling to acknowledge its limitations.

And who could disagree? Arithmetic has, as John Derbyshire notes, “the peculiar characteristic that it is easy to state problems in it that are ferociously difficult to solve.” Why force children to explain place value or the division algorithm? Let them get fluency first. Garelick and Beals (henceforth referred to as G&B) cite various studies finding that elementary school students gain competence by focusing on procedure first, with conceptual understanding coming at some later point.

There’s just one problem. While the Atlantic’s framing targets elementary school, and the essay’s evidence base is entirely from elementary school, G&B’s focus is on middle school.

Percentages. Proportions. Historically, the bane of middle school math. Exhibit C on high school math teachers’ list of “things our students should know but don’t” (after negatives and fractions), and an oft-tested topic, both conceptually and procedurally, in college placement tests.

G&B make no bones about their focus. They aren’t the ones who chose the image. They start off with a middle school example, and speak of middle school students who “just want to do the math”.

But again, there’s that authoritatively cited research (linked in blue here):

Again, all cites to research on elementary school math. The researched students are at most fifth graders; the topics never move above arithmetic facts. G&B even make it clear that the claim of “procedure without understanding is rare” is limited to elementary school math, and in the comments, Garelick discusses the limitations of a child’s brain, acknowledging that explanations become more important in adolescence—aka, middle school, algebra, and beyond.

G&B aren’t arguing for 8 year olds to multiply integers in happy, ignorant fluency, but for 14 year olds to calculate percentages and simply “show their work”. And in the event, which they deem unlikely, that students are just going through the motions, that’s okay because “doing a procedure devoid of any understanding of what is being done is actually hard to accomplish with elementary math.” Oh. Wait.

Once you get past the Atlantic bait and switch and discuss the issue at the appropriate age level, everything about the article seems odd.

First, Beals and Garelick would–or should, at least–be delighted with math instruction in 8th grade and beyond. Reform math doesn’t get very far in high school. Not only do most high school teachers reject reform math, most research shows that the bulk of advanced math teachers have proven impervious to all efforts to move beyond “lecture and assign a problem set”. Most math teachers at the high school level accept a worked problem as evidence of understanding, even when it’s not. I’m not as familiar with middle school algebra and geometry teachers, but since NCLB required middle school teachers to be subject-certified, it’s more likely they profile like high school teachers.

G&B don’t even begin to make the case that “explaining math” dominates at the middle school level. They gave an anecdote suggesting that 10% of the week’s math instruction was spent on 2-3 problems, “explaining thinking”.

This is the basis for an interesting discussion. Is it worth spending 10% of the time that would, presumably, otherwise be spent on procedural fluency on making kids jump through hoops to add meaningless detail to correctly worked problems? And then some people would say well, hang on, how about meaningful detail? Or how about other methods of assessing for understanding? For example, how about asking students why they can’t just increase \$160 by 20% to get the original coat price? And if 10% is too much time, how about 5%? How about just a few test questions?
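The coat question is worth spelling out, since it’s exactly the conceptual trap a zombie falls into. Assuming the coat sells for $160 after a 20% discount (my reading of the example), a quick check:

```python
# Hypothetical numbers matching the article's example: a coat selling
# for $160 after a 20% discount.
sale_price = 160.0
discount = 0.20

# The zombie move: add 20% back onto the sale price.
wrong_original = sale_price * (1 + discount)   # about $192 -- off

# The conceptual move: the sale price is 80% of the original,
# so divide by 0.80 rather than multiplying by 1.20.
true_original = sale_price / (1 - discount)    # about $200

# Check: taking 20% off the true original recovers the sale price.
assert abs(true_original * (1 - discount) - sale_price) < 1e-9
```

The two answers differ because 20% of the original price is more dollars than 20% of the discounted price; a student who understands percentages as multiplicative relationships sees this immediately.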

But G&B present the case as utterly beyond question, because research and besides, Asperger’s. And you know, ELL. We shouldn’t make sure they understand what’s going on, provided they know the procedures! Isn’t that enough?

Except, as noted, the research they use is for younger kids. None of their research supports their assertion that procedural fluency leads to conceptual understanding for algebra and beyond. We don’t really know.

However, to the extent we do know, most of the research available in algebra suggests exactly the opposite–that students benefit from “sense-making”, conceptual approaches (which is not the same as discovery) as opposed to entirely procedure-based instruction. But researching algebra instruction is far more difficult than evaluating the pedagogy of arithmetic operations—and forget about any research done beyond the algebra level. So G&B didn’t provide an adequate basis for making their claims about the relative value of procedural vs conceptual fluency, and it’s doubtful the basis exists.

I’ll get to the rest in a minute, but let’s take a pause there. Imagine how different the article would be if G&B had acknowledged that, while elementary school research supports fact fluency over sense-making (and fact fluency seems to be helpful in advanced math), the research and practice at algebra and beyond is less well established. What if they’d argued for their preferences, as opposed to research-based practices, and made an effort to build a case for procedural fluency over comprehension in advanced math? It would have led to a much richer conversation, with everyone acknowledging the strengths and weaknesses of different strategies and choices.

Someday, I’d like to see that conversation take place. Not with G&B, though, since I’m not even sure they understand the big hole in their case. They aren’t experienced enough.

Then there’s the zombie quote, where Garelick and Beals most tellingly display their inexperience:

Yes, Virginia, there are “math zombies”.

In high school, math zombies are very common, particularly in schools with a diverse range of students and thus abilities. Experienced teachers commenting at Dan Meyer’s blog or the Atlantic article all confirm their existence. This piece is long enough without going into anecdotal proof of zombies. One can infer zombie existence by the ever-growing complaints of college math professors about students with strong math transcripts but limited math knowledge.

I’ve seen zombies in tutoring through calculus, in my own teaching through pre-calc. In lower level classes, I’ve stopped some zombies dead in their tracks, often devastating them and angering their parents. The zombies, obviously, are the younger students in my classes, since I don’t teach honors courses. Most of the zombies in my school don’t go through my courses.

Whether math zombies are a problem rather depends on one’s point of view.

There are many math teachers who agree with G&B, who rip through the material, explaining it both procedurally and conceptually but focusing on procedural competence. They assign difficult math problems in class with lots of homework. Their tests are difficult but predictable. They value students who wrote the didactic contract with Dolores Umbridge’s nasty pen, etching it into their skin. Those students diligently memorize the cues and procedures, and obediently regurgitate them, aping understanding without having a clue. There is no dawning moment of conceptual understanding. The students don’t care in the slightest. They are there for the A and, to varying degrees, play Clever Hans for math teachers interested only in correctly worked procedures and right answers. Left as an open issue is the degree to which zombies are also cheating (and if they cheat, are they zombies? is also a question left for another day). For now, assume I’m referring to kids who simply go through the motions, stuffing procedures into episodic memory with nothing making it to semantic memory, all to be forgotten as soon as the test is over.

Math zombies enable our absurd national math expectations. Twenty or thirty years ago, top tier kids had less incentive to fake it through advanced math. But as AP Calculus or die drove our national policy (thanks, Jay Mathews!) and students were driven to start advanced math earlier each year, zombies were rewarded for rather frightening behavior.

G&B and those who operate from the presumption that math can easily be mastered by memorizing procedures, who believe that teachers who slow down or limit coverage are enablers, don’t see math zombies as a problem. They’re the solution. You can see this in G&B’s devotion and constant appeal to the test scores of China, Singapore, and Korea, the ur-Zombies and still the sublime practitioners of the art, if it is to be called that.

For those of us who disagree, zombies create two related problems. First, their behavior encourages math teachers and policy makers to raise expectations, increase covered material, accelerate instruction pace. They allow schools to pretend that half their students or more are capable of advanced, college level math in high school while simultaneously getting As in many other difficult topics. They lead to BC Calculus pass rates of 50% or more (because yes, the AP Calc tests reward zombie math). Arguably, they have created a distortion in our sense of what “college math” should be, by pretending that “college math” is easily doable by most high school students willing to put in some time.

But the related problem is even more of an issue, because the more math teachers and policies reward zombies, the more smart, intellectually curious non-zombies bow out of the game, decide they’ll go to a state school or community college. Which means zombie kids aren’t just numbered among the “smart” kids; they become the smart kids. They define what smart kids “are capable of”, because no one comes along later to measure what they’ve…well, not forgotten, but never really learned to start with. So people think it really is possible to take 10-12 AP courses and understand the material (as opposed to get a 5 on the AP), and that defines what they expect from all top rank students. Meanwhile, those kids–and I know many–are neither intellectually curious nor even “intelligent” as we’d define it.

The Garelick/Beals piece is just a symptom of this mindset, not a cause. They don’t even know enough to realize that most high school math is taught just the way they like it. They’d understand this better if they were teachers, but neither of them has spent any significant time in the classroom, despite their bio claims. Both have significant academic knowledge in related areas–Garelick in elementary math pedagogy, which he studied as a hobby, Beals as a language expert for Asperger’s—which someone at the Atlantic confused with relevant experience.

Such is the nature of discourse in education policy that some people will think I’m rebutting G&B. No. I don’t even disagree with them on everything. The push for elementary school explanation is misguided and wasteful. Many math teachers reward words, not valid explanations; that’s why I use multiple answer math tests to assess conceptual knowledge. I also would love–yea, love–to see my kids willing to work to acquire greater procedural fluency.

But G&B go far beyond their actual expertise and ultimately, their piece is just a sad reminder of how easy it is to be treated as an “expert” by major publications simply by having the right contacts and backers. Nice work if you can get it.

And the “zombie” allusion, further developed by Brett Gilland, is a keeper.