Tag Archives: algebra

Evaluating vs Solving

Most math teachers start their year with algebra review. I like the idea of “activating prior knowledge“, as it’s known in ed school, but I never want to revisit material as review. It’s so….boring. Similarly, others “reteach” material if students didn’t understand it the first time, and again, no, I don’t do that.

The trick is to wrap the review material in something new, something small. It’s wrapping, after all. For example, suppose the kids don’t really get Power Laws 1, 2, and 3 the first time you teach them, even though you went through them in insane detail and taught them both method and meaning. But you give them a quiz, and half the class is like, what means this exponent stuff? So you grit your teeth, yell at them, flunk most of them on that quiz, and go on to another topic for a week or so. Then one morning you write ¾ on the board and ask “How would I write this with exponents?” and through the explanation you take them back through all of the power laws.

But I’m not here to write about power laws, although if you want advice on the best way to teach them, even if it takes longer, there’s no better tutorial than Ben Orlin’s Exponential Bait and Switch.

I’m here to explain how I integrate what we usually call “algebra review” into my course, while additionally teaching them some conceptual stuff that, in my experience, helps them throughout the course: namely, the difference between evaluating and solving functions for specific values.

Evaluate–what is widely recognized as “plugging in”. Given an input, find the output. Evaluating follows P E MD AS rules–well, technically P F MD AS (F for “function”), but who can say that? Note–I am pretty sure that “evaluate” is a formal term, but Google isn’t helpful on this point.

Solve–well, technically it’s “plugging in for y”, but no one really thinks of it that way. Given an output, find the input(s). Solving follows the rules of Johnny Depp’s younger brother, SA MD E P. (I hope I retire before I have to update that cultural reference.) And really, it’s SA MD F P, but again, who can say that?

Things that get covered in Evaluate/Solve:

  • Remind everyone once more that addition/subtraction and multiplication/division run left to right, not one before the other. SAMDEP reinforces that, as I put the S first for the mnemonic.
  • “Evaluate”–Evaluating purely arithmetic expressions is middle school math.  At this stage of the game, the task is “evaluate the equation with a given value of x”.
  • “Solve”–Solving is, functionally, working backwards: undoing everything that has been done to the input. Right now, they know how to “undo” arithmetic and a few functions. They’ll be expanding that understanding as the course moves forward.
  • Hinted at but not made explicit yet: not all equations are written in function format. I believe that, given an equation like 3x + 2y = 12 or x² + y² = 25, the terminology is “given x=4, solve for y” or “given y=3, solve for x”, but I’m not enough of a mathie to be sure. Feel free to clarify in the comments.
  • As I move into functions, this framework is helpful for understanding that evaluating a function must have one and only one answer, whereas solving a function given an output can have more than one input. It’s also useful to start capturing the differences between absolute value and quadratics, which aren’t one to one, and lines and radicals, which are.
  • The “E” in PEMDAS and SAMDEP stands for exponent, but in fact the order of operations must be followed for every type of function: square root, absolute value, trigonometry, logs, and so on. Informally, the “E” means “do the function” or “undo the function”, depending on whether you’re evaluating or solving. So evaluating y = 4|x-5| - 6 with x=1 means subtract 5 from 1 (the “parentheses”), then take the absolute value (the “exponent”), then multiply by 4 and subtract 6. Solving the same equation would be adding 6, dividing by 4, then undoing the absolute value to create two equations, then adding 5 in each one. (This is more complicated in text than explaining it with calculations on a Promethean board; there’s a short sketch of both directions after this list.)
  • YOU CAN’T DISTRIBUTE OVER ANYTHING EXCEPT MULTIPLICATION. This one is important. Kids will change 2(x-1)² to (2x-2)² to 4x²-4 with depressing speed, and while many of them will make the last mistake in perpetuity, I’ve found that I can break them of the first, which also helps with 3|x+5| not turning into |3x+15|. For some reason, they never distribute over a square root, but plenty will try to turn 3cos(3x) into cos(9x).
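For anyone who wants the two directions side by side, here’s a minimal sketch in code of the absolute value example from the list above. The function y = 4|x-5| - 6 and the input x=1 come from the bullet; the function names and the y = 10 target are just mine for illustration.

```python
# A sketch of the evaluate/solve distinction for y = 4|x - 5| - 6
# (the example from the list above; names and the y = 10 target are illustrative).

def evaluate(x):
    # PEMDAS direction: parentheses, "do the function" (absolute value),
    # then multiply, then subtract.
    return 4 * abs(x - 5) - 6

def solve(y):
    # SAMDEP direction: add 6, divide by 4, then undo the absolute value,
    # which splits into two equations.
    inner = (y + 6) / 4
    if inner < 0:
        return []                       # |x - 5| can't be negative: no solutions
    return sorted({5 - inner, 5 + inner})

print(evaluate(1))    # 4*|1 - 5| - 6 = 10 -- exactly one output per input
print(solve(10))      # [1.0, 9.0] -- one output can come from two inputs
```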

Here’s a bit of the worksheet I built:

[image: evalsolve]

I have found this lays the groundwork for an in-depth introduction to functions, which is my first unit. So when they’ve finished Evaluate and Solve, followed by Simplify (more on that later), we move into the functions unit.

So by the end of the unit the students can graph f(x) = 2(x-1)² – 8, as well as find f(3) and a if f(a)=10, and understand that the x and y intercepts, if they exist, are at f(0) and f(x)=0. They can also do the same for a square root or reciprocal function. Then I do a linear unit and a quadratic unit in depth.
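For concreteness, here is the arithmetic behind that example, using the function named in the paragraph (the f(a)=10 target is the one given there):

```latex
\begin{aligned}
\text{Evaluate: } & f(3) = 2(3-1)^2 - 8 = 2\cdot 4 - 8 = 0\\
\text{Solve: } & f(a) = 10 \;\Rightarrow\; 2(a-1)^2 - 8 = 10 \;\Rightarrow\; (a-1)^2 = 9 \;\Rightarrow\; a = 4 \text{ or } a = -2\\
\text{Intercepts: } & f(0) = 2(0-1)^2 - 8 = -6, \qquad f(x) = 0 \;\Rightarrow\; (x-1)^2 = 4 \;\Rightarrow\; x = -1 \text{ or } x = 3
\end{aligned}
```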

Function notation, particularly f(a)= [value], is much easier for the students to understand once they’ve worked “evaluate” and “solve” with x and y.

[image: evalsolvfuncnot]
This also helps the students read graphs for f(4) or f(z)=7.

[image: evalsolvefunction]

Back in January, a Swedish guy living in Germany, as he describes himself, read the vast majority of my blog and then summarized his key takeaways and some critiques. His six takeaways are a pretty good reading of my blog, but he’s completely dismissive of my teaching and pedagogy, saying I’m mathematically naive and often, due to my ignorance, end up creating more confusion by teaching needless information to my students. He explicitly refers to The Evolution of Equals and The Product of Two Lines, but I suspect he’d feel similarly about The Sum of a Parabola and a Line and Teaching With Indirection.

I’m really sure my students aren’t confused. I get pretty decent feedback from real mathematicians. There are legit differences between teachers on this point that approach religious wars, so there’s that.

Besides, these sort of lessons do two things simultaneously. They give weaker kids the opportunity to practice, and the top kids get a dose of the big picture.

Yes, it’s been a while since I’ve written. Trying to fix that.

 


Designing Multiple Answer Math Tests

I got the idea for Multiple Answer Tests originally because I wanted to prepare my kids for Common Core Tests. (I’d rather people not use that post as the primary link, as I have done a lot more work since then.)

About six months later (a little over a year ago), I gave an update, which goes extensively into the grading of these tests, if you’re curious. At that time, I was teaching Pre-Calc and Algebra 2/Trig. This past year, I’ve been teaching Trigonometry and Algebra II. I’d never taught trig before, so all my work was new. In contrast, I have a lot of Algebra 2 tests, so I often rework a multiple choice question into a multiple answer.

I thought I’d go into the work of designing a multiple answer test, as well as give some examples of my newer work.

I design my questions on an almost ad hoc basis. Some questions I really like and keep practically intact; others get tweaked each time. I build tests from a mental question database, pulling questions in from earlier tests. So when I start a new test, I take the previous unit test, evaluate it, see if I’ve covered the same information, create new questions as needed, pull in questions I didn’t use on an earlier test, whatever. I don’t know how teachers can use the same test time and again. I’d get bored.

I recently realized my questions have a typology. Realizing this has helped me construct questions more clearly, sometimes adding a free response activity just to get the students started down the right path.

The first type of question requires modeling and/or solving one equation completely. The answer choices all involve that one process.

Trigonometry:

[image: matrig1]

I’m very proud of this question. My kids had learned how to graph the functions, but we hadn’t yet turned to modeling applications. So they got this cold, and did really well with it. (In the first class, anyway. We’ll see how the next group does in a month or so.) I had to design it in such a way as to really telegraph the question’s simplicity, to convince the students to give it a shot.

Algebra II:
[image: maratsimp]

The rational expression question is incredibly flexible. I’m probably teaching pre-calc again next year and am really looking forward to beefing this question up with analysis.

Other questions present a situation or graph that can be addressed from multiple aspects. The student ends up working 2 or 3 actual calculations per question. I’ve realized these questions look the same as the previous type, but they represent much more work, and I need to start making that clear.

Trigonometry:

[image: mypythruler]

Algebra II:
[image: mafurnquest]

I love the Pythagorean Ruler question, which could be used purely for plane geometry questions, or right triangle trig. Or both. The furniture question is an early draft; I needed an inverse question and wanted some linear modeling review, so I threw together something that gave me both.

I can also use this format to test fluency on basic functions very efficiently. Instead of wasting one whole question on a trig identity, I can test four or five identities at once.

[image: matrigalg]

Or this one, also trig, where I toss in some simplification (re-expression) coupled with an understanding of the actual ratios (cosine and secant), even though they haven’t yet done any graphing. So even if they have graphing calculators (most don’t), they wouldn’t know what to look for.

[image: matrigvals]

I’m not much for “math can be used in the real world” lectures, but trigonometry is the one class where I can be all, “in your FACE!” when kids complain that they’d never see this in real life.

[image: maisuzu]

I stole the above concept from a trig book and converted it to multiple answer, but the one below I came up with all by myself, and there’s all sorts of ways to take it. (And yes, as Mark Roulo points out, it should be “the B29’s circumference is blah blah blah.” Fixed in the source.)

[image: mapropspeed]

Some other questions for Algebra II, although they can easily be beefed up for pre-calc.

[image: maparlinesys]

[image: maparabolaeq]

One of the last things I do in creating a test is consider the weight I give each question. Sometimes I realize that I’ve created a really tough question with only five answer choices (my minimum). So I’ll add some easier answer choices to give kids credit for knowledge, even if they aren’t up to the toughest concepts yet.

That’s something I’ve really liked about the format. I can push the kids at different levels with the same question, and create more answer choices to give more weight to important concepts.

The kids mostly hate the tests, but readily admit that the hatred is for all the right reasons. Many kids used to As in math are flummoxed by the format, which forces them to realize they don’t really know the math as well as they think they do. They’ve really trained their brains to spot the correct answer in a multiple choice format–or identify the wrong ones. (These are the same kids who have memorized certain freeform response questions, but are flattened by unusual situations that don’t fit directly into the algorithms.)

Other strong students do exceptionally well, often spotting question interpretations I didn’t think of, or asking excellent clarifications that I incorporate into later tests. This tells me that I’m on the right track, exposing and differentiating ability levels.

At the lower ability levels, students actually do pretty well, once I convince them not to randomly circle answers. So, for example, on a rational expression question, they might screw up the final answer, but they can identify factors in common. Or they might make a mistake in calculating linear velocity, but they correctly calculate the circumference, and can answer questions about it.

I’ve already written about the frustrations, as when the kids have correctly worked problems but didn’t identify the verbal description of their process. But that, too, is useful, as they can plainly see the evidence. It forces them to (ahem) attend to precision.

Of course, I’m less than precise myself, and one thing I really love about these tests is my ability to hide this personality flaw. But if you spot any ambiguities, please let me know.


The Negative 16 Problems and Educational Romanticism

I came up with a good activity that allowed me to wrap up quadratics with a negative-16 application. (Note: I’m pretty sure that deriving the algorithm involves calculus, and anyway, it was way beyond the scope of what I wanted to do, which was reinforce their understanding of quadratics with an interesting application.) As you read, keep in mind: many worksheets with lots of practice on binomial multiplication, factoring, simpler models, function operations, converting quadratics from one form to another, and completing the square (argghh) preceded this activity. We drilled, baby.

I told the kids to get out their primary quadratics handout:

[image: parabolaforms]

Then I showed two model rocket launches with onboard cameras (chosen at random from YouTube).

After the videos, I tossed a whiteboard marker straight up and caught it. Then I raised my hand and dropped the marker.

“So the same basic equation affects the paths of this marker and those rockets–and it’s quadratic. What properties might affect—or be affected by—a projectile being launched into the air?”

The kids generated a list quickly; I restated a couple of them.

[image: pmpfactors]

Alexandra: “What about distance?”

I pretended to throw the marker directly at Josh, who ducked. Then I aimed it again, but this time angling towards the ceiling. “Why didn’t Josh duck the second time?”

“You wouldn’t have hit him.”

“How do you know?”

“Um. Your arm changed…angles?”

“Excellent. Distance calculations require horizontal angles, which involves trigonometry, which happens next year. So distance isn’t part of this model, which assumes the projectile is launched straight….”

“UP.”

“What about wind and weather?” from Mark.

“We’re ignoring them for now.”

“So they’re not important?”

“Not at all. Any of you watch The Challenger Disaster on the Science Channel?”

Brad snickered. “Yeah, I’m a big fan of the Science Channel.”

“Well, about 27 years ago, the space shuttle Challenger exploded 70 some seconds after launch, killing everyone on board when it crashed back to earth.” Silence.

“The one that killed the teacher?”

“Yes. The movie—which is very good—shows how one man, Richard Feynman, made sure the cause was made public. A piece of plastic tubing was supposed to squeeze open and closed—except, it turns out, the tubing didn’t operate well when it was really cold. The launch took place in Florida. Not a place for cold. Except it was January, and very cold that day. The tubing, called O-ring, compressed—but didn’t reopen. It stayed closed. That, coupled with really intense winds, led to the explosion.”

“A tube caused the crash?”

“Pretty much, yes. Now, that story tells us to sweat the small stuff in rocket launches, but we’re not going to sweat the small stuff with this equation for rocket launches! We don’t have to worry about wind factors or weather.”

“Then how can it be a good model?” from Mark, again.

“Think of it like a stick figure modeling a human being but leaving out a lot. It’s still a useful model, particularly if you’re me and can’t draw anything but stick figures.”

So then we went through parameters vs. variables: parameters, like (h,k), are specific to each equation, constant for that model. Variables–the x and y–change within the equation.

“So Initial Height is a parameter,” Mark is way ahead.

Nikhil: “But rocket height will change all the time, so it’s a variable.”

Alissa: “Velocity would change throughout, wouldn’t it?”

“But velocity changes because of gravity. So how do you calculate that?” said Brad.

“I’m not an expert on this; I just play one for math class. What we calculate with is the initial velocity, as it begins the journey. So it’s a parameter, not a variable.”

“But how do you find the initial velocity? Can you use a radar gun?”

“Great question, and I have no idea. So let’s look at a situation where you’ll have to find the velocity without a radar gun. Here’s an actual—well, a pretend actual—situation.”

[image: neg16question]

“Use the information here to create the quadratic equation that models the rocket’s height. In your notes, you have all the different equation formats we’ve worked with. But you don’t have all the information for any one form. Identify what information you’ve been given, and start building three equations by adding in your known parameters. Then see what you can add based on your knowledge of the parabola. There are a number of different ways to solve this problem, but I’m going to give you one hint: you might want to start with a. Off you go.”

And by golly, off they went.

As releases go, this day was epic. The kids worked around the room, in groups of four, on whiteboards. And they just attacked the problem. With determination and resolve. With varying levels of skill.

In an hour of awesomeness here is the best part, from the weakest group, about 10 minutes after I let them go. Look. No, really LOOK!

[image: net16classwork3]

See negative 2.5 over 2? They are trying to find the vertex. They’ve taken the time to the ground (5 seconds) and taken half of it and then stopped. They were going to use the equation to find a, but got stuck. They also identified a zero, which they’ve got backwards (0,5), and are clearly wondering if (0,4) is a zero, too.

But Ed, you’re saying, they’ve got it all wrong. They’ve taken half of the wrong number, and plugged that—what they think is the vertex—into the wrong parameter in the vertex algorithm. That’s totally wrong. And not only do they have a zero backwards, but what the hell is (0,4) doing in there?

And I say you are missing the point. I never once mentioned the vertex algorithm (negative b over 2a). I never once mentioned zeros. I didn’t even describe the task as creating an equation from points. Yet my weakest group has figured out that c is the initial height, that they can find the vertex and maybe the zeroes. They are applying their knowledge of parabolas in an entirely different form, trying to make sense of physical data with their existing knowledge. Never mind the second half—they have knowledge of parabolas! They are applying that knowledge! And they are on the right track!

Even better was the conversation when I came by:

“Hey, great start. Where’d the -2.5 come from?”

“It’s part of the vertex. But we have to find a, and we don’t know the other value.”

“But where’d you get 2.5 from?”

“It’s halfway from 5.”

Suddenly Janice got it.

“Omigod–this IS the vertex! 144 is y! 2.5 is x! We can use the vertex form and (h,k)!!”

The football player: “Does it matter if it doesn’t start from the ground?”

Me: “Good question. You might want to think about any other point I gave you.”

I went away and let them chew on that; a few minutes later the football player came running up to me: “It’s 2!” and damned if they hadn’t solved for a the next time I came by.

Here’s one of the two top groups, at about the same time. (Blurry because they were in the deep background of another picture). They’d figured out the vertex and were discussing the best way to find b.

[image: neg16closeupclasswork]

Mark was staring at the board. “How come, if we’re ignoring all the small stuff, the rocket won’t come straight back down? Why are you sure it’s not coming back to the roof?”

“Oh, it could, I suppose. Let me see if I can find you a better answer.” He moved away, and then I was struck by a thought. “Hey….doesn’t the earth move? I mean yes, the earth moves. Wouldn’t that put the rocket down in a different place?”

“Is that it?”
“Aren’t you taking physics? Go ask your teacher. Great questions.”

I suggested taking a look at the factored form to find b, but they did me one better by using “negative b over 2a” again and solving for b (which I hadn’t thought of): with a = -16, the vertex sits at t = b/32, which led to Mark’s insight, “Wait–the velocity is always 32 times the seconds to max height!”

The other kids had all figured out the significance of the vertex form, and were all debating whether it was 2.5 or 2 seconds, generally calling me over to referee.

One group of four boys, two Hispanics, one black, one Asian (Indian), all excellent students, took forever to get started, arguing ferociously over the vertex question for 10 minutes before I checked on them to see why they were calling each other “racist” (they were kidding, mostly). I had to chastise the winners for unseemly gloating. Hysterical, really, to see alpha males in action over a math problem. Their nearly-blank board, which I photographed as a rebuke:

[image: neg16classwork4]

The weaker group made even more progress (see the corrections), and the group to their left, middling ability, in red, was using the standard-form equation with a and c to find b:
[image: neg16fin3]

My other top group used the same method, and had the best writeup:
[image: neg16fin2]

Best artwork had the model wrong, but the math mostly right:
[image: neg16mid]

  • All but one group had figured out they wanted to use vertex form for the starting point.
  • All but one group had kids who realized the significance of the 80-foot mark (the mirror point of the initial height).
  • All the groups figured out the significance of five seconds.
  • All the groups were able to solve for both a and b of the standard form equation.
  • The top three groups worked backwards to find the “fake” zero.
  • Two groups used the vertex algorithm to find b.
  • All the groups figured out that b had to be the velocity.

So then, after they figured it all out, I gave them the algorithm:

h(t) = -16t² + v₀t + s₀.
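If you read the numbers back out of the classroom narrative above (launch from 80 feet, maximum height of 144 feet at 2 seconds, back on the ground at 5 seconds; this is my reconstruction, not the worksheet itself), the pieces the groups found snap together:

```latex
\begin{aligned}
\text{vertex form: } & h(t) = a(t-2)^2 + 144, \qquad h(5) = 0 \;\Rightarrow\; 9a + 144 = 0 \;\Rightarrow\; a = -16\\
\text{expanded: } & h(t) = -16(t-2)^2 + 144 = -16t^2 + 64t + 80\\
\text{so: } & v_0 = 64 = 32\cdot 2 \text{ (Mark's insight)}, \quad s_0 = 80, \quad \text{``fake'' zero at } t = -1.
\end{aligned}
```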

Then I gave them Felix Baumgartner, the ultimate in a negative 16 problem.

And….AND!!!! The next day they remembered it all, jumping into this problem without complaint:

[image: projmotfollowup]

Charles Murray retweeted my why not that essay, saying that I was the opposite of an educational romantic, and I don’t disagree. But he’s also tweeted that I’m a masochist for sticking it out—implying, I think, that working with kids who can’t genuinely understand the material must be a sad and hopeless task. (And if he’s not making that point, others have.) I noticed a similar line of thought in this nature/nurture essay by Tom Bennett, which says teachers would not write off a child with low grades as destined to stack shelves–the implication being that stacking shelves is a destiny unworthy of education.

The flip side of that reasoning looks like this: Why should only some students have access to a rich, demanding curriculum? See also this Twitter conversation, predicated on the assumption that low-income kids get boring curricula with no rigor and low expectations.

Both mindsets have the same premise: that education’s purpose is to improve kids’ academic ability, and that education without improvement is soulless drudgery, whether as cause or effect. One group says if you know kids can’t improve, what a dreary life teaching is. The other group says dreary teaching with low expectations is what causes the low scores—engage kids, better achievement. Both mindsets rely on the assumption that education is improvement.

Is it?

Suppose that in six months my weakest kids’ test scores are identical to the kids who doodled or slept through a boring lecture on the same material. Assume this lesson does nothing to increase their intrinsic motivation to learn math. Assume that some of the kids end up working the night shift at 7-11. Understand that I do make these assumptions.

Are the kids in my class better off for the experience? Was there value in the lesson itself, in the culmination of all those worksheets that gave them the basis to take on the challenge, in the success of their math in that moment? Is it worth educating kids if they don’t increase their abilities?

I believe the answer is yes.

Mine is not in any way a dreary task but an intellectual challenge: convince unmotivated students to take on advanced math—ideally, to internalize the knowledge for later recall. If not, I want them to have a memory of success, of achievement—not a false belief, not one that says “I’m great at math” but one that says “It’s worth a try”. Not miracles. Just better.

I would prefer an educational policy that set more realistic goals, gave kids more hope of actual mastery. But this will do in the meantime.

I have no evidence that my approach is superior, that lowering expectations but increasing engagement and effort is a better approach. I rely on faith. And so, I’m not entirely sure that I’m not an educational romantic.

Besides. It’s fun.


Core Meltdown Coming

I’ve stayed out of the Common Core nonsense. The objections involve much fuss about federal control, teacher training, curriculum mandates, and the constructivist nature of the standards. Yes, mostly. But so what?

Here’s the only important thing you need to know about Common Core standards: they’re ridiculously, impossibly difficult.

I will focus here on math, but I’m an English teacher too, and could write an equivalent screed for that topic.

I’m going to make assertions that, I believe, would be supported by any high school math teacher who works with students outside the top 30%, give or take.

Two to three years is required just to properly understand and apply proportional thinking–ratios and percentages. That’s leaving off the good chunk of the population that probably can’t ever truly understand it in non-concrete situations. Proportional thinking is a monster. That’s after two to three years spent genuinely understanding fraction operations. Then, maybe, they could get around to understanding the first semester of first year algebra–linear equations (slopes, more proportional thinking), isolating variables, systems, exponent laws, radicals—in a year or so.

In other words, we could use K-5 to give kids a good understanding in two things: fractions and integer operations. Put measurement and other nonsense into science (or skip it entirely, but then remember the one subject I don’t teach). Middle school should be devoted to proportional thinking, which will introduce them to variables and simple isolation procedures. Then expand what is currently first semester algebra over a year.

Remember, I’m talking about students outside the top 30% or so (who could actually benefit from more proportions and ratios work as well, but leave that for another post). We might quibble about the time frames and whether we could add a little bit more early algebra to the mix. But if a math teacher tells you this outline is nonsense, that if most kids were just taught properly, they could learn all this material in half the time, ask some questions about the demographic he works with.

Right now middle school math, which should ideally focus almost entirely on proportions, is burdened with introductions to exponents, a little geometry, some simple single variable equations. Algebra I has a whole second semester in which students who can’t tell a positive from negative slope are expected to master quadratics in all their glory and all sorts of word problems.

But Common Core standards add exponential functions to the algebra one course load and compensate by moving systems of equations and exponent laws to eighth grade while much of isolating variables is booted all the way down to sixth grade. Seventh grade alone bears the weight of proportions and ratios, and it’s one of several curricular objectives. So in the three years when, ideally, our teachers should be doing their level best to beat proportional thinking into students’ heads, Common Core expects our students to learn half of what used to be called algebra I, with a slight nod to proportional thinking (and more, as it turns out. But I’m getting ahead of myself).

But you don’t understand, say Common Core devotees. That’s exactly why we have these higher, more demanding standards! We’ve pushed back the timeline, to give kids more time to grasp these concepts. That’s why we’re moving introduction to fractions to third grade, and it’s why we are using the number line to teach fraction numeracy, and it’s why we are teaching kids that whole numbers are fractions, too! See, we’ve anticipated these problems. Don’t worry. It’s all going to be fine.

See, right there, you know that they aren’t listening. I just said that three to four YEARS is needed for all but the top kids to genuinely understand proportional thinking and first semester algebra, with nothing else on the agenda. It’s officially verboten to acknowledge ability in a public debate on education, so what Common Core advocates should have said, if they were genuinely interested in engaging in a debate is Oh, bullpuckey. You’re out of your mind. Four years to properly understand proportional thinking and first semester algebra? But just for some kids who aren’t “smart”? Racist.

And then we could have an argument that matters.

But Common Core advocates aren’t interested in having that debate. No one is. Anytime I point out the problem, I get “don’t be silly. Poor kids can learn.” I point out that I never mentioned income, that I’m talking about cognitive ability, and I get the twitter version of a blank stare somewhere over my shoulder. That’s the good reaction, the one that doesn’t involve calling me a racist—even though I never mentioned race, either.

Besides, CC advocates are in sell mode right now and don’t want to attack me as a soft bigot with low expectations. So bring up the difficulty factor and all they see is an opportunity to talk past the objection and reassure the larger audience: elementary kids are wasting their time on simple math and missing out on valuable instruction because their teachers are afraid of math. By increasing the difficulty of elementary school math, we will forcibly improve elementary school teacher knowledge, and so our kids will be able to learn the math they need by middle school to master the complex, real-world mathematical tasks we’re going to hand them in high school. Utterly absent from this argument is any acknowledgement that very few of the students are up to the challenge.

The timeline isn’t pushed back for algebra alone. Take a look at Geometry.

Geometry instruction has been under attack for quite some time, because teachers are de-emphasizing proofs and constructions. I’ve written about this extensively (see the above link, here, and here). Geometry teachers quickly learn that, with extensive, patient instruction, over two-thirds of their classes will still be completely incapable of managing a three-step proof. Easy call: punt on proofs, which are hard to test with multiple choice questions. Skip or skate over constructions. Minimize logic, ignore most three-dimensional figures (save surface area and volume formulas for rectangular prisms and maybe cylinders). Focus on the fundamentals: angle and polygon facts (used in combination with algebra), application of the Pythagorean theorem, special right triangles, right triangle trig, angle relationships, parallel lines, coordinate geometry. And algebra, because the train they’re on stops next at algebra II.

Lowering the course requirements is not only a rational act, but a sound curriculum decision: educate the kids in what they need to know in order to succeed… pass… survive… have some chance of going through the motions in their next math class.

But according to everyone who has never worked with kids outside that 30%, these geometry teachers are lazy, poorly educated yutzes who don’t really understand geometry because they didn’t major in math or are in the bottom third of college graduates. Or, if they’re being charitable—and remember, Common Core folks are in sell mode, so charity it is—geometry teachers are just dealing with the results of low expectations and math illiterate elementary school teachers.

And so, the Common Core strategy: push half of geometry down to middle school.

Here’s what the Common Core declares: seventh graders will learn complementary and supplementary angles and area facts, and eighth graders will cover transversals, congruence, and similarity.

But wait. Didn’t Common Core standards already shove half of algebra down to middle school? Aren’t these students already learning about isolating variables, systems of equations, power laws, and proportions and ratios? Why yes, they are.

So by virtue of stuffing half of algebra and geometry content into middle school, high school geometry, as conceived by Common Core, is a stripped-down chassis of higher-order conceptual essentials: proofs, construction, modeling, measurement (3 dimensions only, of course), congruence and similarity, and right triangles.

Teachers won’t be able to teach to the lowest common denominator of the standards, not least because their students will now know the meaning of the lowest common denominator, thanks to Common Core’s early introduction of this important concept, but more importantly because the students will already know the basic facts of geometry, thanks to middle school. The geometry teachers will have no choice but to teach constructions, proofs, logic, and all the higher-order skills using those facts, the part of geometry that kids will need, intellectually, in order to be ready for college.

Don’t you see the beauty of this approach? ask the Common Core advocates. Right now, we try to cover all the geometry facts in a year. This way, we’re covering it in three years. Deeper understanding is the key!

High school math teachers treat Common Core much like people who ignored Obamacare until their policy got cancelled. We don’t much care about standards normally: math is math. When the teachers who work with the lower half of the ability spectrum really understand that the new, dramatically reduced algebra and geometry standards are based on the premise that kids will cover a good half of the math now supposedly covered in high school in middle school, that simply by the act of moving this material to middle school, the kids will understand this material deeply and thoroughly, allowing them, the high school teachers, to explore more important topics, they will go out and get drunk. I did that last year when I realized that my state actually was going to spend billions on these tests. I was so sure we’d blink at the money. But no, we’re all in.

Because remember, the low proficiency levels we currently have are not only based on less demanding standards, but they don’t include the kids who don’t get to second year algebra by their junior year. That is, of the juniors taking Algebra II or higher, on a much harder test, we can anticipate horribly low proficiency rates. But what about the kids who didn’t get that far?

In California (I’ll miss their reports), about 216,000 sophomores and juniors were taking either algebra I or geometry in 2012-2013. California doesn’t test its seniors, but to figure out how many seniors weren’t on track, we can approximate by checking 2011-12 scores, and see that about 128,000 juniors were taking either algebra I or geometry, which means they would not have been on track to take an Algebra II test as juniors. That is, in this era of low standards, the standards that Common Core will make even more rigorous, California alone has half a million students right now who wouldn’t have covered all the material by their junior year. So in addition to the many students who are at least on paper on track to take a test that’s going to be far too difficult for–at a conservative guess–half of them, we’ve got the many students who aren’t even able to get to that level of math. (Consider that each state will have to spend money testing juniors who aren’t taking algebra II, who we already know won’t be able to score proficient. Whoo and hoo.)

Is it Common Core supporters’ position that these students who aren’t in algebra II by junior year are by definition not ready for college or career? In addition to the other half million (416,000 or so) California students who are technically on track for Common Core but scored below basic or far below basic on their current tests? We don’t currently tell students who aren’t on track to take algebra II as juniors that they aren’t ready for college. I mean, they aren’t. No question. But we don’t tell them.

According to Arne Duncan, that’s a big problem that Common Core will fix:

We are no longer lying to kids about whether they are ready. Finally, we are telling them the truth, telling their parents the truth, and telling their future employers the truth. Finally, we are holding ourselves accountable to giving our children a true college and career-ready education.

If all we needed to do was tell them, we could do that now. No need for new standards and expensive tests. We could just say to any kid who can’t score 500 on the SAT math section or 23 on the ACT: Hey, sorry. You aren’t ready for college. Probably won’t ever be. Time to go get a job.

If we don’t have the gumption to do that now, what about Common Core will give us the necessary stones? Can I remind everyone again that these kids will be disproportionately black and Hispanic?

I can tell you one thing that Common Core math was designed to do—push us all towards integrated math. It’s very clear that the standards were developed for integrated math, and only the huge pushback forced Common Core standards to provide a traditional curriculum–which is in the appendix. The standards themselves are written in the integrated approach.

So one way to avoid having to acknowledge a group of kids who are by definition not ready for career and college would be to require schools to teach integrated math, as North Carolina has done. That way, we could mask it—just make sure all students are in something called Integrated Math 3 or 4 by junior year. But there’s a big problem with that strategy: American math teachers and parents both despise integrated math. I know of at least one school district (not mine) where math coaches spent an entire summer of professional development trying to convince the teachers to adopt an integrated curriculum. The teachers refused and the district reluctantly backed down. Few people have mentioned how similar the CC standards are to the integrated curriculum that Americans have consistently refused. But I do wonder if that was the appeal of an integrated curriculum in the Common Core push—it wouldn’t increase proficiency, but would make it less obvious to everyone how many students aren’t ready. (Of course, that would be lying. Hmm.)

At around this point, Common Core supporters would argue that of course it’s more than just not lying to the kids! It’s the standards themselves! They’re better! Than the lower ones! That more than half our kids are failing!

And we’ll only have to wait eight years to see the results!!!

Eight years?

Yeah, didn’t anyone mention this? That’s when the first year of third graders will become juniors, the first year in which Common Core magic will have run its full reign, and then we’ll see how great these higher standards really are! These problems—they just won’t be problems any more. These are problems caused by our lower standards.

Right.

Or: As we start to get nearer to that eight-year mark, we’ll notice that the predictions of full-bore Common Core proficiency aren’t panning out. With any luck, elementary school test scores will increase. But as we get nearer and nearer to high school, we’ll see the dreaded fadeout. Faced with results that declare a huge majority of our black and Hispanic students and a solid chunk of white and Asian students are unready for career and college, what will we do?

Naw. That’s eight years out! By that time, reformers will need a next New Thing to keep their donors excited, and politicians will have figured out the racial disproportionality of the whole college and career ready thing. We barely lasted ten years with No Child Left Behind, before we got waivers and the next New Thing. So what New New Thing will everyone be talking about five to six years out, what fingers will they be pointing, in which direction, to explain this failure? I don’t know. But it’s a good bet we’ll get another waiver.

Is it at all possible that the National Governors Association thought up the Common Core as a diversion, an escape route from the NCLB 100% proficiency trap? It’s not like Congress was ever going to get in gear.

But it’s an awfully expensive trap door, if so. Much cheaper to just devise some sort of Truth In Education Act that mandates accurate notification of college readiness, and avoid spending billions on tests and new materials.

Notice how none of this is a public conversation. At the public debate level, the only math-based Common Core opposition argues that the math standards are too easy.

At which point, I suddenly realize I need more beer.


Algebra 1 Growth in Geometry and Algebra II, Spring 2013

This is part of an ongoing series on my Algebra II and Geometry classes. By definition, students in these classes should have some level of competence in Algebra I. I’ve been tracking their progress on an algebra I pre-assessment test. The test assesses student ability to evaluate and substitute, use PEMDAS, solve simple equations, operate with negative integers, and combine like terms. It tiptoes into first semester algebra—linear equations, simple systems, basic quadratic factoring—but the bulk of the 50 questions involve pre-algebra. While I used the test at my last school, I only thought of tracking student progress this year.

My school is on a full-block schedule, which means we teach a year’s content in a semester, then repeat the whole cycle with another group of students. A usual teacher schedule is three daily 90-minute classes, with a fourth period prep. I taught one algebra II and one geometry class first semester (the third class prepared low-ability students for a math graduation test); their results are here.

So in round two, I taught two Algebra 2 courses and one Geometry 10-12 class (as well as a precalc class not part of this analysis). My first geometry class was freshmen only. In my last school, only freshmen who scored advanced or proficient on their 8th grade algebra test were put into geometry, while the rest took another year of algebra. In this school, all a kid has to do is pass algebra to be put into geometry, but we offer both honors and regular geometry. So my first semester class, Geometry 9, was filled with well-behaved kids with extremely poor algebra skills, as well as a quarter or so of kids who had stronger skills but weren’t interested in taking honors.

I was originally expecting my Geometry 10-12 class to be extremely low ability and so wasn’t surprised to see they had a lower average incoming score. However, the class contained 6 kids who had taken Honors Geometry as freshmen—and failed. Why? They didn’t do their homework. “Plus, proofs. Hated proofs. Boring,” said one. These kids knew the entire geometry fact base, whether or not they grokked proofs, which they will never use again. I can’t figure out how to look up their state test scores yet, but I’m betting they got basic or higher in geometry last year. But because they were put into Honors, they have to take geometry twice. Couldn’t they have been given a C in regular geometry and moved on?

But I digress. Remember that I focus on number wrong, not number right, so a decrease is good.

[image: Alg2GeomAlg1Progress]

Again, I offer this up as evidence that my students may or may not have learned geometry and second-year algebra, but they know a whole lot more basic algebra than they did when they entered my class. Fortunately, my test scores weren’t obliterated this semester, so I have individual student progress to offer.

I wasn’t sure of the best way to do this, so I did a scatter plot with data labels to easily show student before/after scores. The data labels aren’t reliably above or below the point, but you shouldn’t have to guess which label belongs to which point.

So in case you’re like me and have a horrible time reading these graphs: scores far to the right on the x-axis are those who did poorly the first time, and scores low on the y-axis are those who did well the second time. So the high right corner holds the weak students at both beginning and end, and the low left corner holds the strong students who did well on both.

Geometry first. Thirty one students took both tests.

[image: Spring2013GeomIndImprovement]

Four students saw no improvement, another four actually got more wrong, although just 1 or 2 more. Another 3 students saw just one point improvement. But notice that through the middle range, almost all the students saw enormous improvement: twelve students, over a third, got from five to sixteen more correct answers, that is, improved from 10% to over 30%.

Now Algebra 2. Forty eight students took both tests; I had more testers at the end than the beginning; about ten students started a few days late.

[image: Spring2013A2IndImprovement]

Seven got exactly the same score both times, but only three declined (one of them a surprising 5 points—she was a good student. Must not have been feeling well). Eighteen (also a third) saw improvements of 5 to 16 points.

The average improvement was larger for the Algebra 2 classes than the Geometry classes, but not by much. Odd, considering that I’m actually teaching algebra, directly covering some of the topics in the test. In another sense, not so surprising, given that I am actually tasked to teach an entirely different topic in both cases. I ain’t teaching to this test. Still, I am puzzled that my algebra II students consistently show similar progress to my geometry students, even though they are soaked in the subject and my geometry students aren’t (although they are taught far more algebra than is usual for a geometry class).

I have two possible answers. Algebra 2 is insanely complex compared to geometry, particularly given I teach a very slimmed-down version of geometry. The kids have more to keep track of. This may lead to greater confusion and difficulty retaining what they’ve learned.

The other possibility is one I am reminded of by a beer-drinking buddy, a serious mathematician who also teaches math: namely, that I’m a kickass geometry teacher. He bases this assertion on a few short observations of my classes and extensive discussions, fueled by many tankards of ale, of my methods and conceptual approaches (eg: Real-life coordinate Geometry, Geometry: Starting Off, Teaching Geometry, Teaching Congruence or Are You Happy, Professor Wu?, Kicking Off Triangles, Teaching Trig).

This possibility is a tad painful to contemplate. Fully half the classes I’ve taught in my four years of teaching—twelve out of twenty four—have been some form of Algebra, either actual Algebra I or Algebra I pretending to be Algebra II. I spend hours thinking about teaching algebra, about making it more understandable, and I believe I’ve had some success (see my various posts on modeling).

Six of those 24 classes have been geometry. Now, I spend time thinking about geometry, too, but not nearly as much, and here’s the terrible truth: when I come up with a new method to teach geometry, whether it be an explanation or a model, it works for a whole lot longer than my methods in algebra.

For example, I have used all the old standbys for identifying slope direction, as well as devised a few of my own, and the kids are STILL doing the mental equivalent of tossing a coin to determine if it’s positive or negative. But when I teach my kids how to find the opposite and adjacent legs of an angle (see “Teaching Trig” above), the kids are still remembering it months later.

It is to weep.

I comfort myself with a few thoughts. First, it’s kind of cool being a kickass geometry teacher, if that is my fate. It’s a fun class that I can sculpt to my own design, unlike algebra, which has a billion moving parts everyone needs again.

Second, my algebra II kids say without exception that they understand more algebra than they ever did in the past, that they are willing to try when before they just gave up. Even the top kids who should be in a different class tell me they’ve learned more concepts than before, when they tended to just plug and play. My algebra 2 kids are often taking math placement tests as they go off to college, and I track their results. Few of them are ending up in more than one class out of the hunt, which would be my goal for them, and the best are placing out of remediation altogether. So I am doing something right.

And suddenly, I am reminded of my year teaching all algebra, all the time, and the results. My results looked mediocre, yet the school had a stunningly successful year based on algebra growth in Hispanic and ELL students—and I taught the most algebra students and the most of those particular categories.

Maybe what I get is what growth looks like for the bottom 75% of the ability/incentive curve.

Eh. I’ll keep mulling that one. And, as always, spend countless hours trying to think up conceptual and procedural explanations that stick.

I almost titled this post “Why Merit Pay and Value Added Assessment Won’t Work, Part IA” because if you are paying attention, that conclusion is obvious. But after starting a rant, I decided to leave it for another post.

Also glaringly on display to anyone not ignorant, willfully obtuse, or deliberately lying: Common Core standards are irrelevant. I’d be cynically neutral on them because hell, I’m not going to change what I do, except the tests will cost a fortune, so go forth ye Tea Partiers, ye anti-test progressives, and kill them standards daid.


Modeling Exponential Growth/Decay Interspersed with a Reform Rant

Quadratics have become my new nadir, which is cheerier news than it sounds since it means I’ve kicked linear equations into obedient submission. For the first two and a half years of my teaching career, I felt good about quadratics because if nothing else, most kids remembered how to factor, and remembered that factors had something to do with zeros on the graph. Which was a big step up compared to what they retained of linear equations. But then, last year, I cracked linear equations in a big way, which is great except now I just feel bad about quadratics, because as I develop as a teacher I realize the suckers are absurdly complicated and don’t model very easily. The kids learn a lot, but at their level of ability I’d need to do two months to have them internalize quadratics the way most of them internalize linear equations. And I don’t have two months. I just tell myself they still learn a lot. Consequently, I am relieved to see quadratics in the rear view as I move them onto the third of the models that define second year algebra (at least, as I teach it).

Exponential functions are awesome. First, they’re absurdly simple compared to both lines and quadratics. Second, they model actual, honest to god, real life situations. I’m not a big teacher for “Hey, this is something you’ll use again” but automobile depreciation or interest payments are, in fact, something they’ll use again. Third, they provide a memorable and again, useful, reason to review (or learn for the first time) percentage increase and decrease. Finally, they present a situation in which any kid who has even somewhat grasped the course essentials can see hey: Given y, I can’t solve for x. This leads beautifully and meaningfully into logarithms.

So like linear equations, I can kick off the unit with a modeling activity and get the kids moving easily into the math.

I begin with a brief lecture reminding them of the two previous models.

[image: ExpBoardwork1]

No. Quadratics aren’t repeated multiplication. Exponential functions involve repeated multiplication, as they’ll see in the lesson.

Then I review percentage increase and decrease. I am of two minds about this review. On the plus side, it’s immediately relevant, easy to apply, and gives them a good reason to remember it long term. The downside: the kids never remember what I taught them when they get to the percentage problems. So I explain it up front, knowing that 90% of the kids will forget everything I said just 20 minutes later, when they get to the first percentage exponential increase.

[image: Increase%age]

So I explain it, go round the room asking “So, if I want to increase a number by 8%, what do I multiply it by, Jose?” “1 point…..8?” “Watch that leading zero!” “Oh, 1.08.” “Right.” Do that five or six times, think everyone gets it, and set them to working on models. This is one side of the worksheet, crunched for space so I could “snip” it.

[image: ExpGrowthWS]

And sure enough, the kids work through the models, making great progress, and stop cold at the third one.

“I can’t do this. How do you increase by a percentage?”

“Excuse me while I beat myself on the head with this whiteboard.”

“What?”

“Nothing. Do you remember me just talking about percentages?”

“Yeah.”

“Do you see it on the board there? All the stuff about turning it from two steps into one step, and why you need to do that?”

“Yeah.”

“DO YOU SEE ANY POSSIBLE CONNECTION BETWEEN THAT CONVERSATION AND THIS PROBLEM?”

“Man, I don’t see why you’re so mean.”

“Read what it says on the board. Right there. In red.”

“Increase x by a%.”

“Yes. Can you read problem 3 and tell me what you think might possibly qualify as x?”

“The population?”

“Yes. And do you see the value that might possibly qualify as a%?”

“Um.” Long pause as the student stares at the problem, and finds the ONLY OTHER VALUE MENTIONED. “Twenty percent?”

“Indeed.”

“Okay.”

I repeat that four or five times to four or five groups and then, miracle of miracles, find a student with a full table of five values for the population problem. There is a god.

“Great.”

“But I don’t know how to find the equation for this one like I did the first two. This one isn’t repeated multiplication. I had to take 20% of 250 and then add it….why are you hitting yourself on the head?”

“We need a function. We need an operation in which we can plug in x—do you have any thoughts on what x might be?”

“How many months?”

“How is it you know that, you smart child, and yet make me go through this torture? Yes. We need an operation that we can plug in the number of months (x) and get the population (y).”

“Right. But this is like three steps.”

“And we need only one.”

“Right.”

“Wouldn’t it be cool if there were a way to increase a number by a given percentage in just one step?”

“How do you do that?”

“LOOK AT THE BOARD!”

“Oh, is that what you were talking about? I was already doing the worksheet.”

And still, the lesson is largely a success. Kids are absolutely freaked out at the cell growth caused just by doubling and yes, I bring up the million dollar mission example, but at the end of the lesson, not as part of it. Most of the kids correctly graph the models, although a few end up with lines that I correct. The flip side of the handout is a blank graph, which they use to take notes on the basic exponential growth model.

Total Amount = Initial Amount * Rate^time

Initial Amount > 0
Rate > 1
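Here is a minimal sketch of that model in code, using the numbers that come up in the lesson dialogue (the 250-person population growing 20% per month); the doubling run and its starting count of 1 are my own illustration of why the cell-growth numbers freak the kids out.

```python
# Total Amount = Initial Amount * Rate**time, where Rate = 1 + percent/100.

def exponential_total(initial, rate, time):
    return initial * rate ** time

# The population problem from the dialogue: 250, increasing 20% per month.
for month in range(6):
    print(month, round(exponential_total(250, 1.20, month), 2))

# Doubling (rate = 2): starting from a single cell, ten steps already gives 1024.
print(exponential_total(1, 2, 10))
```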

One thing I mull over—the book, and the state test, go through the exponential equation (basically, Initial Amount = 1), along with the transformation model (f(x) = a^(x-c) ± k). I haven’t focused on this in previous classes, because in my experience the kids don’t even get transformations of lines and quadratics. But I’m going to give it a try on Monday.

Anyway. Day 2 is exponential decay, but I start by going over percentage decrease. I am nothing if not optimistic.
[image: Decrease%age]

“So if I take away a third of something, how much is left?”

Pause. Pause some more. Pause still more. I grab three whiteboard pens.

“Rhea, decrease these pens by a third.” Rhea obediently takes one pen.

“Class, how much is left after she decreased the pens by 33%, or a third?”

“TWO!!!”

“Two……?” I wait. No. I sigh, and grab three more pens, getting the one back from Rhea as well.

“Paul, take away a third of these six pens.” Paul takes two pens.

“Class, he’s taken away 33% of the pens. How much is left?”

“FOUR!”

“AUUGGGGHHH!”

It all works out. Seriously. By the end of the exercise, most of the class is shouting back the correct answers as I ask “I take away 30%, how much is left? 35%? 23%?” and the only mistakes they make are place errors—that is, 100-23 does not, in fact, equal 87.
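For completeness, the one-step move the drill is building toward, in code (the percentages are the ones called out in class above):

```python
# "Take away pct" in one step: multiply by (1 - pct).
for pct in (1/3, 0.30, 0.35, 0.23):
    print(f"take away {pct:.0%} -> {1 - pct:.0%} left")
```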

The second day is always better, because it has slowly permeated their skulls that I’m serious about this percentage nonsense, that it has some relationship to the worksheet. So when they ask questions, it’s more of the “could you run this whole percentage decrease by me again? If they take away a third, I have two thirds left? But what’s two thirds as a decimal?” and trust me, this is a big step up for my blood pressure. Well, a step down. And they do the decay modeling and notes with no small degree of interest:
[Worksheet: exponential decay]
They have the model graph on the back, too, for exponential decay:

Total Amount = Initial Amount * Rate^time

Yes, it’s the same equation, so what’s different?

Initial Amount > 0
0 < Rate < 1

By day’s end, they have registered the import of the realization that Estefania has 95 cents left after ten days, and they’ve figured out that Jose is right, that his car is worth more than Stan’s after five years, which they managed by using an equation they built themselves, by golly, rather than by decreasing $25,000 by 5% five times.
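
Here is the same model with a rate between 0 and 1, sketched with the car numbers mentioned above (a $25,000 car losing 5% a year for five years; which car belongs to Jose and which to Stan isn't specified here, so this is just the single calculation):

```python
# Decay is the same equation with 0 < Rate < 1: the car keeps 95% of its value each year.
initial_value = 25_000
rate = 1 - 0.05   # 0.95

for year in range(6):
    print(year, round(initial_value * rate ** year, 2))
# Year 5 comes out to 19344.52, the same result as decreasing $25,000 by 5%
# five separate times, but built from one equation.
```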

You notice, of course, that I’ve spent most of this post talking about the percentage issue, something the kids were first taught back in middle school, rather than the exponential growth/decay functions, the actual new material. This should not come as a shock to regular readers.

Back in March, there was much fuss about a study revealing that algebra and geometry classes aren’t rigorous enough.

Of course the classes aren’t rigorous enough. They can’t be. I refer you again to the false god of elementary school test scores and the Wise Words of Barbie.

This twitter debate between reformers Mike Petrilli and Rishawn Biddle is typical of reform debates about “rigor”. Petrilli wants end of course exams to stop us teachers from pretending to teach a subject. Biddle wants more of the same, just shout louder and MANDATE instruction, particularly to those disenfranchised black and Hispanic youth who are being let down by lousy teachers with low expectations.

Both of them assume that the problem is ineffective teaching, that all us math teachers could actually teach percentages and fractions to all seventh graders if we were just smarter and better. Or maybe they just think we take the easy way out, that it’d be really really hard to teach the kids properly, and what the hell, we get paid no matter what and behind closed doors it’s easier to just go through the motions. Well, sure.

Petrilli’s proposal, end-of-course exams, would trigger a bloodbath. People really don’t seem to understand that; I’d be all in favor of it, if the result were a rethinking of expectations. But of course, what would actually happen is that we’d end the end-of-course exams. That’s what always happens whenever a state or district tries to enforce higher standards (cf Oklahoma and now Texas). And of course, that’s what’s going to happen with Common Core standards, assuming that anyone actually takes them seriously after the testing bloodbath this year. But I’d be all for end-of-course testing if reformers would accept responsibility for the 80% decrease in graduation rates among blacks and Hispanics who would never get past algebra I and understand, finally, that they believe in a myth.

But I digress. And I’m still going to like exponential functions, at least until I crack quadratics. Because you know what? The kids do make progress in understanding percentages, and they learn for the first time not only about exponential functions, but about asymptotes, as I explain Zeno’s Paradox. I don’t use Achilles and the tortoise as an example, but instead talk about how I could throw a stapler right at BTS’s head and know that the stapler would never draw blood because it wouldn’t reach his noggin, so I couldn’t get fired. Or that I could walk to the door and never get there. I do get to the door, of course, and alas, the stapler would eventually crack BTS’s skull. But even though we know that this is true, the tools for proving the paradox false, as opposed to demonstrating it, don’t come around until calculus. They get a kick out of that.

If all that’s not fun enough, I see genuine, honest-to-god intellectual curiosity among most students as they realize that they don’t have the tools to isolate x in the equation 8 = 3^x. That for all these years they’ve been getting along fine with addition/subtraction, multiplication/division, nth power/nth root, but none of those will work here. Which sets us up beautifully for both logs and a proper discussion of inverses, leading into inverse functions. Yes, their skills are still basic, but I can see the glimmering of understanding of the underlying concepts. If the damn state tests would just ask questions about those underlying concepts instead of demanding underlying concepts and advanced operations, I might even be able to get the kids to show that understanding.
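
For what it's worth, here is the payoff they're being set up for, as a small sketch (the base-3 example is the one in the paragraph above; the logarithm is, of course, the new tool they don't have yet):

```python
import math

# Isolating x in 8 = 3^x: addition/subtraction, multiplication/division, and
# nth power/nth root all fail, but the logarithm (the inverse of the
# exponential) does it in one step via change of base.
x = math.log(8) / math.log(3)
print(x)        # about 1.893
print(3 ** x)   # about 8.0, confirming the solution
```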

And in writing up this essay, I am struck by the obvious solution to the percentage problem on day one: I need a worksheet. They fill it out, and not until they are done with that do I give them the worksheets on growth and decay. Naturally, this solution is again a lowering of expectations, a realization that a clear explanation on a blackboard that they can refer to isn’t enough, that I need to give fifteen-to-seventeen-year-olds an activity so the information will sink in and they’ll use the method right away without asking me to explain it all again, group by group. But to hell with expectations. It will be much better for my blood pressure.


Modeling Linear Equations, Part 3

See Part I and Part II.

The success of my linear modeling unit has completely transformed the way I teach algebra.

From Part II, which I wrote at the beginning of the second semester at my last school:

In Modeling Linear Equations, I described the first weeks of my effort to give my Algebra II students a more (lord save me) organic understanding of linear equations. These students have been through algebra I twice (8th and 9th grade), and then I taught them linear equations for the better part of a month last semester. Yet before this month, none of them could quickly generate a table of values for a linear equation in any form (slope intercept, standard form, or a verbal model). They did know how to read a slope from a graph, for the most part, but weren’t able to find an equation from a table. They didn’t understand how a graph of a line was related to a verbal model—what would the slope be, a starting price or a monthly rate? What sort of situations would have a meaningful x-intercept?

This approach was instantly successful, as I related there. Last year, I taught the entire first semester content again in two months before moving on, and still got in about 60% of the Algebra II standards (pretty normal for a low ability class).

So when I began intermediate algebra in the fall, I decided to start right off with modeling. I just toss up some problems on the board–Well, actually, I start with a stick figure cartoon based on this lesson plan:

[Image: stick figure sketch for the modeling problem]

I put it on the board, and ask a student who did middling poorly on my assessment test, “So, what could Stan buy?”

Shrug. “I don’t know.”

“Oh, come on. You’re telling me you never had $45 bucks and a spending decision? Assume no sales tax.”

Tentatively. “He could just buy 9 burritos?”

“Yes, he could! See? Told you you could do it. How many tacos could he buy?”

“None.”

At this point, another student figures it out, “So if he doesn’t buy any burritos, he could buy, like,…”

“Fifteen tacos. Why is it 15?”

“Because that’s how much you can buy for $45.”

“Anyone have another possibility? You? Guy in grey?”

Long pause, as guy in grey hopes desperately I’ll move on. I wait him out.

“I don’t know.”

“Really? Not at all? Oh, come on. Pretend it’s you. It’s your money. You bought 3 burritos. How many tacos can you get?”

This is the great part, really, because whoever I call on, and it’s always a kid who doesn’t want to be in the room, his brain starts working.

“He has $30 left, right? So he can buy ten tacos.”

“Hey, now, look at that. You did know. How’d you come up with ten?”

“It costs $15 to get three burritos, and he has $30 left.”

So I start a table, with Taco and Burrito headers, entering the first three values.

“And you know it’s $15 because….”

He’s worried it’s a trick question. “…it’s five dollars for each burrito?”

I force a couple other unwilling suckers to give me the last two integer entries.

“Yeah. So see how you’re doing this in your head. You are automatically figuring the total cost of the burritos how?”

“Multiplying the burritos by five dollars.”

“And, girl over there, in pink, how do you know how much money to spend on tacos?”

“It’s $3 a taco, and you see how much left you have of the $45.”

“And again with the math in your head. You are multiplying the number of tacos by 3, and the number of burritos by….”

“Five.”

“Right. So we could write it out and have an actual equation.” And so I write out the equation, first with tacos and burritos, and then substituting x and y.

“This equation describes a line. We call it the standard form: Ax + By = C. Standard form is an extremely useful way to describe lines that model purchasing decisions.”

Then I graph the table and by golly, it’s dots in the shape of a line.

[Image: graph of the taco/burrito data]

“Okay, who remembers anything about lines and slopes? Is this a positive or a negative slope?”

Silence. Of course. Which is better than someone shouting out “Positive!”

“So, guy over there. Yeah, you.”

“I wasn’t paying attention.”

“I know. Now you are. So tell me what happens to tacos when you buy more burritos.”

Silence. I wait it out.

“Um. I can’t buy as many tacos?”

“Nice. So what does that mean about tacos and burritos?”

At this point, I usually get some raised hands. “Blue jersey?”

“If you buy more tacos, you can’t buy as many burritos, either.”

“So as the number of tacos goes up, the number of burritos…”

“Goes down.”

“So. This dotted line is reflecting the fact that as tacos go up, burritos go down. I ask again: is this slope a positive slope or a negative slope?” and now I get a good smattering of “Negative” responses.

From there, I remind them of how to calculate a slope, which is always great because now, instead of it just being the 8 thousandth time they’ve been given the formula, they see that it has direct relevance to a spending decision they make daily. The slope is the reduction in burritos they can buy for every increased taco. I remind them how to find the slope from both the graph and the table itself.

“So I just showed you guys the standard form of a line, but does anyone remember the equation form you learned back in algebra one?”

By now they’re warming up as they realize that they do remember information from algebra one and earlier, information that they thought had no relevance to their lives but, apparently, does. Someone usually comes up with the slope-intercept form. I put y=mx+b on the board and talk the students through identifying the parameters. Then, using the taco-burrito model, we plug in the slope and y-intercept and the kids see that the buying decision, one they are extremely familiar with, can be described in math equations that they now understand.
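
A minimal sketch of the taco/burrito model, assuming x is tacos at $3 and y is burritos at $5 (the post writes the equation in tacos and burritos first, then x and y, so which letter gets which food is my choice):

```python
# Standard form: 3x + 5y = 45, i.e. $3 tacos plus $5 burritos with $45 to spend.
TACO, BURRITO, BUDGET = 3, 5, 45

# The table the class builds: pick a number of tacos, see how many burritos are left.
for tacos in (0, 5, 10, 15):
    burritos = (BUDGET - TACO * tacos) / BURRITO
    print(tacos, burritos)   # (0, 9), (5, 6), (10, 3), (15, 0)

# Rearranged into slope-intercept form, y = mx + b:
slope = -TACO / BURRITO      # -3/5: burritos given up for each extra taco
intercept = BUDGET / BURRITO # 9: burritos if no tacos are bought
print(f"y = {slope}x + {intercept}")
```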

So then, I put a bunch of situations on the board and set them to work, for the rest of that day and the next.

[Handout: data modeling problems]

I’ve now kicked off three intermediate algebra classes cold with this approach, and in every case the kids start modeling the problems with no hesitation.

Remember, all but maybe ten of the students in each class are kids who scored below basic or lower in Algebra I. Many of them have already failed intermediate algebra (aka Algebra II, no trig) once. And in day one, they are modeling linear equations and genuinely getting it. Even the ones who are unhappy (more on that in a minute) are getting it.

So from this point on, when a kid sees something like 5x + 7y = 35, they are thinking “something costs $5, something costs $7, and they have $35 to spend” which helps them make concrete sense of an abstract expression. Or y = 3x-7 means that Joe has seven fewer than 3 times as many graphic novels as Tio does (and, class, who has fewer graphic novels? Yes, Tio. Trust me, it’s much easier to make the smaller value x.)

Here’s an early student sample, from my current class, done just two days in. This is a boy who traditionally struggles with math—and this is homework, which he did on his own—definitely not his usual approach.

[Image: student sample, modeling homework]

Notice that he’s still having trouble figuring out the equation, which is normal. But three of the four tables are correct (he struggles with perimeter, also common), and two of the four graphs are perfect—even though he hasn’t yet figured out how to use the graph to find the equation.

So he’s doing the part he’s learned in class with purpose and accuracy, clearly demonstrating ability to pull out solutions from a word model and then graph them. Time to improve his skills at building equations from graphs and tables.

After two days of this, I break the skills up into parts, reminding the weakest students how to find the slope from a graph, and then mixing and matching equations with models, like this:

[Handout: mix and match, equations and models]

So now, I’m emphasizing stuff they’ve learned before, but never been able to integrate because it’s been too abstract. The strongest kids in the class are moving through it all much faster, and are often into linear inequalities after a couple weeks.

Then I bring in one of my favorite handouts, built the first time I did this all a year ago: ModelingDatawithPoints. Back to word models, but instead of the model describing the math, the model gives them two points. Their task is to find the equation from the points. And glory be, the kids get it every time. I’m not sure who’s happier, them or me.
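
In case it's useful, here is the two-point calculation the handout builds toward, as a small sketch (the points themselves are made up for illustration):

```python
def line_from_points(p1, p2):
    """Recover y = mx + b from two points on the line."""
    (x1, y1), (x2, y2) = p1, p2
    m = (y2 - y1) / (x2 - x1)   # slope: rise over run
    b = y1 - m * x1             # back out the intercept from either point
    return m, b

m, b = line_from_points((2, 11), (6, 23))
print(f"y = {m}x + {b}")        # y = 3.0x + 5.0
```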

At some point in the first week, I give them a quiz, in which they have to turn two different models into tables, equations, and graphs (one from points), identify an equation from a line, identify an equation from a table, and graph two points to find the equation. The last question is, “How’s it going?”

This has been consistent through three classes (two this semester, one last). Most of the kids like it a lot and specifically tell me they are learning more. The top kids often say it’s very interesting to think of linear equations in this fashion. And about 10-20% of the students this first week are very, very nervous. They want specific methods and explicit instructions.

The day after the quiz, I address these concerns by pointing out that everyone in the room has been given these procedures countless times, and fewer than 30% of them remember how to apply them. The purpose of my method, I tell them, is to give them countless ways of thinking about linear equations, let them come up with their own preferred methods, and build their ability to move from one form to another all at once, rather than focusing on one method, then another, and so on. I also point out that almost all the students who said they didn’t like my method did pretty well on the quiz. The weakest kids almost always like the approach, even with initially weak results.

After a week or more of this, I move on to systems. First, solving them graphically—and I use this as a reason to explicitly instruct them on sketching lines quickly, using one of three methods:

[Handout: sketching lines]

Then I move on to models, two at a time. Last semester, my kids struggled with this and I didn’t pick up on it until a month later. This last week, I was alert to the problems they were having creating two separate models within a problem, so I spent an extra day focusing on the methods. The kids approved, and I could see a much better understanding. We’ll see how it goes on the test.

Here’s the boardwork for a systems model.
[Image: systems model boardwork]

So I start by having them generate solutions to each model and matching them up, as well as finding the equations. Then they graph the equations and see that the intersection, the graphing solution, is identical to the values that match up in the tables.

Which sets the stage for the two algebraic methods: substitution and combination (aka elimination, addition).
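
Sketched out, both methods on one small system (the second equation is invented for illustration; the first is the taco/burrito line from earlier):

```python
# System:  3x + 5y = 45
#          x +  y  = 11

# Substitution: solve the second equation for x and plug it into the first.
#   x = 11 - y  ->  3(11 - y) + 5y = 45  ->  33 + 2y = 45  ->  y = 6, x = 5
y = (45 - 3 * 11) / (5 - 3)
x = 11 - y
print("substitution:", x, y)   # 5.0 6.0

# Combination (elimination): multiply the second equation by 3 and subtract it
# from the first, which knocks out x:  2y = 45 - 33  ->  y = 6, then x = 11 - y.
y = (45 - 3 * 11) / 2
x = 11 - y
print("combination:", x, y)    # 5.0 6.0
```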

Phew.

Last semester, I taught modeling to my math support class, and they really enjoyed it:

[Images: math support slope work and data modeling samples]

Some sample work–the one on the far right is done by a Hispanic sophomore who speaks no English.

Okay, back at 2000 words. Time to wrap it up. I’ll discuss where I’m taking it next in a second post.

Some tidbits: modeling quadratics is tough to do organically, because there are so few real-life models. The velocity problems are helpful, but since they’re the only type they are a bit too canned. I usually use area questions, but they aren’t nearly as realistic. Exponentials, on the other hand, are easy to model with real-life examples. I’m adding in absolute value modeling this semester for the first time, to see how it goes.

Anyway. This works a treat. If I were going to teach algebra I again (nooooooo!) I would start with this, rather than go through integer operations and fractions for the nineteenth time.


Algebra 1 Growth in Geometry and Algebra II

Last September, I wrote about my classes and the pre-algebra/Algebra 1 assessment results.

My school covers a year of instruction in a semester, so we just finished the first “year” of courses. I start with new students and four preps on Monday. Last week, I gave them the same assessment to see if they’d improved.

Unfortunately, the hard drive on my school computer got wiped in a re-imaging. This shouldn’t have been a problem, because I shouldn’t have had any data on the hard drive, except I never got put on the network. Happily, I use Dropbox for all my curriculum development, so an entire year’s worth of intellectual property wasn’t obliterated. I only lost the original assessment results, which I had accidentally stored on the school hard drive. I should have entered the scores in the school grading system (with a 0 weight, since they don’t count towards the grade) but only did that for geometry, the only class I can directly compare results with.

My algebra II class, though, was incredibly stable. I only lost three students, one of whom got a perfect score—which the only new addition to the class also got, so balance maintained. The other two students who left got around 10-15 wrong, so were squarely in the average at the time. I feel pretty comfortable that the original scores didn’t change substantially. My geometry class did have some major additions and removals, but since I had their scores I could recalculate.

|              | Mean                | Median | Mode | Range |
|--------------|---------------------|--------|------|-------|
| Original     | just above 10       | 9.5    | 7    | 22    |
| Recalculated | just below 10 (9.8) | 8      | 7    | 22    |

I didn’t have the Math Support scores, and enough students didn’t take the second test that comparisons would be pointless.

One confession: Two Algebra II students, the weakest two in the class, who did no work, scored 23 and 24 wrong, which was 11 more wrong than the next worst score. Their scores added an entire point to the average wrong and increased the range by 14 points, so, you know, I just said bye and excluded them rather than let them distort the results of the other 32 kids. (I don’t remember exactly, but the original A2 tests had five or six 20+ wrong scores.)

So here’s the original September graph and the new graph of January:

[Charts: algebra assessment distributions, September and January]

The geometry class was bimodal: 0 and 10. Excel refused to acknowledge this and I wasn’t sure how to force it. The 10s, as a group, were pretty consistent—only one of them improved by more than a point. The perfect scores ranged from 8 wrong to 2 wrong on the first test.

[Chart: geometry class algebra growth]

In short, they learned a lot of first year algebra, and that’s because I spent quite a bit of time teaching them first year algebra. In Algebra II, I did it with data modeling, which was a much more sophisticated approach than what they’d had before, but it was still first year algebra. In geometry, I minimize certain standards (proofs, circles, solid shapes) in favor of applied geometry problems with lots of algebra.

And for all that improvement, a still distressing number of students answered x^2 + 12 when asked what the product of (x+3) and (x+4) was, including two students who got an A in the class. I beat this into their heads, and STILL some of them forget that.

Some folks are going to draw exactly the wrong impression. “See?” these misguided souls will say, nodding wisely. “Our kids just aren’t being taught properly in early grades. Better standards, better teachers, this problems’s fixed! Until then, this poor teacher has to make up the slack.” In short, these poor fools still believe in the myth that they’ve never been taught.

When in fact, they were taught. Including by me—and I don’t mean the “hey, by the way, don’t forget the middle term in binomial multiplication”, but “you are clubbing orphan seals and making baby Jesus cry when you forget the middle term” while banging myself on the head with a whiteboard. And some of them just forgot anyway.

I don’t know how my kids will do on their state tests, but it’s safe to say that the geometry and second year algebra I exposed them to was considerably less than it would have been had their assessment scores at the beginning of class been the ones they got at the end of class. And because no one wants to acknowledge the huge deficit half or more of each class has in advanced high school math, high schools won’t be able to teach the kids the skills they need in the classes they need—namely, prealgebra for a year, “first year” algebra for two years, and then maybe some geometry and second year algebra. If they do okay on the earlier stuff.

Instead, high schools are forced to pretend that transcripts reflect reality, that all kids in geometry classes are capable of passing a pre-algebra test, much less an algebra one test. Meanwhile, reformers won’t know that I improved my kids’ basic algebra skills whilst still teaching them a lot of geometry/algebra II, because the tests they’ll insist on judging me with will assume a) that the kids had that earlier material mastered or b) that I could just catch them up quickly because after all, the only problem was the kids’ earlier teachers had never taught them.


Algebra Terrors

A day or so before the school year began, I went to an empty classroom that had a supply cabinet. This classroom was way better than mine. It was 3 or 4 feet wider, had shelving and a smart board. Now I didn’t care much about the smart board, but all smartboards have document cameras, which my room does not.

“Hey. This is a great room. Who gets it?”

“I guess the other new teacher.”

“When’s he coming?”

“I don’t think he’s even hired yet. You know, you should ask for this room! You’re here first.”

“Yeah, I think I will. It can’t hurt.”

So off I went to find an administrator, and the first one I ran into was the AVP of discipline and scheduling. (As a sidenote, the conversation recorded below is one of only two I’ve ever had with him.)

“Hi. Please don’t view this as a complaint of any sort, but I really like to teach with document cameras, and I notice that room 1170E has a smartboard. Hank (not his real name) suggested I ask if I could switch rooms?”

“To 1170E? Oh, yes, that’s for Ramon. I suppose I could switch rooms, but we’d need to change schedules as well. You see, we got the Promethean smartboard funding as part of our algebra initiative, and we committed to give those boards to teachers teaching Algebra I at least 60% of the time. If you’re interested….”

You know in Terminator 2, when Linda Hamilton has just finally broken out of her padded cell, broken Earl Boen’s arm, beaten the crap out of three security guards and is waiting for the elevator? Freedom is there, baby. She can taste it. She can get her son, escape to Mexico, stop the machines, save the world. All she needs is the ding of the elevator door.

Ding!

….and out of the elevator steps

[Image: Arnold as the Terminator, labeled “Algebra I”]


NOOOOOOOOOOOOOOOooooooooooooooooo!
“I was told that you had expressed a strong preference to teach geometry and intermediate algebra. But I’m always happy to find interested algebra I teachers….”

“No, no, sir, no really. It’s fine. I do have a very strong preference to teach geometry and intermediate algebra, you were correctly informed, and I am happy with my current room. It’s fantastic. I can deal without a camera, it’s fine.”

“You’re sure?” Clearly, this man is an evil sadist. “You really do seem to like the document camera, and we prefer that the rooms go to teachers who will use them…”


No! No! I’ll stop! The machines can win! Take my son! Just don’t make me go back!!!

****************************************

It was just a bad scare. I’m teaching geometry and algebra 2. Well, Math Support, but even though the kids are weaker, I’d rather teach Math Support than Algebra I.

Math teachers think this story is very funny.

In retrospect, my second year of teaching was my most brutal, thanks to my schedule of all algebra I, all the time. I learned a lot. I never want to go back. Oh, sure, I’d like to teach one class of Algebra I, particularly to see if my data modeling lessons work as well in algebra I as in algebra II. But I do not want to be an “algebra I specialist”, and never, ever, EVER want to devote anything more than a class a year to algebra I. I’ve said it many times, but I’m always ready to bore folks: high school algebra I classes should convince anyone—from loopy liberal progressive to anti-teacher-union, tenure-hating eduformer—that our educational policy is twisted and broken beyond all recovery.

So why bring it up now? Because until I saw this article, I’d forgotten the very worst part: A Double Dose of Algebra (ht: Joanne Jacobs).

Yes, I didn’t just teach straight algebra I classes. I taught a double class of Algebra Intervention. Let’s switch T2 characters for just a moment, shall we?

This is what happens when I’m reminded of that intervention class without time to prepare myself.

What is Algebra Intervention, or “double dose algebra”? Well, it’s this brilliant strategy of identifying kids who are really weak in math and increasing their hours of torture.

The best study of this approach, by Takako Nomi and Elaine Allensworth, examined the short-term impact of such a policy in the Chicago Public Schools (CPS), where double-dose algebra was implemented in 2003. …. Nomi and Allensworth reported no improvement in 9th-grade algebra failure rates as a result of this intervention, a disappointing result for CPS. The time frame of their study did not, however, allow them to explore longer-run outcomes of even greater importance to students, parents, and policymakers. (emphasis mine)

So double dose algebra didn’t work. Did that stop them? Hahahahah! Of course not! They just commissioned another study! One that would allow them to explore “outcomes of even greater importance” to students, like “will I make an extra $50,000 a year to compensate me for the time I spent in this torturous hell?”

Using data that track students from 8th grade through college enrollment, we analyze the effect of this innovative policy by comparing the outcomes for students just above and just below the double-dose threshold. These two groups of students are nearly identical in terms of academic skills and other characteristics, but differ in the extent to which they were exposed to this new approach to algebra. Comparing the two groups thus provides unusually rigorous evidence on the policy’s impact.

Wait. You checked the kids just below and just above the threshold? So you only compared the strongest intervention students with the weakest regular students? Well, golly. Did you, perchance, check how the weakest regular students did compared to the weakest intervention students? Was it substantially different from the gap between the strongest intervention and weakest intervention?

The benefits of double-dose algebra were largest for students with decent math skills but below-average reading skills, perhaps because the intervention focused on written expression of mathematical concepts.

Guys, half of all regular high school algebra students can’t add fractions or work with negative numbers—that is, they do not have decent math skills. So what the hell is relevant about progress made by intervention students with “decent math skills”?

With the new policy, CPS offered teachers of double-dose algebra two specific curricula called Agile Mind and Cognitive Tutor, stand-alone lesson plans they could use, and three professional development workshops each year, where teachers were given suggestions about how to take advantage of the extra instructional time.

Eight days of PD. EIGHT DAYS! In three-plus years of teaching, I’ve taken 1.5 days off for being sick. In one year of teaching algebra and algebra intervention, I was required to leave the classroom for 8 days. The PD was utterly useless. The lunches with the other math teachers were good—lots of conversations, sharing of lessons, venting, and so on. They’d do better to just give us the money and an extra half hour every month for lunch.

CPS also strongly advised schools to schedule their algebra support courses in three specific ways. First, double-dose algebra students should have the same teacher for their two periods of algebra. Second, the two algebra periods should be offered consecutively. Third, double-dose students should take the algebra support class with the same students who are in their regular algebra class. Most schools followed these recommendations in the initial year. In the second year, schools began to object to the scheduling difficulties of assigning the same teacher to both periods, so CPS removed that recommendation.

It wasn’t just the schools that objected, I’m betting. I taught intervention the first year it was offered by the school. Of the three intervention teachers, one (a TFAer) turned in her resignation in January purely because she felt beaten down by intervention. Another teacher, an algebra specialist, a near-phlegmatically calm Type B, burst into tears when she met with the principal to make absolutely sure she wasn’t given an intervention class the next year.

I was the third. I never complained. I was under continual pressure because I wouldn’t tolerate three kids who were deliberately disrupting the class. The administration hinted I was racist, that I was exaggerating their behavior, and only relented on the pressure when my induction adviser witnessed a middling incident of blatant misbehavior and blew a gasket when the AVP of discipline shrugged it off until he learned that she’d seen it. Admin got a long letter from her and started to make the kids’ lives hell.

The following year, the school dropped the requirement for consecutive periods and allowed two teachers to split the course, rather than requiring the same teacher to do both sections. That same year, the intervention teachers got called into a room by the district and were given a blistering come to jesus meeting in which they were informed that their pass rates better go way, way up until they were as good as the pass rates from last year, which were clearly a goal to be attained. Of course, that last year, when they were dumping all that pressure on me, they never said “Yeah, too many referrals but hey, your pass rate is awesome. You’re only failing two kids, who never show up. Great job!”

This year, the school has dropped the requirement that the students all be in the same class. Hey. That sounds familiar, doesn’t it?

The pressure on the teachers is tremendous. So the schools try to find a way to pay lip service to the method—we’re offering intervention for our weak students!—without all their teachers quitting on them simultaneously. Intervention is brutal on teachers.

The recommendation that students take the two classes with the same set of peers increased tracking by skill level. All of these factors were likely to, if anything, improve student outcomes. We will also show, however, that the increased tracking by skill placed double-dose students among substantially lower-skilled classmates than non-double-dose students, which could have hurt student outcomes.

In addition to the strain on teachers, intervention is a huge hassle for administration and has an unintended consequence that escapes the notice of people who haven’t talked to an AVP responsible for the master schedule. But the reality is that a group of kids who must take two classes back to back end up taking most, if not all, of their classes together.

Say a school has 10 freshman English classes but only three of them are double block remedial and (please note, this will come up again) many of the kids who take double block algebra also require double block English. The intervention freshmen are in periods 3 and 4 for algebra intervention, and the only other double block English class they can take is 5 and 6, leaving periods 1 and 2 open. Only one freshman PE class is available in period 1, so science (bio or general) has to go in period 2. All done. So all the kids in algebra intervention periods 3 and 4 who are also in double block English take all their classes together. For every intervention class, some 20-30 underachieving, low-incentive kids are moving through their entire day together, in non-remedial and remedial classes both. Of course, since most intervention kids are weak in all their subjects, this means that their classes have a disproportionately high number of low achievers—all of whom spend their entire day together, socializing. Or planning ways to wreak havoc. The troublemakers in my class arranged signals that they would use to disrupt classes—all their classes. They’d pick a code word, and whenever the teacher said that word, they’d all start laughing loudly, or squeaking their shoes, or sneezing.

I’m a big fan of tracking. I am vehemently opposed to taking a group of low achieving kids who are already buddies, already with next to no investment in school, already really annoyed at having to take a double dose of math—and give them every single class together, so they can reinforce each other in noncompliance and have an entire school day to socialize.

And then this section, which caused more flashbacks:

Overall, 55 percent of CPS students scored below the 50th percentile and thus should have been assigned to double-dose algebra, but only 42 percent were actually assigned to the support class. In addition, some students took double-dose algebra, even though they scored above the cutoff on the exam.

You’re thinking, wait. Some of the weak kids didn’t get intervention, and some of the strong kids did? That’s a weird fluke, isn’t it?

And so, another anecdote.

My strongest intervention kids had taken Algebra I the year before. Each of these six kids had scored higher on their state test than my average score for all my non-intervention algebra students. Yes, you read that right. Six of my intervention kids were good enough for the top half of my non-intervention algebra class. Not just better than my worst. Better than HALF the 100 students in my non-intervention classes. Two of them had actually achieved Basic on the previous year’s test. I got them to bring in their test scores and show them to the AVP of Instruction, demanding they be put into normal algebra (leaving me out of it, of course). One of them was put into my regular algebra class, and got an A-. The other four missed Basic by just a few points and despite my asking on their behalf, were required to take intervention. This despite the fact that I had over a dozen non-intervention freshmen who’d scored Below Basic or Far Below Basic. None of it mattered.

I was so foolish as to write the AVP of Instruction saying randomly, casually, something like “Hey, okay, so I can’t move strong students out. But so long as I’m teaching an intervention class for really really weak students, could I move some of my weak non-intervention class IN? Some of them are even Resource (sped) students, so they could substitute the intervention class for their guided studies class, so it wouldn’t create a scheduling disaster (sped kids get a study hall). Here’s a list.”

AVP of Instruction wrote back, in all caps, “THESE STUDENTS ARE SOPHOMORES. SOPHOMORES CAN’T TAKE INTERVENTION.”

Three weeks later, as God is my witness, the AVP of Instruction sends me a note, “I’m moving Fred McInery [not his real name] into your intervention class. He is a weak math student who needs more support.”

I look up Fred. He is a sophomore. I am very excited, because I am a moron, and send her a note. “Hey, great! We’re putting sophomores in intervention now? Could we revisit my list? I really think it will help these extremely weak students succeed in math.”

She writes back in all caps, “THESE STUDENTS ARE SOPHOMORES. SOPHOMORES CAN’T TAKE INTERVENTION.”

WE HAVE ALWAYS BEEN AT WAR WITH EASTASIA.

That story is an amusing flashback, even if it is crazy-making. Here is a horrible one:

We shall call her Denise. She is a doll. She was in my intervention class and had extremely weak skills and was the propaganda child for intervention, the one that everyone is thinking of when they propose it, because she worked her ass off and actually became better at math. She was not “just below the cutoff point”, either, but an FBB child who surreptitiously counted on her fingers to add 4 + 2. But conceptually, she got it. In the first semester, she did so poorly she was one of my contract students. She improved dramatically on her test, missing Basic by just a point. She had the third highest state test score of my intervention students, and passed my class.

Not only didn’t the school move her on to Geometry, but they put her into intervention again. (Yes. Now they had a sophomore intervention class.) When Denise told me this, I went quietly berserk and emailed the AVP of Instruction. It is not the same AVP. This one is worse.

Keep in mind, this second year at the same school, I am teaching Geometry, thank all the gods, and have two of my last year’s intervention kids taking my class even though they received slightly lower state test scores than Denise. Five others of my kids have also moved on to geometry with lower state test scores. Denise and three others were kept behind. I email the AVP of Instruction (a different one, as last year’s AVP has been promoted to principal of another school; this one is much worse) and point out these facts. I do not point out that I can discern no organizing principle behind this decision, that I suspect a very disorganized AV principal behind it. I am very polite; hey, this is just some oversight? Want to make sure it gets cleared up.

I write three notes, all very polite, and finally, a month after school starts, Denise gets moved….to a regular Algebra class. I gnash my teeth, but Denise is thrilled and thanks me profusely.

I see Denise at the year-end, and ask how she’s doing. “Great. But I failed the first semester of geometry, so I’ll have to go to summer school.”

“What? They put you in geometry?”

“Yeah, they said you advised it. But they didn’t move me until, like, November, so I failed. But that’s okay. I did good second semester, and I’m going to pass it over the summer.”

I guess it worked out okay, ultimately. But had she been put in the geometry class originally, she’d have had her summer.

Double-dosing had an immediate impact on student performance in algebra, increasing the proportion of students earning at least a B by 9.4 percentage points, or more than 65 percent. It did not have a significant impact on passing rates in 9th-grade algebra, however, or in geometry (usually taken the next year). Double-dosed students were, however, substantially more likely to pass trigonometry, a course typically taken in 11th grade. The mean GPA across all math courses taken after freshman year increased by 0.14 grade points on a 4.0 scale.

(emphasis mine)

Clearly, most students did not do all that well. As the study acknowledges, the low-achieving students did not benefit at all from the intervention; the students most likely to benefit were those who just missed the cutoff. More on that later.

Here’s what the study doesn’t make clear: many high school algebra students never make it to trig. They take it twice in high school, then take geometry twice. Or they take algebra once, geometry twice, and algebra 2 without trig (that’s the class I teach). So are they only counting the students who made it to trig?

The more meaningful stat would be the percentage of double-dosed kids who made it to trig vs. the non-double-dosed kids who achieved same. Reading this passage, the study appears to be saying that all the kids made it to trig and hahahahahaha, no. Not happening.

And since that’s not happening, then who, exactly, is being compared in the GPA? All the kids, or just the ones that made it to trigonometry? Presumably, just that set, because otherwise, the GPA number isn’t worth much. Hey, the double dose kids who flunked algebra twice and made through geometry by their senior year had a GPA .14 points higher than the single dose kids who made it through trig. Whoo and hoo.

It is important to note that many of these results are much stronger for students with weaker reading skills, as measured by their 8th-grade reading scores. For example, double-dosing raised the ACT scores of students with below-average reading scores by 0.22 standard deviations but raised above-average readers’ ACT scores by only 0.09 standard deviations. The overall impact of double-dosing on college enrollment is almost entirely due to its 13-percentage-point impact on below-average readers (see Figure 3). This unexpected pattern may reflect the intervention’s focus on reading and writing skills in the context of learning algebra.

(emphasis mine)

Oh, yes. That’s what we do in these algebra intervention classes. We focus on reading and writing! We’re given a bunch of kids who add 8 to 6 on their fingers, and we figure their struggle comes from not being able to read the word problems. So we put up a word wall and teach them five new terms and suddenly their reading skills skyrocket wildly.

Or—and this is just a wild, random thought—perhaps my last school isn’t the only school in which Set A = {names of students taking double dose algebra} and Set B = {names of students taking double dose English}, and in a Venn diagram the two circles largely overlap?

You say oh, don’t be silly, ER. Of course they’d account for the possibility that the double dose algebra kids are also getting a double dose of reading intervention! And then not mention it! And I say, you don’t read much educational research, do you?

Because keep in mind the conclusion of this research:

As a whole, these results imply that the double-dose policy greatly improved freshman algebra grades for the higher-achieving double-dosed students, but had relatively little impact on passing rates for the lower-achieving students.

Apart from that, Mrs. Lincoln, how’d you like the friggin’ play?

Look. None of my outraged noise makes any sense at all if you don’t realize that, in the world of high school math, the kids who benefited, according to this study, kids achieving just below the passing standard, are WAY ABOVE AVERAGE for that population, particularly in a Title I school. Intervention exists because these schools have dozens, if not hundreds, of algebra students who have taken the course three times and still score Far Below Basic. It does not exist to help kids just below the 50% mark in math get better scores in reading, marginally higher grades and ACT scores, and better Trig scores—if they get to trig, which the normal intervention kid does not.

What people fondly imagine algebra intervention to do is this: kids are just a little behind, you know? They just need some extra time learning integer operations and fractions. They didn’t learn it the FIRST FIFTEEN TIMES they were taught it, so all they really need is another hour or so a day and they’ll be right up there with the rest of them, all right? And if they aren’t, well, it’s those damn teachers who just don’t want to work with “those kids”, and we’ll just have to find more teachers who really, really care about these kids who just need a few hours more help than the others. (Yes. This is the myth of “They’ve never been taught…..”)

Meanwhile, forty percent of the freshman class comes in having taken algebra once and scored far below basic or barely below basic, and are randomly assigned to double block or no double block using a dartboard, from what I can see. The teachers are dealing with the same lack of basic skills in both double and single block algebra, and rapidly realize (if they didn’t know already) that the kids who don’t know integer operations and fractions have this gap because they aren’t terribly bright. They can’t come up with an intervention vs. non-intervention approach, because some kids in the intervention class don’t need support while some kids in the non-intervention class do. But in the non-intervention classes, the teachers only have to deal with 3-8 kids with low skills, while in the intervention classes it’s 14-15 out of 20. So the only thing different about the intervention classes is monstrously bad behavior and more time in hell.

All this, mind you, so that we can do research that reveals no real improvement in outcomes.

But I’m out of it, baby. It’s enough to make me believe in god. Death to algebra intervention.


But the nightmares, they won’t stop until it’s destroyed!


Teaching Math vs. Doing Math

Justin Reich of EdWeek (not to be confused with Justin Baeder of EdWeek) wrote enthusiastically of a new study, asking What If Your Word Problems Knew What You Liked?:

Last week, Education Week ran an article about a recent study from Southern Methodist University showing that students performed better on algebra word problems when the problems tapped into their interests. …The researchers surveyed a group of students, identified some general categories of students’ interests (sports, music, art, video games, etc.), and then modified the word problems to align with those categories. So a problem about costs of new home construction ($46.50/square foot) could be modified to be about a football game ($46.50/ticket) or the arts ($46.50/new yearbook). Researchers then randomly divided students into two groups, and they gave one group the regular problems while the other group of students received problems aligned to their interests.

The math was exactly the same, but the results weren’t. Students with personalized problems solved them faster and more accurately (emphasis mine), with the biggest gains going to the students with the most difficulty with the mathematics. The gains from the treatment group of students (those who got the personalized problems) persisted even after the personalization treatment ended, suggesting that students didn’t just do better solving the personalized problems, but they actually learned the math better.

Reich has it wrong. From the study:

Students in the experimental group who received personalization for Unit 6 had significantly higher performance within Unit 6, particularly on the most difficult concept in the unit, writing algebraic expressions (10% performance difference, p<.001). The effect of the treatment on expression-writing was significantly larger (p<.05) for students identified as struggling within the tutoring environment (22% performance difference). Performance differences favoring the experimental group for solving result and start unknowns did not reach significance (p=.089). In terms of overall efficiency, students in the experimental group obtained 1.88 correct answers per minute in Unit 6, while students in the control group obtained 1.56 correct answers per minute. Students in the experimental group also spent significantly less time (p<.01) writing algebraic expressions (8.6 second reduction). However, just because personalization made problems in Unit 6 easier for students to solve, does not necessarily mean that students learned more from solving the personalized problems.

(bold emphasis mine)

and in the Significance section:

As a perceptual scaffold (Goldstone & Son, 2005), personalization allowed students to grasp the deeper, structural characteristics of story situations and then represent them symbolically, and retain this understanding with the support removed. This was evidenced by the transfer, performance, and efficiency effects being strongest for, or even limited to, algebraic expression-writing (even though other concepts, like solving start unknowns, were not near ceiling).

So the students who got personalized problems demonstrated an improved ability to model, but they did not demonstrate improved accuracy in solving, at least not to the same standard.

I tweeted this as an observation and got into a mild debate with Michael Pershan, who runs a neat blog on math mistakes. Here’s the result:

I’m like oooh, I got snarked at! My own private definition of math!

But I hate having conversations on Twitter, and I probably should have just written a blog entry anyway.

Here’s my point:

Yes, personalizing the context enabled a greater degree of translation. But when did “translating word problems” become, as Michael Pershan puts it, “math”? Probably about 30 years ago, back when we began trying to figure out why some kids weren’t doing as well in math as others were. We started noticing that word problems gave kids more difficulty than straight equations, so we started focusing a lot of time and energy on helping students translate word problems into equations—and once the problems are in equation form, the kids can solve them, no sweat!

Except, in this study, that didn’t happen. The kids did better at translating, but no better at solving. That strikes me as interesting, and clearly, the paper’s author also found it relevant.

Pershan chastised me, a tad snootily, for saying the kids “didn’t do better at math”. Translating math IS math. He cited the Common Core standards showing the importance of data modeling. Well, yeah. Go find a grandma and teach her eggsucking. I teach modeling as a fundamental in my algebra classes. It makes sense that Pershan would do this; he’s very much about the why and the how of math, and not as much about the what. Nothing wrong with this in a math teacher, and lord knows I do it as well.

But we shouldn’t confuse the distinction between teaching math and doing it. So I asked the following hypothetical: Suppose you have two groups of kids given a test on word problems. Group 1 translates each problem impeccably into an equation that is then solved incorrectly. Group 2 doesn’t bother with the equations but gives the correct answer to each problem.

Which group would you say was “better at math”?

I mean, really. Think like a real person, instead of a math teacher.

Many math teachers have forgotten that for most people, the point of math is to get the answer. Getting the answer used to be enough for math teachers, too, until kids stopped getting the answer with any reliability. Then we started pretending that the process was more important than the product. Progressives do this all the time: if you can’t explain how you did it, kid, you didn’t really do it. I know a number of math teachers who will give a higher grade to a student who shows his work and “thinking”, even if the answer is completely inaccurate, and give zero credit to a correct answer by a student who did the work in his head.

Not that any of this matters, really. Reich got it wrong. No big deal. The author of the study did not. She understood the difference between translating a word problem into an equation and getting the correct answer.

But Pershan’s objection—and, for that matter, the Common Core standards themselves—shows how far we’ve gone down the path of explaining failure over the past 30-40 years. We’ve moved from not caring how they defined the problem to grading them on how they defined the problem to creating standards so that now they are evaluated solely on how they define the problem. It’s crazy.

End rant.

Remember, though, we’re talking about the lowest ability kids here. Do they need models, or do they need to know how to find the right answer?