Handling the Teacher Perks

Before turning teacher, I spent all but five years as a temp worker, self-employed or on contract. Unemployment? A hassle I didn’t bother with the few times I was eligible. Retirement? My very own self-funded SEP-IRA, no employer matching. Paid vacation and sick leave? Outside of those five years, I never had any.

Going from that life to public school teaching was kind of like Neal Stephenson’s description (excerpted from In the Beginning was the Command Line) of the guy raised from early childhood by carpenters with only a Hole Hawg for a drill, who then finally meets up with a puny homeowner’s version.

What the hell. With so much free stuff, how can you call this work?

From Veterans Day to the first week of the New Year, over three weeks off, the bulk of them from mid-December to early January. Five-plus days off at spring break, and two months off in the summer. Eleven days of sick leave that accrue, and two “use it or lose it” days. I get the same amount of pay every single month. A guaranteed pension, already comfortably vested, probably 30% at retirement—not bad for a late entry.

Plus, I hear it’s hard to get fired.

I clearly remember watching the perks of corporate employment slowly be stripped away back in my twenties, perks that few people under 50 can even imagine. So it’s bizarre to have entered a profession where it feels like the 80s again.

Now, I’m wondering if I’m getting used to it.

In the previous five years of teaching, my collective time out of the classroom was three sick days and six mandated professional development days. This year, I was out of class for nearly ten days of professional obligations: three days for an honest-to-god, out-of-state education conference, two-plus days for mentoring and induction responsibilities, and four days of Common Core testing.

I felt very guilty about all this time off, and without question the absences impacted instruction time and coverage. So much so that when I came down with a really severe case of food poisoning (you know those rotisserie chickens? Used to love them. Hope I eventually trust them again) during testing week, I came in anyway, because I knew my absence would wreak havoc both on administration’s testing schedules and my carefully arranged coverage plans (I was missing alternate classes during the week). I went four days munching crackers and chugging that weird chalky pink stuff, previously unknown to me.

In retrospect this struck me as idiotic, so I went to the principal’s secretary and asked how to request time off. That’s when I formally learned I had 13 days a year, including the two use-it-or-lose-it days, which I’ve been losing for the past five years. I took a whole day and a half just for a family graduation 10 hours away, when I normally would have left Friday afternoon and come back Sunday night.

More evidence: for the first time in eight summers, six of them as a teacher, I decided to forgo employment (part-time and no benefits, of course) at my favorite hagwon, where I usually act as chief lunatic for book club, PSAT prep, and occasionally geometry.

Why? I wanted more time off.

This wasn’t a sudden decision. Last year it finally sunk in that despite the easy hours and students, the elapsed time of my hagwon day clocked in at nine hours: three on, three off, three on, for eight weeks. While this hadn’t seemed punitive with a five-minute commute, the schedule lost much of its charm when I moved 45 minutes away. Meanwhile, the eight-week schedule left just eight uninterrupted days off at the end of summer.

Yes. The four weeks I am granted throughout the year are not enough. I want more of the eight uninterrupted weeks. It shames me.

But there’s hope. If eight days seemed too little, two months off seemed…excessive. Years of temp work left me never entirely comfortable not knowing where my next dollar would come from. Long vacations make me nervous. Back in my tutor/test-prep-instructor life, my son and I took a long road trip one summer that culminated in a six-week stay in another city. I notified a local Kaplan branch and got some SAT classes, put ads on Craigslist and got some private tutoring, making enough to offset the fuel and food expenditures for the trip.

I am not yet ready to abandon summer work altogether. I wanted a summer job. Just a different one, with a shorter work day, a shorter employment term, and higher hourly pay, so I’d get more time off for the same dollars.

Normal people are thinking “Hah! And a pony.” Teachers are thinking “Duh. Just teach summer school.” Public summer school, that is. Six weeks at most in my area, higher hourly pay, out at 1:30.

I have very strong feelings about summer school, none of them positive. But public summer school it is, this summer. More of that later, assuming I can push through and finish this absurdly non-essential piece because family fun time and work are coming perilously close to giving me writer’s block.

As a side note, a transition marked: I’ve now left all three legs of my previous income behind. Private tutoring mostly gone over the past two years, the hagwon this last year, Kaplan since ed school.

A job change to get a longer summer break. Another worrisome trend?

But then, just when I began to worry about having been slowly sucked in, I learned what my preps for the upcoming year would be and nearly had a meltdown.

Every year, teachers are given a form to list their preferences for subject assignments (aka “preps”). Every year, my form says “I’m happy to teach any academic subject I’ve got a credential for, but please don’t limit me to one prep a semester. Two is better, three is best.” Then I list three classes I haven’t taught in a while, or would like to do a second time. This year, I’d asked to teach at least one section of history, to build on last year; pre-calc, which I hadn’t taught in a year; and any lower-level class, just to keep myself humble. Again, this is in the context of teaching any other class as well.

I went into school after summer started to work on one of the professional obligations above, and as a thank-you, the principal showed us the master schedule board.

Semester One: Algebra 2, Trig. Two blocks of each.
Semester Two: Algebra 2, Trig. Three blocks total, two blocks Trig.

This schedule would be, to most teachers, a perk. Just two preps I’m familiar with. An easy year, after an extraordinarily demanding one in which I had two brand new classes, one of which was in a completely different academic subject for the first time in five years. Some might view the schedule as a form of thank-you, or maybe an acknowledgement that I’ve got more professional responsibilities so require a schedule with less planning or curriculum development.

I looked at the board and thought Christ, I have to quit this school, that’s awful, I love this school, but I have to get out of here. I need some time for job-hunting. I can’t quit summer school, it starts Monday. But I can jobhunt in the afternoons, it’s a Friday so I have some time to update my resume. Maybe I won’t have to leave the district, so I could keep tenure, and maybe I can talk to the administrator at summer school, hey, it’s actually good that I’m not at the hagwon this year, I just need to update my resume….

So not a perk, to me.

I tend towards extreme reactions, as alert readers may have noticed. Self-knowledge has led to compensatory braking systems. In years past, I would have just turned in my resignation on the spot. But my braking system kicked in: I remembered that quitting is just a symptom of my temporary-worker mindset. I reminded myself how good it felt to get tenure, and that my administration team likes me. Before quitting, I should perhaps consider other alternatives.

I will cover those alternatives, and my fears, in a follow-up post. No really, I promise.

So no, I’m not yet sucked in by the teacher perks. But I do want more free time during my 10 weeks off. Call me ungrateful.


Note: I will always value intellectual challenge over predictability for my own job satisfaction. But many teachers do an outstanding job teaching just one subject or the same two preps for thirty years. Outsiders, particularly well-educated folks with elite pedigrees, champion intellectually curious teachers with cognitive ability to spare as an obvious advancement over what they see as the “factory model” teacher turning out the same widgets every year. But little evidence suggests that intellectual chops produce better results, much less better teachers. So please don’t interpret my rejection of predictability and routine as evidence of anything other than a fear of boredom.

Math isn’t Aspirin. Neither is Teaching.

First, congrats to Dan Meyer, who finished his doctorate at Stanford and just hired on as CAO for Desmos, a tremendously useful online graphing calculator. He persisted in the face of threatened failure, and didn’t give up even when he had an easy out into a great job. (Presumably Dan and most of the Math Twitter Blogosphere are still annoyed at my jeremiad about the meaning of his meteoric rise, in which Dan played the part of illustration.)

Dan has asked math teachers for ways to create “headaches” for which math can be considered aspirin:


And this interested me because the request completely, perfectly, captures the difference between our two philosophies, which I also wrote about a couple years ago:


The comparison is an instructive one, I think. Both of us find it necessary to build our own curriculum, rejecting the one on offer, and both of us, I think, tremendously enjoy the creation process. Both of us reject the typical didactic contract described by Guy Brousseau, setting expectations very different from those of typical math teachers: explain, work a few examples, assign a set. Both of us largely eschew textbooks for instruction, although I consider them completely unnecessary save as reference books that often provide interesting problems I can steal, while Dan dreams of the perfect digital textbook.

And yet we couldn’t differ more in both teaching philosophy and curriculum approach.

Dan’s still selling curiosity and desire for knowledge, assuming capability will follow. I’m still selling capability because I see confidence follow.

Dan still believes that student engagement captures curiosity, which leads to academic success: that the Holy Grail of academic success in math lies in finding the perfect problems that universally stimulate interest in finding answers, which leads to understanding for all. I hold that student engagement leads to a willingness to attempt what students previously thought impossible, but that the Holy Grail doesn’t exist.

Meyer thinks teachers skeptical of his methods are resistant to change and to the best interests of their students. I advise teachers and recommend curriculum; if they find my advice helpful, great. I encourage them to modify or even reject my advice, and to continue to seek an approach that works for them and their students.

Dan wants to be “less helpful”. I want to teach kids to use their own resources, but given a kid who wants to give up, I’m offering help every time.

Meyer’s methods would probably need tremendous readjustment if he worked in a low-income school with a wide range of abilities. I’d probably be much “less helpful” if I taught at a school with a high-achieving, homogenous population obsessed about grades.

Meyer rose quickly in the rarefied world of rock star teachers. I aspire to the role of an indie with cult status.

Dan’s query: “Why did mathematicians think this skill was worth even a little bit of our time? If the ability to factor that trinomial is aspirin for a mathematician, then how do we create the headache?”

My answer: You can’t.

The commenters, mostly teachers, took the question seriously, understanding that it was another way of looking at the students’ demand, “When will we use this?” Answering that question clearly troubles most of the commenters—or else they already have an affirmative answer they’re satisfied with.

My answer to the student demand: “Probably never. But the more willing you are to take on challenging tasks you learn from, the more opportunities you’ll have in life, both professional and personal. Call me crazy, but I see this as a good thing.”

Dan Meyer is wrong, I believe, in looking for the Holy Grail that makes math “aspirin”1. But that’s not the point of my running through the Dan vs. Ed showdown.

Instead, consider the comparison yet another data point in my slowly developing thesis that ed schools need more flexibility and even less prescription. Few people understand the vast scale of values, philosophies, management and curriculum found in the teaching population.

Two teachers developing uncommon curriculum who agree on very little—yet both of us are considered successful teachers. (One has much more success selling his ideas to people with money, I grant you.) Take ten more math teachers who likewise build their own curriculum, with their own takes on philosophy, discipline, and even grading, and they’re unlikely to change to suit another model. Take 100 more: ditto. Voilà! An expanding population of teachers with successful teaching approaches and curriculum design that they’ve developed and modified. None of them are going to agree on much. They have come to widely varying conclusions that they will continue to develop and enhance on their own timeline, as they see fit. No one will have anything approaching an argument that could possibly convince them otherwise.

The point: the current push to “fix” ed schools is a fond delusion of reformers, progressives, and union leaders alike. People as diverse as Benjamin Riley, Paul Bruno, Rick Hess, and others believe we can find (or already have) a teaching knowledge base that can be passed on to novices.

Teachers are never going to agree.

Agreement, or even consensus, is impossible. Teachers and students form infinite combinations of interests, values, and incentives, and unlike reformers, teachers are going to value their experience and unique circumstances over anything ed schools try to pretend is the only way.

Teaching, like math, isn’t aspirin. It’s not medicine. It’s not a cure. It is an art enhanced by skills appropriate to the situation and medium, one that will produce all outcomes, success and failure alike, based on complex interactions between teachers and their audience. Treat it as a medicine, mandate a particular course of treatment, and hundreds of thousands of teachers will simply refuse to comply, because it won’t address the challenges and opportunities they face.

So when the status quo has prevailed for the next 30 years, don’t say you weren’t warned.

1which isn’t to say I don’t plan on writing up the how and why of my quadratic equations section.

Grant Wiggins

Curriculum is the least understood of the reform efforts, even though parents have more day to day contact with curriculum than choice or accountability. This is in large part because curriculum advocates don’t agree to the degree that accountability and choice reformers do, but also because teachers have far more control over curriculum than most understand. As Larry Cuban explains, curriculum has multiple layers: intended, tested, taught, and learned. Curriculum battles usually involve the intended curriculum, the one designed by the state, which usually creates the tested curriculum as a manageable subset. (Much of the Common Core controversy is caused by the overwhelming difficulty of the tested curriculum, but leave that for another time.)

But intended and tested curriculum are irrelevant once the doors close, and in this essay, I refer to the taught curriculum, the one that we teachers sculpt, whether we use “the book” (actually just pieces of the district approved book), use another book we like better, or build our own.

To the extent most non-educators know anything about curriculum advocacy, it begins and ends with E. D. Hirsch, otherwise known as “the guy who says what my nth grader should know”, author of a book series he eventually transformed into a K–6 curriculum, Core Knowledge. Hirsch offers one Big Idea: improving students’ background knowledge will improve their reading comprehension, because only with background knowledge can students learn from text. But, the Idea continues, schools ignore content knowledge in favor of teaching students “skills”. To improve reading comprehension and ongoing student academic outcomes, schools must shift from a skills approach to one dedicated to improving knowledge.

Then there’s Grant Wiggins, whose death last week occasioned this essay as an attempt to explain that we’ve lost a giant.

The media proper didn’t give Wiggins’ passing much notice. Valerie Strauss gave his last blog sequence a good sendoff and Edutopia brought back all their interviews with him. Education World and Education Week gave him obits. It doesn’t look as if Real Clear Education noted his passing, which is a bit shocking but perhaps I missed the mention.

Inside education schools, that world reformers hold in considerable contempt, Wiggins’ work is incredibly influential, and his death sent shockwaves. Since 1998, Understanding by Design has been an essential component in preparing teachers for the professional challenge of deciding what to teach and how to deliver instruction.

Prospective teachers don’t always understand this preparation will have relevance to their lives until their first year in the classroom. Progressive ed schools would never say anything so directly as “You will be faced with 30 kids with an 8 year range in ability and the textbooks won’t work.” Their ideology demands they wrap this message up in hooha on how insensitive textbooks are to the diverse needs of the classroom. Then, their ideology influences the examples and tasks they choose for instruction. Teacher candidates with an instructivist bent thus often tune out curriculum development classes in ed school, rolling their eyes at the absurd examples and thinking keerist, just use the textbook. (Yeah. This was me.)

Usually, they figure out the relevance of curriculum instruction when they get into the classroom, when they realize how laughably inadequate the textbook is for the wide range of abilities and interests of their students. When they realize the book assumes kids will sit patiently and listen, then obediently practice. When they realize that most of the kids won’t bring their books, and that all the well-intended advice about giving consequences for unprepared students will alone result in failing half the class, never mind the problems with their ability. When they realize that many kids have checked out, either actively misbehaving or passively sitting. Worst of all, teachers experience the kids who are eager to learn, try hard, don’t get it, and don’t remember anyway. Then, even after they make a bunch of adjustments, these teachers realize that kids who do seem to be learning don’t remember much—that is, in Cuban’s paradigm, the learned curriculum is wildly different from the one taught (or, in the Wiggins universe, “transferred”).

The teachers who don’t quit or move to charters or comprehensives with a higher SES may remember vaguely hey, there was something about this in ed school (hell, maybe that’s just me). So they go dig up their readers and textbooks and suddenly, all the twaddle about diversity and cultural imperialism fades away and the real message becomes legible, like developing invisible ink. How do you create a learning unit? What are your objectives? How will you assess student learning? And at that point, many roads lead to Wiggins.

Grant Wiggins was impossible to pigeonhole in a reform typology. In 1988, he made 10 proposals for high school reform that leaned progressive but that everyone could find some agreement with. He didn’t think much of lecturing, but he wrote a really terrific analysis of lectures that should be required reading for all teachers. (While I also liked Harry Webb’s rejoinder, I reread them in preparation for this essay and Grant’s is far superior.) He approved of Common Core’s ELA standards, but found the math ones weak. In the space of two weeks in 2013, he took on both Diane Ravitch and E. D. Hirsch, and this is after Ravitch flipped on Hirsch and other traditionalists.

Grant Wiggins was more than ready to mix it up. His essays on Hirsch and Ravitch might both fairly be called broadsides, although backed with research and logic that made them compelling (perhaps that’s because I largely agreed with them). His last two posts dissected Hirsch supporter Dan Willingham’s op-ed on reading strategies. While he listened and watched teachers intently, he would readily disagree with them and was rarely gentle in pointing it out. I found his insights on curriculum and instruction absolutely fascinating, but rolled my eyes hard at his more excessive plaints on behalf of students, like the nonsense on apartheid bathrooms and the shadowing experience that supposedly revealed the terrible lives of high school students. And if teachers were all denied the right to sarcasm, as he would have it, I’d quit. He didn’t hesitate to say I didn’t understand the lives that students lead, and I told him right back that he was wrong. More troubling to me was his conviction that most teachers were derelict in their duty and his belief that teachers are responsible for low test scores. But what made him so compelling, I think, is that he offered value to all teachers on a wide range of topics near to our needs, whether or not we shared all his opinions.

I knew him slightly. He once linked to my essay on math philosophies as an example of a “learned” teacher, and read my extended response (do I have any other kind?) and took the time to answer. Then, a few months later, I responded to his post on “teacher job descriptions” with a comment he found worthy of pulling out for a post on planning. He then privately emailed to let me know he’d used my comment and asked me to give feedback on his survey. That was a very big day. Like, I told my folks about it.

In the last week of his life, Grant had asked Robert Pondiscio to read his Willingham critique. Pondiscio, a passionate advocate of all things content knowledge, dismissed this overture and declared his posts on both Willingham and Hirsch “intemperate”. Benjamin Riley of Deans for Impact broke in, complimenting Grant and encouraging the idea of debate. The next day, Daniel Willingham responded to Grant on his site (I would be unsurprised to learn that Riley had something to do with that, and kudos to him if so). Grant was clearly pleased to be hashing the issues out directly and they exchanged a series of comments.

I had been retweeting the conversation and adding comments. Grant agreed with my observation that Core Knowledge advocates are (wrongly) treated as neutral experts.

On the last day of his life, Grant favorited a few of these tweets, I think because he realized I understood both his frustration at the silence and his delight at finally engaging Dan in debate.

And then Grant Wiggins died suddenly, shockingly. He will never finish that conversation with Dan Willingham. Death, clearly, has no respect for the demands of social media discourse.

Dan Willingham tweeted his respect. Robert Pondiscio wrote an appreciation, expressing regret for his abruptness. If the general media ignored Grant’s passing, Twitter did not.

I didn’t know Grant well enough to provide personal insights. But I’m an educator, and so I will try to educate people, make them aware of who was lost, and what he had to offer.

Novices can find plenty of videos on his “backwards design” with a simple Google search. But his discussions on learning and assessment are probably more interesting to the general audience and teachers alike—and my favorites as well.

Reformers like Michael Petrilli are experiencing a significant backlash to their causes. Petrilli isn’t wrong about the need for parent buy-in, but as Rick Hess recently wrote, the talkers in education policy are simply uninterested in what the “doers” have to offer the conversation.

Amen to that. The best education policy advocates—Wiggins, Larry Cuban, Tom Loveless—have all spent significant time as teachers. Grant Wiggins set an example reformers could follow as someone who could criticize teachers, rightly or wrongly, and be heard, because he listened. If he disagreed, he’d either cite evidence or argue values. So while he genuinely believed that most teachers were inadequate, teachers who engaged with him instantly knew this guy understood their world, and were more likely to listen.

And for the teachers that Grant found inadequate—well, I will always think him in error about the responsibility teachers own for academic outcomes. But teachers should stretch and challenge themselves. I encourage all teachers to look for ways to increase engagement, rigor, and learning, and I can think of no better starting point than Grant Wiggins’ blog.

I will honor his memory by reading his work regularly and looking for new insights to bring to both my teaching and writing.

If there’s an afterlife, I’m sure Grant is currently explaining to God how the world would have turned out better if He’d started with the assessment and worked backwards. It would have taken longer than seven days, though.

My sincere condolences to his wife, four children, two grandsons, his long-time colleague Jay McTighe, his band the Hazbins, and the many people who were privileged to know him well. But even out here on the outskirts of Grant’s galaxy of influence, he’ll be sorely missed.

Functions vs. Equations: f(x) is y and more

I wanted to talk about function algebra, which naturally would include a reference to function notation.

So here’s the frustrating thing about writing this blog. I try to include links to other sites that explain a concept, so that I don’t have to reinvent the wheel for my reading audience. But a Google search gives me these results: useless links that do little more than say “f(x) is the same as y”. That’s not math. That’s test prep. And there’s nothing wrong with test prep, but every one of these sites purports to be a math teaching site, and hey, I’m not a mathematician, but shouldn’t we be explaining what f(x) means?

Someone somewhere is saying “See, this is why we need teachers to be math majors, instead of English majors who get 800 on the GRE quant section. There’s no substitute for the math understanding that comes with the study of these important principles.” That someone somewhere is wrong. I used to think that in my early days, until I had too many conversations like this:

Me, to AP Calculus teacher WHO MAJORED IN MATH: Hey, what do you tell your kids about function notation?

AP Calculus teacher WHO MAJORED IN MATH: f(x) is the same as y.

Me, nonplussed: Well. Yeah. But I mean about why we developed function notation, what it serves that can’t be served by….

AP Calculus teacher WHO MAJORED IN MATH: It’s just notation. Don’t be confused.

Me: I’m not confused. But they serve different purposes, and I’m just trying to be sure I accurately capture…

AP Calculus teacher WHO MAJORED IN MATH: They don’t serve different purposes. It’s just notation. f(x) is the same as y.

Me: Ok.

In my experience, very few math teachers WHO ACTUALLY MAJORED IN MATH care about these things either. My beer drinking buddy is an exception (and he’s now department head), and he’s the only math teacher I’ve found so far who was interested in my work on this subject.

Textbooks? McDougal Littell and CPM have a lot of those function machines. But no explanation. Holt does a little better, but I didn’t understand that until I understood what I was looking for.

So I spend more time looking for a good link. Otherwise, I have to spend a lot of time figuring out how to explain function notation accurately, or at least inoffensively, so that people reading this blog don’t make me remind them that, for chrissakes, I’m an English major, not a mathematician! That takes time. It’s not time I wanted to spend. I don’t want to tell you what function notation is in a way that will pass expert muster. I want to tell you how I build on function notation to teach function algebra. But I can’t do that well without explaining function notation, which I didn’t set out to do. This leads to many blog entries taking much more time than they should. The original intent for my function algebra post was to be just a quick little throwaway.

I began writing this post nearly a month ago, and got stalled looking for a way to characterize the explanation. You may be wondering why I would explain something I don’t understand—but that’s not it, really. I just don’t know what to call it. And that’s fine for teaching, not so much for writing, and so I spend hours trying to figure out the correct query. Which took me, literally, up until today.

Just fifteen minutes ago (as I write this sentence) I finally found the kernel in this discussion on function notation before Euler, in which someone writes:

but [Newton] refers to these as equations, not functions, and admittedly (written the way they are) that is exactly what they are. It seems anything that we would today write as a function, Newton described in words, such as:

HA. I learned something I hadn’t quite understood completely before: a function and an equation are not the same thing. Googling “what is the difference between an equation and a function” led me to the right websites. I realize now that I wasn’t just looking for an explanation of function notation, but rather for why and when we use functions vs. equations.

Here’s an explanation that covers what I was trying to say.

So my research paid off. In practice, what I’ve been doing in this lesson is introducing function operations and function notation as a way to overcome a constraint in using equations.


Sami needs $15 more to buy the new hoodie that he wants. But if Sami skips the hoodie, he needs just three more dollars to buy a ticket to the pizza feed on Friday. If Sami has x dollars, how much money, in terms of x, does Sami need if he wants both the hoodie and the ticket to the pizza feed?

The first thing the kids think is that Sami needs $18 more.

I say okay, Sami has $20. How much does the hoodie cost? $35. How much does the pizza feed cost? $23. How much ….oh. Huh, say the kids. He needs a lot more than $18.

Depending on how goofy I feel, I might get out some fake money. I count out $20, give it to a quiet student. How much more for the hoodie? Count out another $15. Now how about the…Right about then, a student gets it: you need the $20 twice.
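The class’s arithmetic, sketched in a few lines (the dollar amounts are from the example above; the Python is just illustration):

```python
x = 20                 # Sami's cash on hand
hoodie = x + 15        # hoodie costs $15 more than he has: $35
pizza = x + 3          # pizza feed ticket costs $3 more: $23
both = hoodie + pizza  # $58 total -- the $20 gets counted twice
print(both - x)        # 38: he needs $38 more, not $18
```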

So then we go to the board and model the two different equations for each purchase:

y = x + 15 (the hoodie)
y = x + 3 (the pizza feed)

So if we are getting both things, what are we doing? Adding, the class choruses.

Ah, now there’s a new wrinkle. The kids have been adding equations for a while now, in systems. So I say, let’s try to add these equations:

y + y = (x + 15) + (x + 3)
2y = 2x + 18

Is that right? We test it with $20 and the kids realize that the right side “works” (that is, we get $58), but the left side says we still need to divide by 2, which would be…wrong.

“So what’s happening is that we are running into the limits of an equation. An equation tells us that two expressions occupy the same point on a number line–that is, after all, what “equal” means.”

“But when we use multiple variables in equations, then the equation becomes a relationship between two variables, an if-then. If y=x + 15, then the point (3, 18) is a solution because setting x=3 and y=18 creates an equation that has both sides occupying the same point on the number line. If 3x + 2y=12, then (2,3) is a solution because setting x=2 and y=3, etc.”

But in an equation, the variables are values. So in the Sami case, we can’t treat y as a collection point. We can’t keep track of the dependent variable because it varies, obviously. The y in the first equation has a different value from the y in the second equation. If we wanted to keep them separate, we could use two different variables, like z = x + 15 and y = x + 3. Or we could number the ys: y1 = x+15, y2 = x+3.

“Using the language of functions makes a lot of these constraints disappear.”

“First, logically. Functions differ from equations in a key way: a function is an output, while an equation is a relationship between variables. Yes, y=x+3 and f(x)=x+3 yield the same results, which is why we teachers always tell you to remember that ‘y and f(x) are the same thing’. However, f(x) isn’t a variable; it’s an output. So when we add two functions, we’re adding outputs. Remember, too, that a function doesn’t even have to be an equation, like in the cell phone code example.

Then there’s function notation, invented by Euler. Function notation enables unique names, usually a single letter, though it doesn’t have to be. You can get creative with the function names and the input values.”

“Function notation is just more elegant and efficient, too. Instead of saying ‘if x=7’ you can just say f(7). Once you define the function named ‘f’, anything can be input, even another expression, like f(a+7). And then, instead of writing ‘y=3’ and solving for x, you can just write f(x)=3.”

“So let’s call Sami’s cash on hand c, and then create a function h for hoodie, and p for pizza feed.

h(c) = c+15
p(c) = c+3

In both cases, c represents the money Sami has, so the input value is the same. But the output value varies based on the function used.”

“Now, this is a small difference. But how many of you have been told that f(x) is the same as y?” Bunch of hands raised.

“Yep. And in a lot of ways, it is. But you have to be wondering why, if they’re the same thing, we bother teaching you about function notation.” Lots of nods.

“So as you move on into advanced math, you’ll start to learn other reasons why we sometimes use functions and other times use equations. For now, it’s enough to know that function notation allows us to keep track of our different outcomes.

“Once we can do this, we can actually create an entire math with functions. They can be added, subtracted, multiplied. They have inverse operations.”

“But then why do we use equations?”

“Well, for one thing, functions don’t do systems well. Remember, when we solve systems, we are expecting both the x and the y (and any other variables) to be equal. Functions don’t handle that well. So you’ll see that we switch back and forth between equations and functions as needed.”

When you need to add expressions, functions are great. So now we can add h(c) and p(c).

h(c) + p(c) = (c + 15) + (c + 3) = 2c + 18

“Because we are adding outcomes, and have a unique way of tracking each outcome, we can add them properly. Remember, too, that since a function doesn’t need to be an equation, I can add or subtract outcomes without even having an equation. If a(x) = 9 and b(y) = 17, then b(y) – a(x) is 8, and I don’t have to care if a(x) and b(y) are generated by an expression or a rule or a code or a random happenstance—provided, of course, that random happenstance is only one per input.”
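As a sketch of the distinction in code (my own illustration; the names h, p, and c come from the lesson, and the $15 and $3 figures from the Sami example):

```python
# h(c): cost of the hoodie when Sami has c dollars (he needs $15 more).
# p(c): cost of the pizza feed ticket (he needs $3 more).
def h(c):
    return c + 15

def p(c):
    return c + 3

# Adding functions means adding outputs at the same input,
# so there's no "collection point" problem with a shared y.
def total(c):
    return h(c) + p(c)   # (c + 15) + (c + 3) = 2c + 18

print(h(20), p(20), total(20))  # 35 23 58 -- the class's check with $20
```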


I know. You’re wondering why I don’t just follow the AP Calculus teacher’s “f(x) is the same as y”. Well, it turns out that function operations are a big part of pre-calc, so they’ll use this later.

In the meantime, I give them some practice with function notation (I stole this at random). Not enough. Kids don’t really know it later. But at least they’re exposed to it.

Then I go on to linear function addition and subtraction. I usually just put problems on the board.

Sample quiz:


Here’s a test question:


And from here I go on to linear function multiplication (aka quadratics) and, eventually, rational expressions (linear function division).

Like teaching congruence with isometries, I can’t argue that using functions to further our work in linear and quadratic equations is better. I find it more…elegant, maybe?

But the execution isn’t quite there. This is the first year I’ve really taught this whole sequence: introducing functions, function addition/subtraction/notation, function multiplication, inverse functions, rational expressions. Writing it up has revealed an obvious improvement. Up to now, my function illustration has been a quick standalone lesson. Then later I introduce the notion of function addition and in doing so, bring up function notation.

This is goofy, now that I look at it. In the future, I’ll introduce functions and then go into function notation. I can spend a day or two on that, quiz that early. Then I can go back into linear equations or inequalities (the placement is flexible) and then bring up function addition and subtraction, with function notation already covered.

You know what’s irritating? The huge effort described at the beginning of this post, figuring out how to describe what I was teaching, led me to this. An effort undertaken solely in order to write this post, which I was griping about. In learning how to describe function notation for my readers, I learned that the proper way to characterize my work is as the difference between functions and equations, and that led to an idea for better sequencing.

This is kind of a placeholder post. Obviously, I’m in flux about this right now. My linear equations unit has been in good shape for a while. This gives me plenty of room to add flourishes, introduce more complicated topics onto a subject the students know well. Meanwhile, linear function multiplication has proven to be a great introduction to quadratics. So now I’m involved in putting it all together.

Next up in this sequence: the post that I really wanted to write, on my quadratics introduction.

Sorry for the slow rate of posts lately. I did five in April, then got lazy.

The Day of Three Miracles

I often hook illustrative anecdotes into essays making a larger point. But this anecdote has so many applications that I’m just going to put it out there in its pure form.

A colleague whom I’ll call Chuck is pushing the math department to set a department goal. Chuck is in the process of upgrading our algebra 1 classes, and his efforts have really improved outcomes for mid to high ability levels, although the failure rates are a tad terrifying. He has been worried for a while that the successful algebra kids will be let down by subsequent math teachers who hold his kids to lower standards.

“If we set ourselves the goal of getting one kid from freshman algebra all the way through to pass AP Calculus, we’ll improve instruction for everyone.” (Note: while the usual school year doesn’t allow enough time, our “4×4 full-metal block” schedule makes it possible for a dedicated kid to take a double year of math if he chooses).

Chuck isn’t pushing this goal for the sake of that one kid, as he pointed out in a recent meeting. “If we are all thinking about the kid who might make it to calculus, we’ll all be focused on keeping standards high, on making sure that we are teaching the class that will prepare that kid–if he exists–to pass AP Calculus.”

I debated internally, then spoke up. “I think the best way to evaluate your proposal is by considering a second, incompatible objective. Instead of trying to prepare every kid who starts out behind as if he can get to calculus, we could try to improve the math outcomes for the maximum number of students.”

“What do you mean?”

“We could look at our historical math completion patterns for entering freshmen algebra students, and try to improve on those outcomes. Suppose that a quarter of our freshmen take algebra. Of those students, 10% make it to pre-calc or higher. 30% make it to trigonometry, 50% make it to algebra 2, and the other 10% make it to geometry or less. And we set ourselves the goal of reducing the percentages of students who get no further than geometry or even, ideally, algebra 2, while increasing the percentages of kids who make it into trigonometry and pre-calc by senior year.”

“That’s what will happen with my proposal, too.”

“No. You want us to set standards higher, to ensure that kids getting through each course are only those qualified enough to go to Calculus and pass the AP test. That’s a small group anyway, and while you’re more sanguine than I am about the efficacy of instruction on academic outcomes, I think you’ll agree that a large chunk of kids simply won’t be the right combination of interested and capable to go all the way through.”

“Yes, exactly. But we can teach our classes as if they are.”

“Which means we’ll lose a whole bunch of kids who might be convinced to try harder to pass advanced math classes that weren’t taught as if the only objective was to pass calculus. Thus those kids won’t try, and our overall failure rate will increase. This will lower math completion outcomes.”

Chuck waved this away. “I don’t think you understand what I’m saying. There’s nothing incompatible about increasing math completion and setting standards high enough to get kids from algebra to calculus. We can do both.”

I opened my mouth…and decided against further discussion. I’d made my point. Half the department probably agreed with me. So I decided not to argue. No, really. It was, like, a miracle.

Chuck asked us all to think about committing to this instruction model.

Later that day, I ran into Chuck in the copyroom, and lo, a second miracle took place.

“Hey,” he said. “I just realized you were right. We can’t have both. If we get the lowest ability kids motivated just to try, we have to have a C to offer them, and that lowers the standard for a C, which ripples on up. We can’t keep kids working for the highest quality of A if we lower the standards for failure.”

Both copiers were working. That’s three.


I do not discuss my colleagues to trash them, and if this story in any way reflects negatively on Chuck it’s not intentional. Quite the contrary, in fact. Chuck took less than a day to grasp my point and realized his goal was impossible. We couldn’t enforce higher standards in advanced math without dooming far more kids to failure, which would never be tolerated.

Thus the two of us collapsed a typical reform cycle to six hours from the ten years our country normally takes to abandon a well-meant but impossible chimera.

Many of my readers will understand the larger point implicitly. For those wondering why I chose to tell this story now, I offer up Marc Tucker, whose two-part epic on American education’s purported failures illustrates everything that’s wrong with educational thinking today. Normally I would have gone into greater detail enumerating the flaws in reasoning, facts, and ambition, but that’s a lot of work and this is a damn good anecdote.

Some other work of mine that strikes me as related:

I think I’ve written about my suggested solution somewhere, but where…(rummages)….oh, yes. Here it is: Philip Dick, Preschool and Schrödinger’s Cat–the last few paragraphs.

“Reality is that which, when you stop believing in it, doesn’t go away.”

When everyone finally accepts reality, we can start crafting an educational policy that will actually improve on our current system, which does a much better job than most people understand.

But that’s a miracle for another day.

Evaluating the New PSAT: Math

Well, after the high drama of writing, the math section is pretty tame. Except the whole oh, my god, are they serious? part. Caveat: I’m assuming that the SAT is still a harder version of the PSAT, and that this is a representative test.

| Metric | Old SAT | Old PSAT | ACT | New PSAT |
|---|---|---|---|---|
| Questions | 44 MC, 10 grid | 28 MC, 10 grid | 60 MC | 40 MC, 8 grid |
| Sections | 1: 20 q, 25 m; 2: 18 q, 25 m; 3: 16 q, 20 m | 1: 20 q, 25 m; 2: 18 q, 25 m | 1: 60 q, 60 m | NC: 17 q, 25 m; Calc: 31 q, 45 m |
| Minutes per question | 1: 1.25; 2: 1.38; 3: 1.25 | 1: 1.25; 2: 1.38 | 1.00 | NC: 1.47; Calc: 1.45 |
| Categories | Number Operations; Algebra & Functions (elem. & intermed.); Geometry & Measurement (coord. & plane); Data & Statistics | same as Old SAT | — | 1) Heart of Algebra; 2) Passport to Advanced Math; 3) Probability & Data Analysis; 4) Additional Topics in Math |

It’s going to take me a while to fully process the math section. For my first go-round, I thought I’d point out the instant takeaways, and then discuss the math questions that are going to make any SAT expert sit up and take notice.

The old SAT and PSAT always allotted an average of 1.25 minutes per multiple choice question. On the 18-question section that has 10 grid-ins, giving 1.25 minutes to each of the 8 multiple choice questions leaves 1.5 minutes for each grid-in.

That same conversion doesn’t work on the new PSAT. However, both sections have exactly 4 grid-ins, which makes a nifty linear system. Here you go, boys and girls, check my work.

The math section that doesn’t allow a calculator has 13 multiple choice questions and 4 grid-ins, and a time limit of 25 minutes. The calculator math section has 27 multiple choice questions and 4 grid-ins, and a time limit of 45 minutes.

13x + 4y = 1500
27x + 4y = 2700

Flip them around and subtract:

14x = 1200
x = 85.714 seconds, or 1.42857 minutes. Let’s round that to 1.43.
y = 96.429 seconds, or 1.607 minutes, which I shall round down to 1.6 minutes.
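For anyone who wants to check the work, here is the same system ground out in Python (a sketch; times in seconds, x for multiple choice, y for grid-ins):

```python
# 13x + 4y = 1500   (no-calculator section: 25 minutes = 1500 seconds)
# 27x + 4y = 2700   (calculator section:    45 minutes = 2700 seconds)
# Subtracting the first equation from the second eliminates y.
x = (2700 - 1500) / (27 - 13)   # 14x = 1200
y = (1500 - 13 * x) / 4         # back-substitute into the first equation

print(round(x, 3), round(x / 60, 4))  # seconds and minutes per MC question
print(round(y, 3), round(y / 60, 4))  # seconds and minutes per grid-in
```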

If–and this is a big if–the test is using a fixed average time for multiple choice and another for grid-ins, then each multiple choice question is getting about a 14.3% boost in time, and each grid-in a 7% boost. But the test may be using an entirely different parameter.

Question Organization

In the old SAT and ACT, the questions move from easier to more difficult; the SAT and PSAT difficulty level resets for the grid-in questions. The new PSAT does not organize its problems by difficulty. Easy problems (there are only 4) are more likely to be at the beginning, but they are interlaced with medium difficulty problems. I saw only two Hard problems in the non-calculator section, both near but not at the end. The Hard problems in the calculator section are tossed throughout the second half, with the first showing up at question 15. However, the difficulty coding is inexplicable, as I’ll discuss later.

As nearly everyone has mentioned, any evaluation of the questions in the new test doesn’t lead to an easy distinction between “no calc” and “calc”. I didn’t use a calculator more than two or three times at any point in the test. However, the College Board may have knowledge about what questions kids can game with a good calculator. I know that the SAT Math 2c test is a fifteen minute endeavor if you get a series of TI-84 programs. (Note: Not a 15 minute endeavor to get the programs, but a 15 minute endeavor to take the test. And get an 800. Which is my theory as to why the results are so skewed towards 800.) So there may be a good organizing principle behind this breakdown.

That said, I’m doubtful. The only trig question on the test is categorized as “hard”. But the question is simplicity itself if the student knows any right triangle trigonometry, which is taught in geometry. But for students who don’t know any trigonometry, will a calculator help? If the answer is “no”, then why is it in this section? Worse, what if the answer is “yes”? Do not underestimate the ability of people who turned the Math 2c into a 15 minute plug and play to come up with programs to automate checks for this sort of thing.


Geometry has disappeared. Not just from the categories, either. The geometry formula box has been expanded considerably.

There are only three plane geometry questions on the test. One is actually an algebra question using a rectangle’s perimeter formula. Another is a variation question using a trapezoid’s area. Interestingly, neither the rectangle perimeter nor the trapezoid area formula was provided. (To reinforce an earlier point, both of these questions were in the calculator section. I don’t know why; they’re both pure algebra.)

The last geometry question really involves ratios; I simply picked the multiple choice answer that had 7 as a factor.

I could only find one coordinate geometry question, barely. Most of the other xy plane questions were analytic geometry, rather than the basic skills that you usually see regarding midpoint and distance–both of which were completely absent. Nothing on the Pythagorean Theorem, either. Freaky deaky weird.

When I wrote about the Common Core math standards, I mentioned that most of geometry had been pushed down into seventh and eighth grade. In theory, anyway. Apparently the College Board thinks that testing geometry will be too basic for a test on college-level math? Don’t know.

Don’t you love the categories? You can see which ones the makers cared about. Heart of Algebra. Passport to Advanced Math! Meanwhile, geometry and the one trig question are stuck under “Additional Topic in Math”. As opposed to the “Additional Topic in History”, I guess.

Degree of Difficulty

I worked the new PSAT test while sitting at a Starbucks. Missed three on the no-calculator section, but two of them were careless errors due to clatter and haste. In one case I flipped a negative in a problem I didn’t even bother to write down, in the other I missed a unit conversion (have I mentioned before how measurement issues are the obsessions of petty little minds?)

The one I actually missed was a function notation problem. I’m not fully versed in function algebra and I hadn’t really thought this one through. I think I’ve seen it before on the SAT Math 2c test, which I haven’t looked at in years. Takeaway— if I’m weak on that, so are a lot of kids. I didn’t miss any on the calculator section, and I rarely used a calculator.

But oh, my lord, the problems. They aren’t just difficult. The original, pre-2005 SAT had a lot of tough questions. But those questions relied on logic and intelligence—that is, they sought out aptitude. So a classic “diamond in the rough” who hadn’t had access to advanced math could still score quite well. Meanwhile, on both the pre and post 2005 tests, kids who weren’t terribly advanced in either ability or transcript faced a test that had plenty of familiar material, with or without coaching, because the bulk of the test is arithmetic, algebra I, and geometry.

The new PSAT and, presumably, the SAT, is impossible to do unless the student has taken and understood two years of algebra. Some will push back and say oh, don’t be silly, all the linear systems work is covered in algebra I. Yeah, but kids don’t really get it then. Not even many of the top students. You need two years of algebra even as a strong student, to be able to work these problems with the speed and confidence needed to get most of these answers in the time required.

And this is the PSAT, a test that students take at the beginning of their junior year (or sophomore, in many schools), so the College Board has created a test with material that most students won’t have covered by the time they are expected to take the test. As I mentioned earlier, California alone has nearly a quarter of a million sophomores and juniors in algebra and geometry. Will the new PSAT or the SAT be able to accurately assess their actual math knowledge?

Key point: the ability of the SAT and the ACT to reflect a full range of abilities is an unacknowledged attribute of both tests. Many colleges use these tests as placement proxies, including many, if not most, of the public university systems.

The difficulty level I see in this new PSAT makes me wonder what the hell the organization is up to. How can the test reveal anything meaningful about kids who a) haven’t yet taken algebra 2 or b) have taken algebra 2 but didn’t really understand it? And if David Coleman’s answer is “Those testers aren’t ready for college so they shouldn’t be taking the test”, then I have deep doubts that David Coleman understands the market for college admissions tests.

Of course, it’s also possible that the SAT will yield the same range of scores and abilities despite being considerably harder. I don’t do psychometrics.



Here’s the function question I missed. I think I get it now. I don’t generally cover this degree of complexity in Precalc, much less algebra 2. I suspect this type of question will be the sort covered in new SAT test prep courses.


These two are fairly complicated quadratic questions. The question on the left reveals that the SAT is moving into new territory; previously, the SAT never expected testers to factor a quadratic unless a=1. Notice too how it uses the term “divisible by x” rather than the more common phrasing, “x is a factor”. While all students know that “2 is a factor of 6” is the same as “6 is divisible by 2”, it’s not a completely intuitive leap to think of variable factors in the same way. That’s why we cover the concept–usually in late algebra 2, but much more likely in pre-calc. That’s when synthetic division/substitution is covered–as I write in that piece, I’m considered unusual for introducing “division” of this form so early in the math cycle.

The question on the right is a harder version of an SAT classic misdirection. The test question doesn’t appear to give enough information, until you realize it’s not asking you to identify the equation and solve for a, b, and c–just plug in the point and yield a new relationship between the variables. But these questions always used to show up in linear equations, not quadratics.

That’s the big news: the new PSAT is pushing quadratic fluency in a big way.

Here, the student is expected to find the factors of 1890:


This is a quadratic system. I don’t usually teach these until Pre-Calc, but then my algebra 2 classes are basically algebra one on steroids. I’m not alone in this.

No doubt there’s a way to game this problem with the answer choices that I’m missing, but to solve this in the forward fashion you either have to use the quadratic formula or, as I said, find all the factors of 1890, which is exactly what the answer document suggests. I know of no standardized test that requires knowledge of the quadratic formula. The old school GRE never did; the new one might (I don’t coach it anymore). The GMAT does not require knowledge of the quadratic formula. It’s possible that the CATs push a quadratic formula question to differentiate at the 800 level, but I’ve never heard of it. The ACT has not ever required knowledge of the quadratic formula. I’ve taught for Kaplan and other test prep companies, and the quadratic formula is not covered in most test prep curricula.
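For what it’s worth, the factor-pair hunt the answer document suggests is trivial for a machine and a grind by hand. A sketch of what the student is being asked to search:

```python
# All factor pairs of 1890 -- by hand, this means trial division,
# with each pair a candidate to test against the problem's conditions.
n = 1890
pairs = [(d, n // d) for d in range(1, int(n ** 0.5) + 1) if n % d == 0]
print(len(pairs))        # 16 pairs (1890 = 2 * 3^3 * 5 * 7 has 32 divisors)
print(pairs[0], pairs[-1])
```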

Here’s one of the inexplicable difficulty codings I mentioned–this one is coded as Medium difficulty.

As big a deal as that is, this one’s even more of a shock: a quadratic and linear system.


The answer document suggests putting the quadratic into vertex form, then plugging in the point and solving for a. I solved it with a linear system. Either way, after solving the quadratic you find the equation of the line and set the two equal to solve. I am… stunned. Notice it’s not a multiple choice question, so no plug and play.

Then, a negative 16 problem–except it uses meters, not feet. That’s just plain mean.

Notice that the problem gives three complicated equations. However, those who know the basic algorithm (h(t) = -4.9t² + v₀t + s₀) can completely ignore the equations and solve a fairly easy problem. Those who don’t know the basic algorithm will have to figure out how to coordinate the equations to solve the problem, which is much more difficult. So this problem represents dramatically different levels of difficulty depending on whether the student has been taught the algorithm; in that case, the problem is quite straightforward and should be coded as Medium difficulty. But no, it’s tagged as Hard. As is this extremely simple graph interpretation problem. I’m confused.
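To illustrate how much the shortcut buys: with the free-fall algorithm memorized, a “when does it land” question is a single quadratic formula application. A sketch (my numbers, not the test’s):

```python
import math

def landing_time(v0, s0):
    # h(t) = -4.9t^2 + v0*t + s0 (meters; -4.9 is half of g = 9.8 m/s^2).
    # Solve h(t) = 0 with the quadratic formula and keep the positive root.
    a, b, c = -4.9, v0, s0
    disc = math.sqrt(b * b - 4 * a * c)
    return max((-b + disc) / (2 * a), (-b - disc) / (2 * a))

# Launched upward at 19.6 m/s from ground level:
print(round(landing_time(19.6, 0), 2))  # 4.0 seconds
```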

Recall: if the College Board keeps the traditional practice, the SAT will be more difficult.

So this piece is long enough. I have some thoughts–rather, questions–on what on earth the College Board’s intentions are, but that’s for another test.

tl;dr Testers will get a little more time to work much harder problems. Geometry has disappeared almost entirely. Quadratics beefed up to the point of requiring a steroids test. Inexplicable “calc/no calc” categorization. College Board didn’t rip off the ACT math section. If the new PSAT is any indication, I do not see how the SAT can be used by the same population for the same purpose unless the CB does very clever things with the grading scale.

Evaluating the New PSAT: Reading and Writing

The College Board has released a new practice PSAT, which gives us a lot of info on the new SAT. This essay focuses on the reading and writing sections.

As I predicted in my essay on the SAT’s competitive advantage, the College Board has released a test that has much in common with the ACT. I did not predict that the homage would go so far as test plagiarism.

This is a pretty technical piece, but not in the psychometric sense. I’m writing this as a long-time coach of the SAT and, more importantly, the ACT, trying to convey the changes as I see them from that viewpoint.

For comparison, I used these two sample ACT tests, this practice SAT (old version), and this old PSAT.


The old SAT had a reading word count of about 2800 words, broken up into eight passages. Four passages were very short, just 100 words each. The longest was 800 words. The PSAT reading count was around 2000 words in six passages. This word count is reading passages only; the SAT has 19 sentence completions to the PSAT’s 13.

So SAT testers had 70 minutes to complete 19 sentence completions and 47 questions over eight passages of 2800 words total. PSAT testers had 50 minutes to complete 13 sentence completions and 27 questions over six passages of 2000 words total.

The ACT has always had 4 passages averaging 750 words, giving the tester 35 minutes to complete 40 questions (ten for each passage). No sentence completions.

Comparisons are difficult, but if you figure about 45 seconds per sentence completion, you can deduct that from the total time and come up with two rough metrics comparing reading passages only: minutes per question and words per question (on average, how many words is the tester reading to answer the questions).

| Metric | Old SAT | Old PSAT | ACT | New PSAT |
|---|---|---|---|---|
| Word count | 2800 | 2000 | 3000 | 3200 |
| Passage count | 8 | 6 | 4 | 5 |
| Passage length | 100-850 | 100-850 | 750 | 500-800 |
| MPQ | 1.18 | 1.49 | 1.14 | 1.27 |
| WPQ | 59.57 | 74.07 | 75 | 69.21 |

I’ve read a lot of assertions that the new SAT reading text is more complex, but my brief Lexile analysis on random passages in the same category (humanities, science) showed the same range of difficulty and sentence lengths for the old SAT, current ACT, and old and new PSAT. Someone with more time and tools than I have should do an in-depth analysis.

Question types are much the same as the old format: inference, function, vocabulary in context, main idea. The new PSAT requires the occasional figure analysis, which the College Board will undoubtedly flaunt as unprecedented. However, the College Board doesn’t have an entire Science section, which is where the ACT assesses a reader’s ability to evaluate data and text.

Sentence completions are gone, completely. In passage length and overall reading demands, the new PSAT is remarkably similar in structure and word length to the ACT. This suggests that the SAT is going to be even longer? I don’t see how, given the time constraints.

tl;dr: The new PSAT reading section looks very similar to the current ACT reading test in structure and reading demands. The paired passage and the question types are the only holdovers from the old SAT/PSAT structure. The only new feature is actually a cobbled-up homage to the ACT science test in the form of occasional table or graph analysis.


I am so flummoxed by the overt plagiarism in this section that I seriously wonder if the test I have isn’t a fake, designed to flush out leaks within the College Board. This can’t be serious.

The old PSAT/SAT format consisted of three question types: Sentence Improvements, Identifying Sentence Errors, and Paragraph Improvements. The first two question types presented a single sentence. In the first case, the student would identify a correct (or improved) version or say that the given version was best (option A). In the ISEs, the student had to read the sentence cold, with no alternatives, and indicate which underlined word or phrase, if any, was erroneous (much, much more difficult; option E was “no error”). In Paragraph Improvements, the reader had to answer grammar or rhetoric questions about a given passage. All questions had five options.

The ACT English section is five passages running down the left hand side of the page, with underlined words or phrases. As the tester goes along, he or she stops at each underlined section and looks to the right for a question. Some questions are simple grammar checks. Others ask about logic or writing choices—is the right transition used, is the passage redundant, what would provide the most relevant detail. Each passage has 15 questions, for a total of 75 questions in 45 minutes (9 minutes per passage, or 36 seconds per question). The tester has four choices and the “No Change” option is always A.

The new PSAT/SAT Writing/Language section is four passages running down the left hand side of the page, with underlined words or phrases. As the tester goes along, he or she stops at each underlined section and looks to the right for a question. Some questions are simple grammar checks. Others ask about logic or writing choices—is the right transition used, is the passage redundant, what would provide the most relevant detail. Each passage has 11 questions, for a total of 44 questions in 35 minutes (about 8.75 minutes per passage or 47 seconds a question). The tester has four choices and the “No Change” option is always A.

Oh, did I forget? Sometimes the tester has to analyze a graph.

The College Board appears to have simply stolen not only the structure, but various common question types that the ACT has used for years—as long as I’ve been coaching the test, which is coming on for twelve years this May.

I’ll give some samples, but this isn’t a random thing. The entire look and feel of the ACT English test has been copied wholesale—I’ll add “in my opinion” but don’t know how anyone could see this differently.

Writing Objective:

Style and Logic:


tl;dr: The College Board ripped off the ACT English test. I don’t really understand copyright law, much less plagiarism. But if the American College Test company is not considering legal action, I’d love to know why.

The PSAT reading and writing sections don’t ramp up dramatically in difficulty. Timing, yes. But the vocabulary load appears to be similar.

The College Board and the poorly informed reporters will make much of the data analysis questions, but I hope to see any such claims addressed in the context of the ACT’s considerably more challenging data analysis section. The ACT should change the name; the “Science” section only uses science contexts to test data analysis. All the College Board has done is add a few questions and figures. Weak tea compared to the ACT.

As I predicted, the College Board has definitely chosen to make the test more difficult to game. I’ve been slowly untangling the process by which someone who can barely speak English is able to get a high SAT verbal and writing score, and what little I know suggests that all the current methods will have to be tossed. Moving to longer passages with less time will reward strong readers, not people who are deciphering every word and comparing it to a memory bank. And the sentence completions, which I quite liked, were likely being gamed by non-English speakers.

In writing, leaving the plagiarism issue aside for more knowledgeable folk, the move to passage-based writing tests will reward English speakers with lower ability levels and should hurt anyone with no English skills trying to game the test. That can only be a good thing.

Of course, that brings up my larger business question that I addressed in the competitive advantage piece: given that Asians show a strong preference for the SAT over the ACT, why would Coleman decide to kill the golden goose? But I’ll put big picture considerations aside for now.

Here’s my evaluation of the math section.

Designing Multiple Answer Math Tests

I got the idea for Multiple Answer Tests originally because I wanted to prepare my kids for Common Core Tests. (I’d rather people not use that post as the primary link, as I have done a lot more work since then.)

About six months later (a little over a year ago), I gave an update, which goes extensively into the grading of these tests, if you’re curious. At that time, I was teaching Pre-Calc and Algebra 2/Trig. This past year, I’ve been teaching Trigonometry and Algebra II. I’d never taught trig before, so all my work was new. In contrast, I have a lot of Algebra 2 tests, so I often rework a multiple choice question into a multiple answer.

I thought I’d go into the work of designing a multiple answer test, as well as give some examples of my newer work.

I design my questions on an almost ad hoc basis. Some questions I really like and keep practically intact; others get tweaked each time. I build tests from a mental question database, pulling them in from earlier tests. So when I start a new test, I take the previous unit test, evaluate it, see if I’ve covered the same information, create new questions as needed, pull in questions I didn’t use on an earlier test, whatever. I don’t know how teachers can use the same test time and again. I’d get bored.

I recently realized my questions have a typology. Realizing this has helped me construct questions more clearly, sometimes adding a free response activity just to get the students started down the right path.

The first type of question requires modeling and/or solving one equation completely. The answer choices all involve that one process.



I’m very proud of this question. My kids had learned how to graph the functions, but we hadn’t yet turned to modeling applications. So they got this cold, and did really well with it. (In the first class, anyway. We’ll see how the next group does in a month or so.) I had to design it in such a way to really telegraph the question’s simplicity, to convince the students to give it a shot.

Algebra II:

The rational expression question is incredibly flexible. I’m probably teaching pre-calc again next year and am really looking forward to beefing this question up with analysis.

Other questions present a situation or graph that can be addressed from multiple angles, so the student ends up working two or three actual calculations per question. I’ve realized these questions look the same as the previous type, but they represent much more work, and I need to start making that clear.



Algebra II:

I love the Pythagorean Ruler question, which could be used purely for plane geometry questions, or right triangle trig. Or both. The furniture question is an early draft; I needed an inverse question and wanted some linear modeling review, so I threw together something that gave me both.

I can also use this format to test fluency on basic functions very efficiently. Instead of wasting one whole question on a trig identity, I can test four or five identities at once.
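A question in that vein might list a batch of candidate identities and ask students to mark every true statement. These are my own illustrative choices, not the actual test items:

```latex
\begin{align*}
&\sin^2\theta + \cos^2\theta = 1 && \text{true} \\
&\tan\theta = \frac{\sin\theta}{\cos\theta} && \text{true} \\
&\sec^2\theta - \tan^2\theta = 1 && \text{true} \\
&\sin 2\theta = 2\sin\theta && \text{false: } \sin 2\theta = 2\sin\theta\cos\theta \\
&\cos(-\theta) = -\cos\theta && \text{false: cosine is even}
\end{align*}
```

Five identities checked in the time a single multiple choice question would take.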


Or this one, also trig, where I toss in some simplification (re-expression) coupled with an understanding of the actual ratios (cosine and secant), even though they haven’t yet done any graphing. So even if they have graphing calculators (most don’t), they wouldn’t know what to look for.
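For a sense of what that simplification looks like, here is a standard re-expression of my own choosing (not the actual question), linking secant back to cosine without any graphing:

```latex
\sec\theta - \cos\theta
  = \frac{1}{\cos\theta} - \cos\theta
  = \frac{1 - \cos^2\theta}{\cos\theta}
  = \frac{\sin^2\theta}{\cos\theta}
  = \sin\theta\tan\theta
```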


I’m not much for “math can be used in the real world” lectures, but trigonometry is the one class where I can be all, “in your FACE!” when kids complain that they’d never see this in real life.


I stole the above concept from a trig book and converted it to multiple answer, but the one below I came up with all by myself, and there are all sorts of ways to take it. (and yes, as Mark Roulo points out, it should be “the B29’s circumference is blah blah blah.” Fixed in the source.)
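The circumference-to-linear-velocity chain is easy to sanity-check. A minimal sketch, with hypothetical figures (the B-29’s propellers were roughly 16.6 feet in diameter; the rpm is my own made-up number, not the question’s):

```python
import math

# Hypothetical numbers -- the actual question's figures aren't reproduced here.
diameter_ft = 16.6   # approximate B-29 propeller diameter
rpm = 1800           # assumed rotation rate, purely for illustration

# Distance one blade tip travels per revolution is the circumference.
circumference_ft = math.pi * diameter_ft

# Linear (tip) velocity: circumference times revolutions per minute.
tip_speed_ft_per_min = circumference_ft * rpm

# Convert feet per minute to miles per hour (5280 ft per mile, 60 min per hour).
tip_speed_mph = tip_speed_ft_per_min * 60 / 5280

print(circumference_ft)   # about 52 ft per revolution
print(tip_speed_mph)      # roughly 1,070 mph -- near the speed of sound
```

Even a wrong final answer leaves plenty of answer choices about the circumference itself, which is exactly the partial-credit behavior described below.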


Some other questions for Algebra II, although they can easily be beefed up for pre-calc.



One of the last things I do in creating a test is consider the weight I give each question. Sometimes I realize that I’ve created a really tough question with only five answer choices (my minimum). So I’ll add some easier answer choices to give kids credit for knowledge, even if they aren’t up to the toughest concepts yet.

That’s something I’ve really liked about the format. I can push the kids at different levels with the same question, and create more answer choices to give more weight to important concepts.
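One way to implement that kind of weighting, purely my own sketch rather than the actual grading rubric:

```python
# Hypothetical weighted scoring for a multiple-answer question.
# Easy answer choices carry small weights; the toughest concepts carry larger ones.
def score(question_weights, correct, marked):
    """Credit each answer choice whose marking matches the key, by its weight."""
    total = 0
    for choice, weight in question_weights.items():
        # a choice is scored right whether correctly marked or correctly left blank
        if (choice in marked) == (choice in correct):
            total += weight
    return total

weights = {'a': 1, 'b': 1, 'c': 2, 'd': 3, 'e': 1}   # 'd' tests the hardest concept
correct = {'a', 'c', 'd'}

print(score(weights, correct, {'a', 'c'}))       # partial credit: a and c right, d missed
print(score(weights, correct, {'a', 'c', 'd'}))  # full credit
```

The point of the uneven weights is exactly the one above: a student who only handles the easier concepts still banks real credit.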

The kids mostly hate the tests, but readily admit that the hatred is for all the right reasons. Many kids used to getting As in math are flummoxed by the format, which forces them to realize they don’t really know the math as well as they think they do. They’ve really trained their brains to spot the correct answer in a multiple choice format–or identify the wrong ones. (These are the same kids who have memorized certain freeform response questions, but are flattened by unusual situations that don’t fit directly into the algorithms.)

Other strong students do exceptionally well, often spotting question interpretations I didn’t think of, or asking excellent clarifications that I incorporate into later tests. This tells me that I’m on the right track, exposing and differentiating ability levels.

At the lower ability levels, students actually do pretty well, once I convince them not to randomly circle answers. So, for example, on a rational expression question, they might screw up the final answer, but they can identify factors in common. Or they might make a mistake in calculating linear velocity, but they correctly calculate the circumference, and can answer questions about it.

I’ve already written about the frustrations, as when the kids have correctly worked problems but didn’t identify the verbal description of their process. But that, too, is useful, as they can plainly see the evidence. It forces them to (ahem) attend to precision.

Of course, I’m less than precise myself, and one thing I really love about these tests is my ability to hide this personality flaw. But if you spot any ambiguities, please let me know.

Ian Malcolm on Eva Moskowitz


Another good piece documenting the lack of “there” at the Success Academy schools, this one by Kate Taylor at the Times.

Pretend that Judge Patrice Lessner is interrupting me every four words for this next bit:

Success Academies’ “success” will eventually be revealed as a chimera. Certainly they are skimming on a massive scale, and their attrition rates over time are pretty telling. Despite Moskowitz’s constant denials, the kids spend a shocking amount of time in test prep—one witness even saw an early “slam the exam” class.

But skimming, test prep, and attrition don’t explain enough. If Carol Burris is providing correct information here, then 45% of whites were proficient in math, and 31% in ELA. According to Robert Pondiscio, the numbers for the overwhelmingly low income black and Hispanic Success Academies were over 90% and 68%, respectively. That suggests the schools are doing more than cherrypicking.

I don’t know how. Unlikely to be anything as obvious as fixing the tests later or telling the kids the answers, or we’d hear about it. Possibly they are engaging in the Chinese variety of test prep.

But if low income black and Hispanic proficiency rates are twice that of whites, then the dinosaurs have escaped.

Paul Bruno is more careful, less intuitive (in his writing) and far more data-driven than, say, me. So maybe everyone doesn’t read his explication of everything we don’t know about Success Academy as howlingly skeptical, but nor would anyone see the piece as a ringing endorsement. More surprisingly, Robert Pondiscio asks “what the hell is going on at Success Academy?” in a way that doesn’t sound very flattering.

In no way are Bruno or Pondiscio going out on the ledge with me. Not for them the wise words of Ian Malcolm. I’m just saying that their articles signal considerable skepticism to me, a frequent reader of both.

I haven’t seen many respectable reformers touting Success Academy, either. Take that as you will.

Here’s a story idea for some enterprising reporter:

Contact Success Academy and ask to see score progressions for their early students. Presumably, all the students didn’t come in scoring at the top level (don’t laugh, skeptics!). So Eva and her minions should be able to provide initial scores for students–they are testing them constantly, yes?–and connect these scores to their actual state exam scores. By year. Then that enterprising reporter should track down Success Academy alumni and get their scores year by year since they’ve left. In a year, that could include SAT/ACT scores.

This would provide actual data to answer the following questions:

  1. Are the weakest students leaving the schools?
  2. Are specific students improving their demonstrated abilities during their tenure at the schools?
  3. Are alumni still doing well after they leave school?

Those questions would eliminate or at least reduce the charges of skimming, attrition, and prepping-to-the-extent-of-cheating.

I note that Kate Taylor of the Times is looking for students or parents to “share their stories”. Fewer stories. More data. Get test scores over time per student, stat!

If I’m wrong, nothing happens! No one gets fired. I’m just an amateur. It’s not like I’m claiming a frat party instigated a gang rape, or anything. And oh, yeah, the achievement gap that has plagued our education efforts for over fifty years has finally been beaten.

So if I’m wrong, someone should go look for Isla Nublar to see if the T-Rex has eaten all the velociraptors.

Illustrating Functions

Function definitions aren’t usually tested on either the SAT or the ACT, and since I never worked professionally with math, functions were something I’d barely considered in algebra a billion years ago. So for the first few years of teaching, I kind of went through the motions on functions: unique output for each input, vertical line test, blah blah. I didn’t ignore them or rush through them. But I taught them in straight lecture mode.

Once I got out of the algebra I ghetto (which really does warp your brain if that’s all you do), I accepted that a lot of the concepts I originally thought boring or unimportant show up later. So it’s worth my time to come up with the same third way activities and lessons for things like functions or absolute value or inverses that I do for binomial multiplication and modeling linear equations and inequalities.

So every year I pick concepts to transfer from pure lecture/explanation to illustration. Sometimes it’s spur of the moment, other times I plan a formal curriculum change. In the case of functions, the former.

Last year I was teaching algebra II/trig and–entirely in passing–noted a problem in the Holt book that looked something like this:

and had two simultaneous thoughts: what a boring question and hey, I could really do something with that.

So the next day, I tossed this up on the board without comment.


I’ve given these instructions three times now–a2/trig, trigonometry, algebra 2–and the kids are always mystified, but what the heck, the activity seems simple enough. No student ever reads through the entire list of instructions first. They spend a lot of time picking the message, with many snickers, then have fun translating the code twice.

But then, as they all try to translate someone else’s message using the cell phone code, bam. They realize intuitively that translating the whole-alphabet code would be an easy task. And with a few moments of thought, they realize why the cell phone code doesn’t offer the same simple path. They don’t know what it means, exactly. But the students all realize that I’ve demonstrated a difference that they’d never considered.
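The two codes can be sketched as mappings; this is my own reconstruction of the idea, not the actual handout. Encoding is a function under both keys, but only the whole-alphabet key can be reversed unambiguously:

```python
# Whole-alphabet code: each letter gets its own number -- a one-to-one function.
alpha_code = {chr(ord('A') + i): i + 1 for i in range(26)}   # A->1, B->2, ..., Z->26

# Cell-phone keypad code: several letters share one digit -- a function, not one-to-one.
keypad = {'2': 'ABC', '3': 'DEF', '4': 'GHI', '5': 'JKL',
          '6': 'MNO', '7': 'PQRS', '8': 'TUV', '9': 'WXYZ'}
phone_code = {letter: int(digit) for digit, letters in keypad.items() for letter in letters}

def encode(msg, code):
    # Encoding is a function either way: each letter maps to exactly one number.
    return [code[ch] for ch in msg]

print(encode("MATH", alpha_code))   # [13, 1, 20, 8] -- each number maps back to one letter
print(encode("MATH", phone_code))   # [6, 2, 8, 4] -- but 6 could be M, N, or O
```

Decoding the keypad message means guessing among the letters sharing each digit, which is exactly the wall the students hit.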

From there, I graph the processes, which is usually a surprise as well. The translation process can be graphed?



At this point, I can usually convince kids to remember the Vertical Line Test, which they were taught in algebra I. From there, I go through the definitions for relation, function, and one-to-one function, using a Venn diagram (something like this, with an additional inner circle for one-to-ones). Then I go back through what the students vaguely remember about functions and link it to the correct code example.

Thus the students realize that translating the message into code is a function under either code key. I hammer this point home, because the most common misconception kids take from this is that all functions must be one-to-one. Both are functions. Each letter has one and only one number assigned, and the fact that one translation key assigns several letters to the same number is irrelevant to its status as a function. Reversing the process, going from numbers back to letters, yields a function for only one of the two codes.

Then I sketch parabolas and circles. Are they both functions? Are either of them one-to-one functions?
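Worked out briefly, using the standard equations behind those sketches:

```latex
% Parabola: a function, but not one-to-one
y = x^2 \quad\Rightarrow\quad \text{each } x \text{ gives one } y,
  \text{ but } f(2) = f(-2) = 4

% Circle: not a function at all
x^2 + y^2 = 25 \quad\Rightarrow\quad x = 3 \text{ yields } y = \pm 4
  \text{ (fails the vertical line test)}
```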

In Algebra 2, I do this long before the inverse unit. In Trig, I introduce it right before graphing the individual functions as part of an overview. In both classes, the early intro gives them time to recognize the significance of the difference between a function and the more limited case of the one-to-one function–particularly in trig, since the inverse functions are very limited graphs for exactly that reason. In Algebra II, the graphs reinforce the meaning of the Horizontal Line Test.

I haven’t taught algebra I recently, but I’d change the lesson by giving them a coded message and asking them to translate with the cell phone code first.

This leads right into function and not-function, which is all they need in algebra I.

I have periodically mentioned my mixed feelings about CPM. Here’s a classic example. The CPM book introduces functions with the following example.

Okay. This is a terrible example. And really boring. Worst of all, as far as this non-mathie can tell, towards the end it’s flat out wrong. A relation can be predictable without being a function (isn’t that what a circle is?). But just looking at it, I got an idea for a great test question (click to enlarge):


And I could certainly see some great Algebra I activities using the same concept. But CPM just sucks the joy and interest out of the great starting ideas it has.

Anyway. I wanted to finish up with a push for illustrations. What, exactly, do the students understand at the moment of discovery in this little activity? I’ve never seen anyone make the intuitive leap to functions. However, they do all grasp that two tasks that until that moment seemed identical…aren’t. They all realize that translating the message in the whole-alphabet code would be a simple task. They all understand why the cell phone code translation doesn’t lend itself to the same easy translation.

I look for illustrative tasks that convince kids to think about concepts. As I’ve said before, the tasks might kick off a unit, or they might show up in the middle. They may demonstrate a phenomenon in math, or they might be problems designed to lead the students to the next step.

The most common pushback I get from math teachers when I talk about this method is “I love the idea, but I don’t have enough time.” To which I respond that pushing on through just means they’ll forget. Well, they’ll probably forget my lessons, too, but–maybe not so much. Maybe they’ll have more of a memory of the experience, a recollection of the “aha” that got them there. That’s my theory, anyway.

There’s no question that telling is quicker than illustrating or letting them figure it out for themselves. Certainly, the illustration should be followed by a clear explanation with much telling. I love explaining. But I’ve stopped kidding myself that a clear explanation is sufficient for most kids.

That said, let me restate what I said in my retrospective: The tasks must either be quick or achievable. They must illustrate something important. And they must be designed to lead the student directly to the observations or principles you want them to learn. It’s not a do it yourself walk in the park. Compare my lesson on exploring triangles with this more typical reform math example. I resist structure in many aspects of my life, but not curriculum.

In researching this piece, I stumbled across this really excellent essay, Why Illustrations Aid Understanding, by David Kirsh. I strongly recommend giving it a read. He is only discussing the importance of visual illustrations, whereas I’m using the word more broadly. Kirsh comes up with many wonderful examples (math and otherwise), categorizing the different purposes these illustrations serve. Truly great mind food. In the appendix, he discusses the limitations of visually representing uncertainty.


On reading this, I felt like my students did when they realized the cell phone message was much harder to translate: I have observed something important, something that I realize immediately is true and relevant to my work–even if I don’t yet know why or how.

