Teaching Math vs. Doing Math

Justin Reich of EdWeek (not to be confused with Justin Baeder of EdWeek) wrote enthusiastically of a new study, asking “What If Your Word Problems Knew What You Liked?”:

Last week, Education Week ran an article about a recent study from Southern Methodist University showing that students performed better on algebra word problems when the problems tapped into their interests. …The researchers surveyed a group of students, identified some general categories of students’ interests (sports, music, art, video games, etc.), and then modified the word problems to align with those categories. So a problem about costs of new home construction ($46.50/square foot) could be modified to be about a football game ($46.50/ticket) or the arts ($46.50/new yearbook). Researchers then randomly divided students into two groups, and they gave one group the regular problems while the other group of students received problems aligned to their interests.

The math was exactly the same, but the results weren’t. Students with personalized problems solved them faster and more accurately (emphasis mine), with the biggest gains going to the students with the most difficulty with the mathematics. The gains from the treatment group of students (those who got the personalized problems) persisted even after the personalization treatment ended, suggesting that students didn’t just do better solving the personalized problems, but they actually learned the math better.
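Worth noting how mechanical the personalization is: the quantities and the structure of each problem stay fixed, and only the surface story swaps. Here is a minimal sketch of what that kind of template substitution might look like; the categories, wording, and function names are my own illustration, not the study’s actual materials:

```python
# Hypothetical sketch of interest-based personalization of a word problem.
# The math (the $46.50 rate, the unknown quantity) never changes; only the
# surface context does.

TEMPLATE = "{context} costs ${rate:.2f} per {unit}. Write an expression for the cost of x {units}."

CONTEXTS = {
    "sports":  {"context": "A football game",       "unit": "ticket",      "units": "tickets"},
    "arts":    {"context": "The new yearbook",      "unit": "copy",        "units": "copies"},
    "default": {"context": "New home construction", "unit": "square foot", "units": "square feet"},
}

def personalize(rate, interest):
    """Dress the same underlying problem in the student's interest category."""
    ctx = CONTEXTS.get(interest, CONTEXTS["default"])
    return TEMPLATE.format(rate=rate, **ctx)

print(personalize(46.50, "sports"))   # sports version
print(personalize(46.50, "default"))  # original construction version
```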

Reich has it wrong. From the study:

Students in the experimental group who received personalization for Unit 6 had significantly higher performance within Unit 6, particularly on the most difficult concept in the unit, writing algebraic expressions (10% performance difference, p<.001). The effect of the treatment on expression-writing was significantly larger (p<.05) for students identified as struggling within the tutoring environment (22% performance difference). Performance differences favoring the experimental group for solving result and start unknowns did not reach significance (p=.089). In terms of overall efficiency, students in the experimental group obtained 1.88 correct answers per minute in Unit 6, while students in the control group obtained 1.56 correct answers per minute. Students in the experimental group also spent significantly less time (p<.01) writing algebraic expressions (8.6 second reduction). However, just because personalization made problems in Unit 6 easier for students to solve does not necessarily mean that students learned more from solving the personalized problems.

(bold emphasis mine)

and in the Significance section:

As a perceptual scaffold (Goldstone & Son, 2005), personalization allowed students to grasp the deeper, structural characteristics of story situations and then represent them symbolically, and retain this understanding with the support removed. This was evidenced by the transfer, performance, and efficiency effects being strongest for, or even limited to, algebraic expression-writing (even though other concepts, like solving start unknowns, were not near ceiling).

So the students who got personalized problems did not demonstrate improved accuracy in solving, at least not to the same degree that they demonstrated an improved ability to model.

I tweeted this as an observation and got into a mild debate with Michael Pershan, who runs a neat blog on math mistakes. Here’s the result:

I’m like oooh, I got snarked at! My own private definition of math!

But I hate having conversations on Twitter, and I probably should have just written a blog entry anyway.

Here’s my point:

Yes, personalizing the context enabled a greater degree of translation. But when did “translating word problems” become, as Michael Pershan puts it, “math”? Probably about 30 years ago, back when we began trying to figure out why some kids weren’t doing as well in math as others were. We started noticing that word problems gave kids more difficulty than straight equations, so we started focusing a lot of time and energy on helping students translate word problems into equations—and once the problems are in equation form, the kids can solve them, no sweat!
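To make the two steps concrete, here’s a throwaway example of the split between translating and solving (the numbers are mine, not the study’s):

```python
# Word problem: "Tickets cost $46.50 each. You spent $372. How many tickets did you buy?"

price_per_ticket = 46.50   # dollars per ticket (illustrative numbers)
total_spent = 372.00       # dollars

# Step 1 -- translation/modeling: the words become the equation 46.50 * x = 372
# Step 2 -- solving: isolate x
tickets = total_spent / price_per_ticket
print(tickets)  # 8.0 -- a student can nail step 1 and still blow step 2, or vice versa
```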

Except, in this study, that didn’t happen. The kids did better at translating, but no better at solving. That strikes me as interesting, and clearly, the paper’s author also found it relevant.

Pershan chastised me, a tad snootily, for saying the kids “didn’t do better at math”. Translating, he says, IS math. He cited the Common Core standards showing the importance of data modeling. Well, yeah. Go find a grandma and teach her eggsucking. I teach modeling as a fundamental in my algebra classes. It makes sense that Pershan would do this; he’s very much about the why and the how of math, and not as much about the what. Nothing wrong with this in a math teacher, and lord knows I do it as well.

But we shouldn’t blur the distinction between teaching math and doing it. So I asked the following hypothetical: Suppose you have two groups of kids given a test on word problems. Group 1 translates each problem impeccably into an equation that is then solved incorrectly. Group 2 doesn’t bother with the equations but gives the correct answer to each problem.

Which group would you say was “better at math”?

I mean, really. Think like a real person, instead of a math teacher.

Many math teachers have forgotten that for most people, the point of math is to get the answer. Getting the answer used to be enough for math teachers, too, until kids stopped getting the answer with any reliability. Then we started pretending that the process was more important than the product. Progressives do this all the time: if you can’t explain how you did it, kid, you didn’t really do it. I know a number of math teachers who will give a higher grade to a student who shows his work and “thinking”, even if the answer is completely inaccurate, and give zero credit to a correct answer by a student who did the work in his head.

Not that any of this matters, really. Reich got it wrong. No big deal. The author of the study did not. She understood the difference between translating a word problem into an equation and getting the correct answer.

But Pershan’s objection—and, for that matter, the Common Core standards themselves—shows how far we’ve gone down the path of explaining failure over the past 30-40 years. We’ve moved from not caring how they defined the problem to grading them on how they defined the problem to creating standards so that now they are evaluated solely on how they define the problem. It’s crazy.

End rant.

Remember, though, we’re talking about the lowest ability kids here. Do they need models, or do they need to know how to find the right answer?

27 responses to “Teaching Math vs. Doing Math”

  • Hattie

    I spent my secondary school years in the second to last maths set, experiencing the mindset you describe from my teachers.

    Frankly, I think it’s a calculated way of inflating scores for those of us who are on the left hand side of the Bell Curve. It’s a way of giving out as many marks as possible – the pupil can’t find the answer, but if they only get derailed halfway through, they can get half the marks, and teachers can claim that their pupils are improving. The more dedicated teachers I had also used it as a way of finding out exactly what parts we didn’t get, but the exam part seemed more important.

    It’s part of a wider pattern that makes me believe that the Powers That Be know perfectly well about innate ability and the differences between groups. Not only do they keep their own children and grandchildren as far away as possible from the low ability pupils and their schools, but they use every trick in the book to boost scores. They know it won’t happen otherwise, but they fear the political fallout. They certainly don’t want to ask the hard questions about how best to work with those with below average IQs.

  • Roger Sweeny

    Thanks for the accurate summary of this study, and for bringing up the basic question that you do at the end, “Remember, though, we’re talking about the lowest ability kids here. Do they need models, or do they need to know how to find the right answer?” I wish I had an answer. I’m guessing you lean toward the latter, and I probably do too.

    I have another question that is totally unrelated. This is the second time in the blog that you have made a reference to eggsucking. I feel like there is a meaning I am missing, some implication or some story or fable. Can you help me?

    • educationrealist

Back in the old days, old people had no teeth, and the only thing they could eat was eggs. They stuck a needle into the shell and sucked out the egg because they couldn’t chew. So they were experts at eggsucking. I think the real phrase is “it’s like teaching a grandma to suck eggs”. I do things like “eggsucking 101” or “go find a grandma.”

      I often tell my kids “Hey. Steal the sheep.” I used to say “Pot to Kettle: You’re black!” but I can’t these days!

  • maiseylou

    Well, it all seems to add up.

  • Φ

    Maybe I’m missing something here. My experience teaching algebra is limited to homeschooling my daughter, who has above average aptitude in math. I taught her the way I was taught: there are no points for “just the answer”, show your work, and follow the process. Yes, she could work the simple problems in her head, but solving those problems in that way would poorly prepare her for the harder problems.

    So your hypothetical is a bit strained. In general, very few students are going to accurately solve moderately difficult algebra problems “in their heads” with any reliability.

    • educationrealist

      My experience teaching algebra is limited to homeschooling my daughter, who has above average aptitude in math.

      Stop there. All done! You have no frame of reference.

      Well, not all done. Where did I say they solved them in their heads?

And frankly, the “there are no points for just the answer” rule is kind of silly. Certainly, you should teach her how to set up problems. Once she knows how, what do you care if she does them that way or not?

  • Michael Pershan (@mpershan)

    We definitely still disagree on this, but I apologize for any snoot.

    • educationrealist

      I think this disagreement is fundamental to the success or failure in teaching low ability students. I also suspect that you have no idea what I’m talking about, and have decided that I teach kids that math is all about getting the answer.

      • Michael Pershan (@mpershan)

        I also suspect that you have no idea what I’m talking about, and have decided that I teach kids that math is all about getting the answer.

        I guess that’s possible. How about this: I’ll say what I think you’re getting at, and you’ll tell me where I go wrong?

        I don’t think that you teach kids that math is all about getting the answer. I think that you teach kids that the process is important. I think that you teach kids that modeling is important. But I think that you emphasize the process and modeling because you think that it will help them find more numerical answers to more problems.

        Multiple approaches, modeling, abstracting words into symbols: these matter because they’ll help kids figure out more answers to more questions.

        Is that right?

        If it is right, where I disagree is that I think the process matters. And while you suggest that, sociologically, I like the process because it covers up the difficulty of getting my weakest kids towards an actual precise answer, I disagree. I think that what math really is involves modeling and innovation in the process.

        Your response: well, you’re thinking about what mathematicians do. Real people? They need answers. They need to use math to find answers.

That doesn’t really speak to me. I’ve got a lot of friends who don’t know math. They don’t know algebra. At the end of the year, my lowest level algebra students will be able to outperform a ton of them. And these friends? They’re psychologists, lawyers, writers, and sometimes doctors and teachers.

        To me, this is evidence that in the real world, real people rarely have to use algebra, or (more generally) any of the math that we teach in high school. So I’m not really sympathetic to the claim that real people need to be able to find answers to problems. Looking around at the people I see, that just doesn’t seem to be the case.

        So why teach math? While I won’t fight for math over art, philosophy, physics or poetry, I do think that there is immense value in having kids learn and enjoy learning our most difficult disciplines.

        Anyway, this is how I understand our back-and-forth, so far. Looking forward to your response.

      • educationrealist

        There’s a fair amount I could comment on in this post, but I’m going to focus on the paragraphs in which you assert that your lowest performing kids could outperform many lawyers, psychologists, writers, doctors, and even (gasp) teachers.

        In order for you to make that statement, one of the following cases, at least, is true:
        1) You don’t, in fact, teach low ability students (IQs below 100)

        2) You confuse the distaste of professionals for math with actual inability.

        3) You are utterly and completely delusional.

        I’m assuming it’s not #3.

        The determination to pretend cognitive ability doesn’t exist is the most significant challenge our educational system faces today. So our big disagreement is not in any of the things you laid out, where I doubt we disagree at all, but in your inability to understand (#1) or refusal to acknowledge (#2) what it is to teach a low ability student.

Lawyers, doctors, etc., have average IQs well in excess of 110. On their worst day, years removed from having taken math, they can outperform your weakest students; either that, or you don’t know what weak students are, and that means we can’t really have a discussion.

        So when I talk about the unimportance of the model, I’m talking about the lower half or so of the ability spectrum, who are going to struggle significantly even to pretend to do algebra, by translating it into concrete examples. That is, after all, the population the study was examining, and the population I am focusing on when I push back on your “modeling is math” ideas. That population doesn’t need models. It needs answers. It needs success. And a method that gives them the models, but doesn’t give them a better chance at getting the answer, isn’t much of a method unless we’re in a world where people pretend that building the model is a goal in and of itself for a kid with an IQ below 100.

  • Michael Pershan (@mpershan)

    I don’t think that my comment about the mathematical ability of various professionals was crucial to my argument. What is crucial to my argument is the mathematical needs of various professionals. In other words, I’m happy to agree to disagree about how good various professionals are at math (and you helpfully offer that either I’m crazy or my kids aren’t actually dumb, so, there you go.)

But let’s start from this point: we agree that various professionals don’t need high school math, for the most part. That applies to high-performing as well as low-performing professions.

    You’ve said that low-performing kids need to know how to find the right answer.

    Why? After all, they’re not going to need to solve these problems after school. So why do they need to be able to find the right answers? They’re never going to be posed these problems?

    Based on a comment that you made on twitter, I think that you’re inclined to respond that low-performing students don’t need to be able to find answers for professional reasons. They need to be able to find them because that’s their understanding of what math is.

    But that seems like a punt, to me. Analogy: kids commonly think that history is about memorizing dates, names, stories and events. This is especially true for low-performing kids. Is that an argument for restricting the history curriculum to memorizing these things and leaving out discussions of evidence and careful reasoning?

    Math should be no different. Low-performing kids might think that math is about finding the answer, but they might be wrong. And, more to the point, we can change the way that they think about math as a discipline. Instead of thinking that math is about finding answers, we can teach them that math is also about prediction of the future, proof, and modeling.

    P.S. Even though you were kind enough to assume that I’m not delusional, I think that it came off the wrong way. It seemed like you were suggesting that I’m SOOO off base, that I’m delusional. That’s not very kind, nor is it generous. Again, apologies for any snoot in our first encounter.

    • educationrealist

First things first: yes, you are that off base. But I know you’re not delusional. You just share the same fantasies that most of the educational polity does. I have no time for the fantasies. Any time someone seriously suggests that the highest IQ professions are in any way equivalent to low performers in our school system, they are ignoring the reality of cognitive ability and that’s not an opinion I will grant any credence to, because people who subscribe to it—either knowingly to perpetuate an illusion or ignorantly because they don’t know any better—are simply making the problem worse. My blog is dedicated in large part to pointing out these idiocies and the damage they do to our students. So yes, you are that off-base. But my stern response was not payback in kind for the perceived snoot of your first response (which I thought was amusing and did not damage my ego in any way).

      Analogy: kids commonly think that history is about memorizing dates, names, stories and events. This is especially true for low-performing kids. Is that an argument for restricting the history curriculum to memorizing these things and leaving out discussions of evidence and careful reasoning?

      Yes. And I’m a history teacher. I’ve said this before. History for low ability kids should be primarily about memorizing dates, names, stories, and events. The content knowledge helps their reading ability.

      That doesn’t mean that kids can’t enjoy fascinating conversations about the why and the how—in history or in math, or in literature, for that matter. But it does mean that I want them knowing the facts of American history without being forced to write critical reasoning essays on Why Segregation Was A Very Very Very Bad Thing. It means that fact mastery is important, and that people with fewer abstraction abilities who prefer concrete situations actually enjoy learning about facts and feeling the mastery that comes with that knowledge.

      After all, they’re not going to need to solve these problems after school. So why do they need to be able to find the right answers? They’re never going to be posed these problems?

      Math, at its most utilitarian level, is about using facts and processes to solve problems. This training is good for any kid going off into the real world, who will also have to use a fact base and processes to solve problems at his own level of ability. We need to give kids as much of that experience as we can, and math is an excellent substitute for the real world in that sense, because unlike history or English, we don’t debate the facts or the processes. The trick is finding the appropriate level of problems for the skillsets and abilities of the kids involved. For a good half the population, that means ENDING at algebra, not starting there. We have no evidence that algebra can be genuinely understood by people with IQs under 100 at this point in time. And that’s a research study that will never happen, but it should. Because without it, we have no evidence that our educational policy is anything other than delusion.

      So I don’t have a lot of time for politeness with people who ignore cognitive ability—it’s fine to disagree with me about it, but ignore it? It’s not about you, or your blog, which I find very good. But I don’t understand why or how anyone would ignore cognitive ability, and that’s what you were doing in your last post.

      • Michael Pershan (@mpershan)

        This training is good for any kid going off into the real world, who will also have to use a fact base and processes to solve problems at his own level of ability.

        Any evidence for this? In general, transfer between domains does not come automatically, and I’m very skeptical that mathematical habits transfer to workplace skills all on their own.

  • educationrealist

    Sorry I didn’t get to this sooner. I’m not talking about transfer between domains. I’m just saying that kids who sit in math classes solving problems—just using their brain to solve problems—are learning how to deal with rules, processes, and constraints, just in the way real life also gives us all experience in dealing with rules, processes and constraints.

That is, if you accept that low ability kids can’t do abstract work and thus aren’t suited for college or much in the way of symbol manipulation, what does their education consist of? It could consist of just sending them off to the factory, except we don’t have one. So in that sense, giving them interesting problems that challenge their minds appropriately and force them to deal with rules and constraints is as good a way as any, and better than most, for them to use their minds while in school.

  • Michael Pershan (@mpershan)

    You write,

So in that sense, giving them interesting problems that challenge their minds appropriately and force them to deal with rules and constraints is as good a way as any, and better than most, for them to use their minds while in school.

    I assumed that it was better for them to use their minds in school because it would benefit them later in life. Apparently, you didn’t mean that.

    It seems to me as if you’re suggesting that there isn’t much point of school for low-performing kids. You’d send them out of school if you could, but you can’t, so you might as well give them something that they can handle so that they don’t feel awful about themselves. Is that right? Or do I have you totally off? (Sorry if I do; trying to read you accurately.)

    If that’s true (and I might have misunderstood you), then who cares whether we teach low-performing kids math that involves getting a right answer at the end, or teaching them proof and modeling as ends in themselves. (You said earlier that low-performing kids need to learn how to find the right answer. That’s what I’m pushing on.)

    • educationrealist

      Hey, I missed this comment somehow.

      It seems to me as if you’re suggesting that there isn’t much point of school for low-performing kids. You’d send them out of school if you could, but you can’t, so you might as well give them something that they can handle so that they don’t feel awful about themselves. Is that right?

To some extent, yes. Our economy has changed; these kids could once have gotten manufacturing or other jobs in which they could achieve (to varying degrees) based on their ability to achieve, not their interest in abstract math well above the needs of the job. So yes, in earlier days, these kids would not have been an educational problem.

Our willingness to accept that our original goal–educating everyone to be a symbol manipulator, rather than a concrete doer–is completely out of the question will be the primary determinant of our educational success moving forward. Right now, our unwillingness to accept this is the reason we are declaring our schools a failure when they aren’t.

      who cares whether we teach low-performing kids math that involves getting a right answer at the end, or teaching them proof and modeling as ends in themselves.

      As long as a kid is in school, I want to be giving that kid intellectual problems that spur their interest in thinking about “Why”, “What”, and “How” to the best ability they can. That strikes me as the purpose of school. Some kids will be interested in that more than others—regardless of ability. But no one is harmed by stretching their brain. This seems pretty obvious, to me. School has more than one function; it’s not just about creating good industrious workers for society. It’s also about creating good citizens, good thinkers, and people who feel vested in society. If our school system gives kids the sense that we not only care about them, but value their abilities and interests where they are, not where we delude ourselves they should be, it strikes me as a good thing.

      • Michael Pershan (@mpershan)

        This makes a lot of sense to me:

        As long as a kid is in school, I want to be giving that kid intellectual problems that spur their interest in thinking about “Why”, “What”, and “How” to the best ability they can. That strikes me as the purpose of school.

        But I don’t see how that’s consistent with your post. You also said,

        Remember, though, we’re talking about the lowest ability kids here. Do they need models, or do they need to know how to find the right answer?

If we’re just trying to spur their interest, then why do kids need to know how to find the right answer? That’s what I’m having trouble putting together about your position. Why not teach modeling, if that’s what will engage them and spur their interest?

      • educationrealist

        We aren’t debating whether to teach modeling. You have already agreed that I do teach modeling, which for low ability kids is often just a matter of having them visualize what would actually happen if the events in the word description took place. I teach them to model the problems BECAUSE it helps them to get the right answer. Because low to mid ability kids are far more utilitarian. They aren’t interested in problems that don’t have answers.

Remember, this all began because you thought I was wrong to quibble about the article, which said that the kids got more right answers. In fact, the treatment did not result in more right answers, simply more accurate models–that were then solved incorrectly. But you think the more accurate models (which in this case were equations) are a “right answer” in and of themselves. I’m simply saying it’s not true. There may be kids for whom an interesting model is more important than the right answer, but they aren’t kids who struggle to pass algebra. They need the right answer.

  • David Wees

    Recommended reading: http://en.wikipedia.org/wiki/Pygmalion_effect

    Is it possible that your own beliefs about what the low performing kids are capable of create a ceiling for their capabilities?

    • educationrealist

      No.

      1) https://educationrealist.wordpress.com/2012/07/01/the-myth-of-they-werent-ever-taught/

2) I have no idea what my kids’ scores are in any objective sense when I first meet them, no access to test scores–at least, I haven’t had access 3 years out of 4. I give my kids a test the first day of class, a test much easier than the class they are entering (https://educationrealist.wordpress.com/2012/09/10/my-math-classes-are-they-prepared-um-no-so-what/). Most, but not all, of my students match that performance over the year–that is, the students who ace that test continue to ace the class, the students who tank struggle. But it’s not a perfect correlation. Some students finally start to “get it” that particular year. Additionally, a few students who do well on the test, which is highly structured and predictable, struggle in the class. If I were being fooled by the Pygmalion effect, that wouldn’t happen.

3) I am a very good assessor of in-class abilities, and in the first two years of teaching, I continually underestimated the degree to which some kids were cheating. So I’d always be running into the case of a kid who, man, I’d thought didn’t have a clue and yet was killing it on the tests. In the early days, I trusted the kids’ performance on my tests rather than my own expectations–that is, I trusted what I thought was demonstrated performance over Pygmalion. However, when I got it together enough to create 2, 3, or 4 tests to offset cheating (which, in the early days, took me a couple months to put in place), the kids who I thought didn’t know a thing didn’t, in fact, know a thing when they couldn’t cheat. They all had the answers to the wrong test–the answers to their neighbors’ test. A couple years of this left me more confident of my own assessments–now, if I see a kid overperforming my assessments, I covertly check for cheating.

      4) Both as a test prep instructor and a teacher, I am notable for the degree to which I defy the Pygmalion effect. I always have kids who had previously done poorly in math because they didn’t do things the way the teacher liked, didn’t follow the rules, didn’t do homework and who do very well in my class. I also have kids who learned for the first time that they weren’t good at math, that they’d done well by following the rules, doing homework, but not gaining understanding.

      5) I’m a Pygmalion skeptic, but to the degree it holds, it holds for achievement, not actual ability. And my classes are all about demonstrated ability. You screw around in my class, never do your homework, but nail the tests, you’re getting an A. The Pygmalion effect is about “soft” skills, and the ability to please a teacher–and that’s not a major factor in my classes.

  • Michael Pershan (@mpershan)

    Because low to mid ability kids are far more utilitarian. They aren’t interested in problems that don’t have answers.

    You’re wrong, in my experience. I teach very weak kids, and they are very interested in problems that don’t have answers. But at least we agree: we should teach kids the math that they will find challenging and interesting. That’s pretty significant, and I’m glad we agree.

Still, I don’t think that what you’re saying now jibes with what you wrote in your post:

    Getting the answer used to be enough for math teachers, too, until kids stopped getting the answer with any reliability. Then we started pretending that the process was more important than the product.

    Is the problem with teaching modeling (for the sake of modeling) to low-ability kids that they won’t be interested in it? Or is it that it’s a lowering of standards, because the product is (in fact) more important than the process?

    • educationrealist

You were comparing your kids to doctors and lawyers. With due respect, you weren’t working with low level kids if that were true. So I’ll keep it simple: low ability kids have IQs between 85 and 100. If you are working with kids in that category and think they are, as a group, genuinely interested in and capable of abstractions, then you are fooling yourself. This really isn’t a matter of opinion; the link between IQ and ability to think abstractly is well established. See the Flynn op ed here in this earlier post: https://educationrealist.wordpress.com/2012/10/06/teaching-students-with-utilitarian-spectacles/

      I would argue that even low mid-level ability kids (IQs from, say, 95-105) have difficulty with modeling and abstractions and benefit from more concrete presentations, with modeling presented as a way to make abstract equations concrete situations, but that’s at least debatable.

      Is the problem with teaching modeling (for the sake of modeling) to low-ability kids that they won’t be interested in it? Or is it that it’s a lowering of standards, because the product is (in fact) more important than the process?

      There is no problem with teaching modeling. There is a problem with pretending that creating a model is more important than finding the answer for MOST people, particularly low ability people.

      As I’ve said before, it may be possible to teach low ability kids abstractions in connection with specific skills, but we haven’t tried that yet, and it’s certainly not what math class is about.

      • Michael Pershan (@mpershan)

        Here’s my problem with what you’re saying, and (apologies) but I’m having trouble seeing how you’re clearly answering my objections.

(1) “If you are working with kids in that category and think they are, as a group, genuinely interested in and capable of abstractions, then you are fooling yourself.”

        You haven’t defined abstractions well. All of math is an abstraction of some sort or another. Multiplication is an abstraction. Fractions are an abstraction. Simple equations are abstractions. Which abstractions do you believe low ability kids to be capable of, and which do you not?

(2) I think that you’ve contradicted yourself. You write that low ability kids can’t handle modeling for modeling’s sake. You also write that teachers introduced modeling for the sake of modeling because they found that their kids couldn’t handle finding answers (“Then we started pretending that the process was more important than the product…”). So, which is it?

(3) I think that you’ve tripped up in another way. You write that you’d just as soon see low ability kids out of school. That means that school doesn’t have much to offer them. Then you also write that it’s more important to be able to find answers for most people in math. Now, if you’re just taking a stand on what you think dumb people are interested in, fine. If you are saying that it’s more important for people to know how to find answers than to understand the process, then you’ve tripped up. Because nothing ultimately matters about the education you’re providing these kids, as far as content goes. You’re just trying to stretch their minds, not trying to teach them anything that lasts. So who cares what’s “more important”? Nothing is really important.

        Thanks for your patience in explaining your positions.

  • educationrealist

    You can’t understand what “abstract” vs. “concrete” means in this context? Yeah, let’s leave it be. Not much point in any further discussion.
