## A Critical Language for Problem Design Saturday, Jan 18 2014

I am at the Joint Mathematics Meetings this week. Yesterday I had a conversation with Cody L. Patterson, Yvonne Lai, and Aaron Hill that was very exciting to me. Cody was proposing the development of what he called a “critical language of task design.”

This is an awesome idea.

But first, what does he mean?

He means giving (frankly, catchy) names to important attributes, types, and design principles of mathematical tasks. I can best elucidate by example. Here are two words that Cody has coined in this connection, along with his definitions and illustrative examples.

Jamming – transitive verb. Posing a mathematical task in which the underlying concepts are essential, but the procedure cannot be used (e.g., due to insufficient information).

Example: you are teaching calculus. Your students have gotten good at differentiating polynomials using the power rule, but you have a sinking suspicion they have forgotten what the derivative is even really about. You give them a table like this:

| $x$ | $f(x)$ |
| --- | --- |
| 4 | 16 |
| 4.01 | 16.240901 |
| 4.1 | 18.491 |

and then ask for a reasonable estimate of $f'(4)$. You are jamming the power rule because you’re giving them a problem that aims at the concept underlying the derivative and that cannot be solved with the power rule.
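Since the table rules out the power rule, the move the task is fishing for is a difference quotient. A minimal sketch in Python, using only the data points from the table above:

```python
# Estimate f'(4) from tabulated values alone: no formula for f is
# available, so the power rule has nothing to act on.
points = [(4.0, 16.0), (4.01, 16.240901), (4.1, 18.491)]

x0, y0 = points[0]
for x, y in points[1:]:
    slope = (y - y0) / (x - x0)  # difference quotient over [x0, x]
    print(f"slope from {x0} to {x}: {slope}")

# Both quotients land near 24, so f'(4) = 24 is a reasonable estimate.
```

The nearer point gives the better estimate (about 24.09 versus 24.91), which is exactly the concept the jamming is aiming at.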

Thwarting – transitive verb. Posing a mathematical task in which mindless execution of the procedure is possible but likely to lead to a wrong answer.

Example: you are teaching area of simple plane figures. Your students have gotten good at area of parallelogram = base * height but you feel like they’re just going through the motions. You give them this parallelogram:

Of course they all try to find the area by $9\times 41$. You are thwarting the thoughtless use of base * height because it gets the wrong answer in this case.
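Since the original figure isn’t reproduced here, the dimensions below are hypothetical: suppose the labeled sides are 9 and 41 with a 60° angle between them, so that 9 is a slant side rather than the height. A quick sketch of why the mindless computation goes wrong:

```python
import math

# Hypothetical dimensions (the post's actual figure isn't reproduced):
# base 41, slant side 9, 60-degree angle between them.
base, slant = 41.0, 9.0
theta = math.radians(60)

mindless = base * slant           # treats the slant side as the height
height = slant * math.sin(theta)  # the true height is shorter than the slant side
actual = base * height

print(mindless)           # 369.0
print(round(actual, 2))   # 319.56
```

Whatever the angle, $\sin\theta < 1$ in a non-rectangular parallelogram, so base times slant side always overestimates the area.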

Why am I so into this? These are just two words, naming things that all teachers have probably done in some form or another without their ever having been named. They describe only a very tiny fraction of good tasks. What’s the big deal?

It’s that these words are a tiny beginning. We’re talking about a whole language of task design. I’m imagining having a conversation with a fellow educator, and having access to hundreds of different pedagogically powerful ideas like these, neatly packaged in catchy usable words. “I see you’re thwarting the quadratic formula pretty hard here, so I’m wondering if you want to balance it out with some splitting / smooshing / etc.” (I have no idea what those would mean but you get the idea.)

I have no doubt that a thoughtful, extensive and shared vocabulary of this kind would elevate our profession. It would be a concrete vehicle for the transmission and development of our shared expertise in designing mathematical experiences.

This notion has some antecedents.[1] First, there are the passes that have been made at articulating what makes a problem pedagogically valuable. On the math blogosphere, see discussions by Avery Pickford, Breedeen Murray, and Michael Pershan. (Edit 1/21: I knew Dan had one of these too.) I also would like to believe that there is a well-developed discussion on this topic in academic print journals, although I am unaware of it. (A Google search turned up this methodologically odd but interesting-seeming article about biomed students. Is it the tip of the iceberg? Is anyone reading this acquainted with the relevant literature?)

Also, I know a few other actual words that fit into the category “specialized vocabulary to discuss math tasks and problems.” I forget where I first ran into the word problematic in this context – possibly in the work of Cathy Twomey-Fosnot and Math in the City – but that’s a great word. It means that the problem feels authentic and vital; the opposite of contrived. I also forget where I first heard the word grabby (synonymous with Pershan’s hooky, and not far from how Dan uses perplexing) to describe a math problem – maybe from the lips of Justin Lanier? But once you know it, it’s pretty indispensable. Jo Boaler, by way of Dan Meyer, has given us the equally indispensable pseudocontext. So, the ball is already rolling.

When Cody shared his ideas, Yvonne and I speculated that the folks responsible for the PCMI problem sets – Bowen Kerins and Darryl Yong, and their friends at the EDC – have some sort of internal shared vocabulary of problem design, since they are masters. They were giving a talk today, so I went, and asked this question. It wasn’t really the setting to get into it, but superficially it sounded like yes. For starters, the PCMI’s problem sets (if you are not familiar with them, click through the link above – you will not be sorry) all contain problems labeled important, neat and tough. “Important” means accessible, and also at the center of connections to many other problems. Darryl talked about the importance of making sure the “important” problems have a “low threshold, high ceiling” (a phrase I know I’ve heard before – anyone know where that comes from?). He said that Bowen talks about “arcs,” roughly meaning, mathematical themes that run through the problem sets, but I wanted to hear much more about that. Bowen, are you reading this? What else can you tell us?

Most of these words share with Cody’s coinages the quality of being catchy / natural-language-feeling. They are not jargony. In other words, they are inclusive rather than exclusive.[2] It is possible for me to imagine that they could become a shared vocabulary of our whole profession.

So now what I really want to ultimately happen is for a whole bunch of people (Cody, Yvonne, Bowen, you, me…) to put in some serious work and to write a book called A Critical Language for Mathematical Problem Design, that catalogues, organizes and elucidates a large and supple vocabulary to describe the design of mathematical problems and tasks. To get this out of the completely-idle-fantasy stage, can we do a little brainstorming in the comments? Let’s get a proof of concept going. What other concepts for thinking about task design can you describe and (jargonlessly) name?

I’m casting the net wide here. Cody’s “jamming” and “thwarting” are verbs describing ways that problems can interrupt the rote application of methods. “Problematic” and “grabby” are ways of describing desirable features of problems, while “pseudocontext” is a way to describe negative features. Bowen and Darryl’s “important/neat/tough” are ways to conceptualize a problem’s role in a whole problem set / course of instruction. I’m looking for any word that you could use, in any way, when discussing the design of math tasks. Got anything for me?

[1]In fairness, for all I know, somebody has written a book entitled A Critical Language for Mathematical Task Design. I doubt it, but just in case, feel free to get me a copy for my birthday.

[2]I am taking a perhaps-undeserved dig here at a number of in-many-ways-wonderful curriculum and instructional design initiatives that have a lot of rich and deep thought about pedagogy behind them but have really jargony names, such as Understanding by Design and Cognitively Guided Instruction. (To prove that an instructional design paradigm does not have to be jargony, consider Three-Acts.) I feel a bit ungenerous with this criticism, but I can’t completely shake the feeling that jargony names are a kind of exclusion: if you really wanted everybody to use your ideas, you would have given them a name you could imagine everybody saying.

## Wherein This Blog Serves Its Original Function Wednesday, Nov 21 2012

The original inspiration for starting this blog was the following:

I read research articles and other writing on math education (and education more generally) when I can. I had been fantasizing (back in fall 2009) about keeping an annotated bibliography of articles I read, to defeat the feeling that I couldn’t remember what was in them a few months later. However, this is one of those virtuous side projects that I never seemed to get to. I had also met Kate Nowak and Jesse Johnson at a conference that summer, and due to Kate’s inspiration, Jesse had started blogging. The two ideas came together and clicked: I could keep my annotated bibliography as a blog, and then it would be more exciting and motivating.

That’s how I started, but while I’ve occasionally engaged in lengthy explication and analysis of a single piece of writing, this blog has never really been an annotated bibliography. EXCEPT FOR RIGHT THIS VERY SECOND. HA! Take THAT, Mr. Things-Never-Go-According-To-Plan Monster!

“Opportunities to Learn Reasoning and Proof in High School Mathematics Textbooks”, by Denisse R. Thompson, Sharon L. Senk, and Gwendolyn J. Johnson, published in the Journal for Research in Mathematics Education, Vol. 43 No. 3, May 2012, pp. 253-295

The authors looked at HS level textbooks from six series (Key Curriculum Press; Core Plus; UCSMP; and divisions of the major publishers Holt, Glencoe, and Prentice-Hall) and analyzed the lessons and problem sets from the point of view of “what are the opportunities to learn about proof?” To keep the project manageable they just looked at Alg. 1, Alg. 2 and Precalc books and focused on the lessons on exponents, logarithms and polynomials.

They cast the net wide, looking for any “proof-related reasoning,” not just actual proofs. In the lessons, they looked for any justification of stated results: an actual proof, a specific example that illustrated the method of the general argument, or an opportunity for students to fill in the argument. In the exercise sets, in addition to problems asking students to actually develop an argument, they counted problems that asked students to make or investigate a conjecture, evaluate an argument, or find a mistake in an argument.

In spite of this wide net, they found that:

* In the exposition, proof-related reasoning is common, but lack of justification is equally common: across the textbook series, 40% of the mathematical assertions about the chosen topics were made without any form of justification.

* In the exercises, proof-related reasoning was exceedingly rare: across the textbook series, less than 6% of exercises involved any proof-related reasoning. Only 3% involved actually making or evaluating an argument.

* Core Plus had the greatest percentage of exercises with opportunities for students to develop an argument (7.5%), and also to engage in proof-related reasoning more generally (14.7%). Glencoe had the least (1.7% and 3.5% respectively). Key Curriculum Press had the greatest percentage of exercises with opportunities for students to make a conjecture (6.0%). Holt had the least (1.2%).

The authors conclude that mainstream curricular materials do not reflect the pride of place given to reasoning and proof in the education research literature and in curricular mandates.

“Expert and Novice Approaches to Reading Mathematical Proofs”, by Matthew Inglis and Lara Alcock, published in the Journal for Research in Mathematics Education, Vol. 43 No. 4, July 2012, pp. 358-390

The authors had groups of undergraduates and research mathematicians read several short proofs of elementary theorems, written in the style of student work, and decide if the proofs were valid. They tracked the participants’ eye movements to see where their attention was directed.

They found:

* The mathematicians did not have uniform agreement on the validity of the proofs. Some of the proofs had a clear mistake and then the mathematicians did agree, but others were more ambiguous. (The proofs that were used are in an appendix in the article so you can have a look for yourself if you have JSTOR or whatever.) The authors are interested in using this result to challenge the conventional wisdom that mathematicians have a strong shared standard for judging proofs. I am sympathetic to the project of recognizing the way that proof reading depends on context, but found this argument a little irritating. The proofs used by the authors look like student work: the sequence of ideas isn’t being communicated clearly. So it wasn’t only the validity of a sequence of ideas that the participants evaluated; it was also the success of an imperfect attempt to communicate that sequence. Maybe this distinction is ultimately unsupportable, but I think it has to be acknowledged in order to give the idea that mathematicians have high levels of agreement about proofs its due. Nobody who espouses this really thinks that mathematicians are likely to agree on what counts as clear communication. Somehow the sequence of ideas has to be separated from the attempt to communicate it if this idea is to be legitimately tested.

* The undergraduates spent a higher percentage of the time looking at the formulas in the proofs and a lower percentage of time looking at the text, as compared with the mathematicians. The authors argue that this is not fully explained by the hypothesis that the students had more trouble processing the formulas, since the undergrads spent only slightly more time total on them. The mathematicians spent substantially more time on the text. The authors speculate that the students were not paying as much attention to the logic of the arguments, and that this pattern accounts for some of the notorious difficulty that students have in determining the validity of proofs.

* The mathematicians moved their focus back and forth between consecutive lines of the proofs more frequently than the undergrads did. The authors suggest that the mathematicians were doing this to try to infer the “implicit warrant” that justified the 2nd line from the 1st.

The authors are also interested in arguing that mathematicians’ introspective descriptions of their proof-validation behavior are not reliable. Their evidence is that previous research (Weber, 2008: “How mathematicians determine if an argument is a valid proof”, JRME 39, pp. 431-459) based on introspective descriptions of mathematicians found that mathematicians begin by reading quickly through a proof to get the overall structure, before going into the details; however, none of the mathematicians in the present study did this according to their eye data. One of them stated that she does this in her informal debrief after the study, but her eye data didn’t indicate that she did it here. Again I’m sympathetic to the project of shaking up conventional wisdom, and there is lots of research in other fields to suggest that experts are not generally expert at describing their expert behavior, and I think it’s great when we (mathematicians or anyone else) have it pointed out to us that we aren’t right about everything. But I don’t feel the authors have quite got the smoking gun they claim to have. As they acknowledge in the study, the proofs they used are all really short. These aren’t the proofs to test the quick-read-thru hypothesis on.

The authors conclude by suggesting that when attempting to teach students how to read proofs, it might be useful to explicitly teach them to mimic the major difference found between novices and experts in the study: in particular, the idea is to teach them to ask themselves if a “warrant” is required to get from one line to the next, to try to come up with one if it is, and then to evaluate it. This idea seems interesting to me, especially in any class where students are expected to read a text containing proofs. (The authors are also calling for research that tests the efficacy of this idea.)

The authors also suggest ways that proof-writing could be changed to make it easier for non-experts to determine validity. They suggest (a) reducing the amount of symbolism to prevent students being distracted by it, and (b) making the between-line warrants more explicit. These ideas strike me as ridiculous. Texts already differ dramatically with respect to (a) and (b), there is no systemic platform from which to influence proof-writing anyway, and in any case as the authors rightly note, there are also costs to both, so the sweet spot in terms of text / symbolism balance isn’t at all clear and neither is the implicit / explicit balance. Maybe I’m being mean.

## Dispatches from the Learning Lab: Partial Understanding Monday, Apr 30 2012

So here’s another one that I suppose is kind of obvious, but nonetheless feels like big, important news to me:

It’s possible to only partly understand what somebody else is saying.

Let me be more specific. When you’re explaining something to me, it’s possible for me to get some idea from it in a clear way, to the point where my understanding registers on my face, while nonetheless having no idea what you’re talking about in the other 7 ideas you were describing.

<Example>

I am a 9th grader in your Algebra I class. You’re teaching me about linear functions. You are explaining to the class how to find the $y$-intercept of a linear function, in slope-intercept form, given that the slope is $4$ and the point $(6,11)$ lies on the line. You explain that the equation has the form $y=mx+b$ and that because we know the point $(6,11)$ is on the line, that this point satisfies the equation. Thus you write

$11=4\cdot 6+b$

on the board. At this point I recognize that we are trying to find $b$ and that we have an easy single-variable linear equation to solve. My face lights up and you take mental note of my engagement. Maybe you even ask for the $y$-intercept, and since I recognize that this must be $b$ I calculate $11-24 = -13$ and raise my hand.

Meanwhile, I have only the vaguest sense of the meaning of the phrase “$y$-intercept.” I have literally no understanding of why I should expect the equation to have the form $y=mx+b$. I have a nagging feeling of dissatisfaction ever since you substituted $(6,11)$ into the equation because I thought $x$ and $y$ were supposed to be the variables but now it looks like $b$ is the variable. Most importantly, I do not understand that the presence of the point on the line implies that its coordinates satisfy the equation of the line and conversely, because on a very basic level I don’t understand what the graph of the function is a picture of. This has been bothering me ever since we started the unit, when you had me plug in a bunch of $x$ values into some equations and obtain corresponding $y$ values, graph them, and then draw a solid line connecting the three or four points. Why am I drawing these lines? What are they pictures of?

Occasionally, I’ve asked a question aimed at getting clarity on some of these basic points. “How did you know to put the 6 and 11 into the equation?” But because I can’t be articulate about what I don’t understand, since I don’t understand it, and you can’t hear what I’m missing in my questions because the theory is complete and whole in your mind, these attempts come to the same unsatisfying conclusion every time. You explain again; I frown; you explain a different way; I say, “I don’t understand.” You, I, and everyone else grow uncomfortable as the impasse continues. Eventually, you offer some thought that has something in it for me to latch onto, just as I latched onto solving for $b$ before. Just to dispel the tension and let you get on with your job, I say, “Ah! Yes, I understand.”

</Example>
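For the record, the one piece the student does latch onto is solid arithmetic; a two-line check in Python, using just the numbers from the example:

```python
# Slope 4, known point (6, 11): solve 11 = 4*6 + b for the intercept b.
m, x, y = 4, 6, 11
b = y - m * x
print(b)               # -13, matching the student's 11 - 24
print(m * x + b == y)  # True: (6, 11) satisfies y = 4x - 13
```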

This example is my attempt to translate a few experiences I’ve had this semester into the setting of high school. The behavior of the student in that last paragraph was typical of me in these situations, though it would be atypical from a high school student, drawing as it does on the resources of my adulthood and educator background to self-advocate, to tolerate awkwardness, even to be aware that my understanding was incomplete. Still, often enough I ended up copping out as the student does above, understanding one of the 8 things that were going on, and latching onto it just so I could allow myself, the teacher and the class to move on gracefully. Conversations with other students indicated that my sense of incomplete understanding was entirely typical, even if my self-advocacy was not.

The take-home lesson is two-fold. Point one is about the limitations of explaining as a method of teaching. Point two is about the limitations of trusting your students’ (verbal or implied) response to your (verbal or implied) question, Do you understand?

The basic answer (as you can tell from the example) is, No, I don’t.

Now I myself love explaining and have done a great deal of it as a teacher. I fancy myself an extremely clear and articulate explainer. But it couldn’t be more abundantly clear, from this side of the desk, how limited is the experience of being explained to. I mean, actually it’s a great, key, important way to learn, but only in small doses and when I’m ready for it, when the groundwork for what you have to say has been properly set.

I am somewhat chastened by this. I am thinking back self-consciously to times when I’ve explained my students’ ears off rather than, in the immortal words of Shawn Cornally, “lay off and let them fucking think for a second.” It’s like I was too taken with the clarity and beauty of the formulation I was offering, or in too much of a hurry to let them work through what they had to work through, or in all likelihood both, to see that more words weren’t going to do any good. Beyond this, I’m thinking back on the faith I’ve put in my ability to read students’ level of understanding from their faces. I maintain that I’m way better at this than my professors, but I don’t think I’ve had enough respect for how you can understand a small part of something and have that feel like a big enough deal to say, and mean, “Oh I get it.” Or to understand a tiny part of something and use that as cover for not understanding the rest.

## Dispatches from the Learning Lab: Inauthentic Agreement Wednesday, Jan 25 2012

Here’s another one. It should be quick.

When a student says, “Is it like this?” or the equivalent, I used to err on the side of “yes” – i.e., even if I wasn’t sure exactly what they were saying, if it sounded like it might make sense, I would say yes. I think this was somewhat a function of the fact that I adopted a generally encouraging posture (this is my personality but also a deliberate choice), but the “yes” itself was just sort of my reflexive response from within this posture (not a deliberate choice).

It never felt quite right, so over time I trained myself instead to say things like, “I can’t understand what you’re saying but I think you might be onto something, but I’m not sure.” I never had concrete evidence that my original response was doing something unhelpful though.

Now I do. In a recent conversation with one of my teachers, several times I said, “Let me explain back to you what I think you’re saying, and you tell me if it’s right…” And he said, “yes yes yes it’s like…” But I didn’t recognize my attempted explanation in what he seemed to be saying yes to. So, it’s official: this is TOTALLY UNHELPFUL. I’m disoriented; that’s why I asked the question. Unless I come away from your answer feeling sure that you understood me, your “yes” only serves to make me more disoriented.

Take-home lesson. Never say “yes” unless you are sure you have understood fully what the student is saying, and agree with it. As I’ve often discussed before, sometimes a “yes” is inappropriate even then; for example, if there’s a danger that the student is trying to foist onto you the work of judging for her- or himself. But if you have any doubt, then the “yes” is definitely inappropriate: the encouragement is fake, and the student is left as unsure as before, now also having exhausted the resource of checking with you. Retrospectively, the only student who even feels good hearing the “yes” in this situation is the one who is playing a Clever Hans game, and in this case it does him or her the disservice of encouraging the game.

## Dispatches from the Learning Lab: Why I Don’t Always Ask My Question Tuesday, Jan 24 2012

One of the many reasons I put myself in a math PhD program is that it is an intense full-time laboratory in which for me to examine my own learning process, and my experience as a participant in math classrooms from the student side. I hope to record many lessons from this laboratory on this blog. Here is one.

As a teacher I have always strongly encouraged people to pipe up when they’re confused, whether working in groups or (especially) at the level of whole-class discussion. To encourage this, I do things like:

* I leave lots of wait time.
* I respond to questions (especially those expressing confusion) with enthusiasm when they are asked, and after they are discussed I point out concrete, specific ways in which the questions advanced the conversation.
* I give (very deeply felt) pep talks about the value of these questions.
* Sometimes I directly solicit questions from people whose faces make it seem like they have one.

I am behind all of these practices. However, in every class that I have taught, whether for students or teachers, including all those of an extended enough length so that the practices would have time to shape the culture, it has always seemed to me that participants are often not asking their questions. This has puzzled me a bit. I’ve generally responded by trying harder: leaving longer wait-time, making more of a point to highlight the value of questions when they happen, giving more strident and frequent pep talks. This hasn’t resolved the matter.

Now I am not about to pronounce a new solution. But I have what for me is a very new insight. I imagine some readers of this blog will read it and be like, “Ben, I could have told you that.” I’m sure you could have, but this wouldn’t have helped me: retrospectively, students have told me it many, many times. But I didn’t get it till I felt it. This is the value of putting yourself in their position.

What I’ve realized since beginning graduate school is that I had an incomplete understanding of why students don’t ask questions. I believed that the only reason not to ask a question is the fear of looking dumb. My approach has been entirely aimed at ameliorating this fear and replacing it with the sense that questions are honored and their contribution is valued.

Now one of the great advantages of going to grad school as an adult, rather than going fresh out of college, is that I have very, very little fear of looking dumb. (In the immortal words of my friend Kiku Polk, you get your “f*ck you” at 30.) To all my early-20s people: your 20s will be wonderful, but if you make sure you keep growing, your 30s will be better.

And one of the great advantages of going to grad school after over a decade as a teacher, is that I have a strong commitment to asking my questions, stemming from the value that I know they have both for myself and the class.

Perhaps as a consequence, I found that in all four of my classes last semester, I asked more questions than anyone else in the room.

Be that as it may, I frequently didn’t ask my questions.

What’s up?

There is an added layer: it is often perceptible that the teacher desires for everyone to understand and appreciate what was just said as clearly as she or he understands and appreciates it. Last night I was in a lecture in which I was hyperaware of not always asking my questions, and part of the dynamic in that case was actually the professor’s enthusiasm about what he was saying! I did ask a number of questions, but one reason I didn’t ask more is that I sort of felt like I was crashing his party! My warm feelings toward this professor actually heightened this effect: messing up someone else’s flow is worse when it’s someone you like.

As I mentioned above, students have been trying to tell me this for years. I never got it, because on some level I always believed that the real problem was that they were afraid to look dumb. I remember a conversation with a particular student who was my advisee as well as my math student. When I pressed her on asking more questions in class, she said something to the effect of, “you know, you’re doing your thing up there, and I don’t want to get in the way.” I literally remember the voice in my head reinterpreting this as a lack of belief in herself. Now I think that that was part of it as well; but my response was all aimed at that, and so didn’t address the whole issue.

Now my process of figuring out how to operationalize this new insight in terms of teaching practice has only just begun, and one reason I am writing about this here is to invite you into this process. I am certainly NOT telling you to withhold your enthusiasm on the grounds that it might make kids not want to interrupt you with questions. Furthermore, evidently when I describe experiences from my graduate classes, I am describing a situation in which the measures you and I have been taking for years to encourage question-asking are mostly absent. I doubt most of my professors have even heard of wait time. Nonetheless, I am sure that this new point of view is fruitful in terms of actual practice. Below are my preliminary thoughts. Please comment.

If I want to really encourage question asking, what I have been doing (aimed at building a culture of question-asking) is necessary, but insufficient. It is also necessary to think about lesson structure with an eye to: how do I design the flow of this lesson so that (at least during significant parts where questions are likely to arise in students’ minds) asking their questions does not feel like an interruption? One model, which is valuable in other ways as well, is to have students’ questions be the desired product of a certain segment of class. For example, when the lesson arrives at a key idea, definition, or conclusion, ask students to turn to their neighbors and discuss the key idea and try to produce a question about it. Then have the pairs or groups report their questions. This way, the questions cannot be interruptions because they are explicitly the very thing that is supposed to be going on right then.

I like this idea but it has limited scope because it requires the point in the lesson at which the questions arise to be planned, and of course this can never contain all the questions I would want to have asked. Another thing to think about is the matter of momentum. I think my discussion of enthusiasm above really revolves around momentum. Enthusiasm generates momentum, but momentum is actually the thing that it hurts to get in the way of. Therefore I submit a second idea: the question of managing my/your own and the class’s momentum. Having forward momentum is obviously a big part of class being engaging, but perhaps it also suppresses spontaneous questions? Or under certain conditions it does?

(In a way this reminds me of the tension – one I am much more confident is an essential one of our profession – between storytelling and avoidance of theft – I discussed a particular case of this tension in the fourth paragraph here. Momentum is aligned with storytelling: a good story generates momentum. Avoiding theft is aligned with inviting questions.)

A last thought is that in a class of 20 or 30, having the class engage every question that pops into any student’s head at any time is obviously not a desirable situation. You might think I thought it was desirable based on the above. But the question is how to empower students to ask questions when we want them. I know that I for one have often known I wanted some questions so I could be responsive to them, and they weren’t forthcoming. The question is about how to change this. Part of the answer is about the culture, valuing the questions, encouraging the risks, and making everyone feel safe; but it’s the other part – how to structurally support the questions – that’s the new inquiry for me. As I said above, please comment.

## Still Here, Still Learning Friday, Mar 11 2011

I last posted in October. I wrote a review of Waiting for Superman that generated more traffic than I’d ever seen before on this blog. Since I had been intending to continue my series on the idea of mathematical talent since the summer, I decided not to post again until I was done with the next installment of that series. But because it involves some research, and I care about it a lot and want to get it just right and tend to get kind of obsessive about things like that, and because there’s been a lot of other stuff going on so I haven’t been working on it consistently, this has kept me from posting anything at all for 4.5 months. So maybe it was time to revisit that agreement with myself?

And a few days ago, JD2718 wrote me an email to the effect of, “yo, what happened to you?”

So, here’s a partial answer -

a) I learned a lot about leadership. One of my jobs this year has been to facilitate the weekly math department meeting at a high school, and plan the agenda for this meeting. This has gotten me involved with the communication channel between the department and the principal. I feel really grateful to have had the opportunity to do this. It has caused me to start to develop a completely different skill set than I’ve ever had to use before. (To give you a whiff of what I mean, it inspired the following facebook status: “Ben Blum-Smith thinks it is important to be a straight-shooter and a diplomat, and that you do each better by doing the other one.”)

b) I learned a lot about training new teachers. Another of my jobs this year has been as a faculty member of an MAT program. In the fall, my colleague Japheth Wood and I taught a “math teaching 101”-type course for our cohort of 12 preservice folks; this winter we taught the “math teaching 102” installment. They’ve been in apprenticeships for 9 weeks and we’ve just gone through observing them actually teach a few times, so now on my mind is – what am I happy with in their teaching? What’s missing? And what implications does all that have for our fall and winter courses?

c) I’ve continued to design and implement a graduate course on algebra and analysis for the faculty of a high school. This has been both awesome and very challenging. We chose to organize the course to culminate with the Fundamental Theorem of Algebra. At the beginning of the year I thought this was a reasonable goal and the course would not feel hurried. Now, 2/3 of the way in, somehow I’ve found myself feeling pressure to go through significant chunks of material at breakneck speed. That tension is of course absolutely part of the lives of all the participants in their own classrooms, so in a way it’s cool that this is parallel; but still. I am implicitly making a case with this course for the principles of math teaching I believe in, so I’d better be living those principles in my teaching of it. A few of them I feel like I’ve been 100% consistent with:

* Every day I will bring you questions that are worth your time, questions that I find exciting to think about even though I already know the content.
* A math course should have a plot, with beginning, middle, end, dramatic tension, resolution. (Math teaching as storytelling.)
* Central to learning math is the interplay between formal/rigorous thoughts, definitions etc. and intuitive notions. I will always stress the connections between the two.

Other principles I feel like I’ve nailed some of the time and totally let slip away other times in my concern to make sure we get to the content:

* Honor your dissatisfaction.
* (Closely related) The most powerful certification of new knowledge is consensus of the learning community, the same way new knowledge is certified in the research community.

3 classes ago I had them prove the irrationality of $\sqrt{2}$, spent the whole period on it, left them all the heavy lifting, noticed and brought out points that were bothering people, and generally aced these last two principles. The last two classes have felt the opposite way. I think I was talking 80% of the time in the most recent class. Lots of questions never got answered because they never got aired; lots of productive thoughts never got formed because they never had time to. Anyway, getting this course right will continue to be an engaging challenge.

d) I applied to doctoral programs in math. Now I need to decide where to go. The choices are NYU, CUNY and Rutgers. I feel very excited and torn.

e) If anybody remembers the ellipse problem that Sam Shah brought back from PCMI, and which I wrote about back in August… Japheth and I have completely solved it. I am going to tease you with this tidbit rather than the solution itself, because we wrote a manuscript on it which we hope to get published.

f) Okay this doesn’t fit under the rubric of “what happened to me” but here are some links you might enjoy:

* A Teacher Story by Anna Mudd. Anna’s blog, Drawmedy, is a beautiful kind of writing which I won’t try to describe. It’s not an education-themed blog so I was delighted to see her take on her experience as a teacher.

* This gem from Vi Hart: Wind and Mr. Ug

* Taylor Mali’s What Teachers Make. This poem is definitely amazing, and if you’ve never seen it, I think you won’t be sorry if you watch it before reading the next sentence. (Pause while you watch the video.) It brings up some ambivalent feelings in me too – these are a story for another time, but here’s the short version: It’s related to the tone of the current national conversation about education, which is all about how the incompetent slovenly dumb*sses in front of our children are f*cking everything up. In this context, Mali’s piece is an eloquent testament to the value of our work, but it also makes me uncomfortable. Mali appears to have been amazingly happy with the job he was doing as a teacher when he wrote and performed this. But I don’t think that (especially in light of the current climate of the conversation) feeling like you’re doing an amazing job should be in any way a requirement for testifying to the value of your work; especially since most of us do not feel that way, most of the time.

* Speaking of the current national conversation about education, a new study by the National Education Policy Center came out on New York City’s charter schools, which are often touted as models for the nation.

* It’s weird to experience yourself as an unwitting participant in a historical zeitgeisty trend, but I do. I have the strong feeling that the traditional distance between the mathematics education community and the mathematics research community is closing, and I, a classroom teacher and teacher trainer entering into a math PhD program, am like completely an example of that. Another is the latest issue of the Notices of the American Mathematical Society (the research community’s professional association); the issue is devoted to education. You can download it for free.

(Thanks, JD2718, for making me write all this.)

## The Talent Lie Monday, Aug 9 2010

Back in the fall when I was a baby blogger I wrote a discussion of Carol Dweck’s research about intelligence praise. I did this because I think this research is intensely important. However, I didn’t really let loose on the subject with the full force of what I have to say about it. The truth is I was shy, because a) I’d just had a kind of frustrating conversation on the subject with Unapologetic at Jesse Johnson’s blog, so I was wary of being misunderstood, and b) more embarrassingly, I was excited by the positive response to my previous post about Clever Hans and I didn’t want to alienate any of my new audience.

Now I am a toddler blogger. My godson, with whom I spent the day a few weeks ago, is an actual toddler.

He is profoundly unconcerned with anybody’s opinion of him, and just blazes forth expressing himself (climbing on things; coveting whatever his big sister is playing with; being turned upside down as much as possible) all day long. I am going to take this as inspiration, and commence a series of posts about the idea of “math smarts” and talent and intelligence more broadly. These posts have two central contentions:

1) People constantly interpret mathematical accomplishment through the lens of math talent or giftedness.

2) This is both factually misleading and horrible for everyone.

Tentatively, here is the table of contents for this series. I may edit these titles, add or remove some, and I’ll add links when I’ve got the posts up. But here’s the plan for now:

I. Why the talent lie is a lie; how to understand math accomplishment outside of it
II. How the talent lie is spread (in pop culture, and inside the discipline of mathematics)
III. How the talent lie hurts people who are “good at math”
IV. How the talent lie hurts people who are “bad at math”
V. How to train students to understand math accomplishment outside of the talent lie
VI. Why the talent lie is so entrenched, even though it is stupid and harmful

I should make more precise what I mean by “the talent lie.” It’s really several variants on a fundamental idea: that people who are really good at math must have been born with a gift. That they must be extra smart. That being good at math (or not) is something that doesn’t change over time. That being smart (or not) doesn’t change. In short, that your intellectual worth, and the worth of your engagement with the field of mathematics in particular, is an already-determined quantity that’s not up to you. That’s the talent lie.

Some examples of the talent lie at work:
* Any time anyone has ever said, “I’m bad at math.”
* Just about any time anybody makes a big deal about the age by which a young person does something intellectual. (Starts talking, starts reading, starts learning calculus…)

(In that last bullet, the “just about” is there only because of the theoretical possibility that a big deal might get made for a reason other than to prognosticate about the person’s ultimate intellectual worth.)

I give you these examples to show that I am not talking about a fringe, outmoded idea but something very mainstream. I will have much more to say about how the talent lie is manifested in the forthcoming posts.

I expect to spend a long time writing them. This project may take all fall, or even the next several years. I believe the message I’m communicating is vital for our field and important more broadly as well. It’s also a very personal message. Like all urban educators and all math teachers, I have a lot of first-hand experience with the damage that the labels “not smart” and “not good at math” can inflict. But I am also speaking as someone who spent my early years being seen by others, and regarding myself, as mathematically gifted. This was a heady and thrilling thing when I was in middle school, but I became vaguely aware of the complications by the end of high school, and with hindsight it’s clear that it left me with baggage that took a decade of teaching, learning and introspection to shake. So my own journey is a big part of the story I’m telling here.

I will save the detailed analysis for the forthcoming posts, which means that I am going to defer a lot of clarification and answering-questions-you-might-have for later. But I would like now to articulate in broad terms what I believe needs to change.

According to the Calvinist doctrine of unconditional election, God already decided whether you are going to be damned or saved, and did this way before you were born. Nothing you can do – not a life of good acts, not a wholehearted and humble commitment to acceptance or faith – can have any effect. The most you can do is scan your life for signs of God’s favor, and read the clues like tea-leaves to see if you are chosen or cast away. Modern American culture doesn’t buy this doctrine from a theological point of view, but is 100% bought in when it comes to math. When a person performs mathematically, we obsessively look at the performance, not on its own terms, but as a sign one way or the other on the person’s underlying mathematical worth, a quantity we imagine was fixed long ago.

We need, as a culture, to gut-renovate our understanding of what’s going on when we see people accomplish impressive mathematical feats. Likewise, when people fail at mathematical tasks. We need to stop seeing people’s mathematical performance as nothing more than the surface manifestation of a well-spring of mathematical gifts or talent they may or may not have. Relatedly but even more importantly, we need to stop reading the tea-leaves of this performance to determine these gifts’ presence or absence. This whole game is bunk.

Not only is it bunk but it’s a crippling distraction, for everyone – teachers, students, parents, and our culture as a whole – from the real job of studying, wandering through, becoming intimate with and standing in awe of the magnificent edifice known as the discipline of mathematics.

When you step to the gate and present yourself before it, math doesn’t give a sh*t about the particular profile of cognitive tasks that are easy and hard for you at this moment in time, and you shouldn’t either. There are institutions that are very keen to divine from this profile your worthiness to enter, but this is the curtain they hide behind to make themselves look bigger than they are. It’s time to tear that curtain down.

More on its way. In the meantime here is some related reading:

* I Speak Math recently tackled this same subject. I plan on drawing on some of the research she links.

* Jesse Johnson and I had a conversation about this stuff close to a year ago, and she wrote about it here and here. I’ll go into much more detail on these themes in the coming posts.

* While not as credentialed, the Wizard of Oz nonetheless has a fair amount in common with wolverine wranglers. See if you see what I mean.

## What’s In the Way of Making Students Prove, part II Friday, Apr 9 2010

I am continuing my thoughts about Kate’s provocative comment regarding why it’s hard to ask students to prove something significant. Kate said:

Or I do say something like “Why should that have to be true? Can we come up with some kind of explanation?” But they have no idea how to even start and it feels unfair and scary to ask them to.

As I said last time there are two big questions here – (1) why can’t they start? and (2) why does it feel unfair and scary to ask them to? Last time I talked about (1), so now -

(2): why does it feel scary and unfair to ask students (say, in 10th grade) to seek a justification?

Here I have two ideas to offer:

a) Time pressure.
b) The Contract. (Which is turning out to be my favorite idea from that article by Patricio Herbst I wrote about last time.)

(a) Bob and Ellen Kaplan talk in their book Out of the Labyrinth about the need for a sense of relaxedness or leisureliness around time in order to run their math circles. Learning proof is the same thing. Creativity can’t be rushed. Any time I’ve successfully gotten an individual or a class to prove something at all difficult, without me intervening to suggest key ideas, the one constant has been that it took longer than expected and we had that time to give it. Every time, afterward I felt sure that the time was much better spent than it would have been any other way. But if in any way I feel a pressing need for the topic or question to get wrapped up, this makes it practically impossible for me to perceive and do what needs to be done to support the class in creating the proof without giving them too much help. It has to be okay for people to sit there stumped for a while and not make any progress. It has to be okay for people to take what I know is a wrong turn and find out it’s a wrong turn on their own. And I have to be listening to the conversation the right way – listening for the direction the ideas are taking, the obstacles coming up, and searching for the least obtrusive possible thing I could say to be helpful when they grind to a halt. If time pressure is also buzzing in my head I can’t listen that way.

I’m now getting into touchy territory but because of my conviction that the most powerful math teaching I’ve ever done has been outside of time pressure, I’ve started to believe that we as a profession need to actively fight to keep our curriculum and standards not just reasonable in scope but actually kind of slim, and when they’re not, and we can’t do anything about it, to actively prioritize depth over coverage in the face of them. I have the luxury of not having a full-time classroom job anymore, so I recognize that my point of view about this is kind of facile. On the other hand, this year I have been working as a coach in a middle school/high school, and it’s clear to me in that context that everybody’s need to get through the whole curriculum is directly at odds with their desire to do something really substantive with any one of the many topics they have to cover. So I’ve been making the case that we need to choose what the most important topics are, and feel the license to: a) treat them very slowly, b) aim for the students to prove the main results, and c) take as long as that takes.

I’ll illustrate one negative effect of time pressure with the second half of the story about my tutoring client I began to tell last time. Fresh on the heels of her triumph in creating and justifying an algorithm for factoring, I botched the next move, violating my own repeated advice to you. I was getting ready to go, but I was excited about her accomplishment, and I wanted to show her its power, and I also didn’t want her to be left with the impression that every monic quadratic can be factored over the integers. So I said something like, “You’re now in a position to prove that
$x^{2}+10x+7$
can’t be factored into linear factors. How?” She did it, but it took several minutes and I ended up leaving late. Most of the time was spent in a back-and-forth in which I repeatedly realized she wasn’t sure what I was going to consider as an answer to my question. Also, she first offered that it couldn’t be factored because 7 was prime, in spite of the fact that earlier she had factored at least one trinomial that had a prime constant term. (This is a classic case of what’s wrong with a “prove X” problem – by telling her it was unfactorable, I unplugged her from her own logical process to determine whether or not it was.) I feel totally sure that if I’d just said, “Factor
$x^{2}+10x+7$,”
she would have taken the same few minutes realizing it was a trick question, and then she would have said, “you can’t do it.” And when I asked her how she knew, she would have provided a totally coherent proof on the spot, on her own.

Now, why didn’t I just say this to begin with? Well, I was getting ready to go – I just had two or three minutes. In that context, the good pedagogical move – the trick question, asking her to factor the irreducible trinomial – didn’t feel fair. Of course, the actual time we were supposed to end wound up being irrelevant, because I stayed late to clean up the mess I’d made by being in a rush. So my sense of being in a hurry didn’t even get us done faster.

I think this is illustrative. It certainly illustrates a dynamic I’ve been part of often enough. You feel the clock or the calendar. In that context you feel like it would be unfair to put the students in the position to struggle for a long time with no guarantee of when they’ll find what you’re contemplating asking them to look for. So you push through it, and do the heavy lifting yourself, or leave it undone. But if the topic has any subtlety (for example, if students proving something is an objective), this isn’t good enough. Often enough, you end up backtracking and reteaching and losing the same time you would have lost by doing it right in the first place, and you’re in a rush because you’re now behind, so you still don’t do it right. And by “you” I mean “me.”

Anyway, again this is the perfect segue –

(b) When I read that Patricio Herbst article, I was so irritated by the theory-heavy style that I cringed at every “theoretical construct” he introduced. (Goodness sakes, how did I survive college as an anthropology major?) This caused me to miss, on the first two passes, that one of them is awesome. Namely, the “didactical contract.” (But can I skip the “didactical” please?) He has got me thinking:

Whenever something feels unfair, I should be asking -

“What is the unspoken agreement between me and my students according to which it is unfair?”

And, once I have an answer –

“Do I like this agreement or do I want to change it?”

The case at hand is Kate’s scenario – you’ve just had kids explore an object and you’ve succeeded in getting them to notice a pattern and make a conjecture about it; you’ve asked them if they can account for the pattern and they are stumped; and asking them to stick with this question feels unfair and scary. What is the unspoken agreement making it unfair?

Okay this is where if you have ever been in this situation you write a comment.

I have been in this situation, so here’s mine:

Reflecting now on past experiences of having this feeling, the common theme is that in one form or another, I’d promised them success if they do what I say. Over and over, I’d reassured them that “all you have to do is do the work and you’ll a) learn the content and b) get a good grade.” The way this promise played out day-to-day was a more immediate promise that if they actually applied themselves to any given task I assigned them, they’d conquer it. In this context, the thing that makes asking them to stick with the task of explaining the pattern they’ve observed feel both unfair and scary is that it violates my promise! The fact is that if the pattern has some subtlety, it’s conceivable that they’ll all sit there forever, apply themselves diligently, and never “in a zillion years” (to quote Kate) come up with a worthwhile explanation for it.

Once this is clear, I have to ask what my reasons were for making this promise. Well, that’s simple. Every full-time classroom job I’ve had has been in an urban public school environment where it was quite hard to get the majority of students to do the work in the first place. Since getting them to do the work was obviously the first step in getting them to learn anything, it seemed totally logical to make this one act the sole key to success. How natural does it feel to take a student who’s got serious questions about the whole ‘school’ enterprise and say, “look, all you have to do is do the work and you’re golden”?

Illustratively, the one class I taught in which it was not a struggle to get students to do the work was AP Calculus, and that was also the course where I felt the most license to give them a really bad*ss open-ended, maybe-nobody-will-get-it type of problem. (E.g., check it: let b take every value from 0 to 6, and draw each line segment in the first quadrant that connects (0,b) to (6-b,0). The union of these line segments is a region bounded by the axes and a very attractive curve. Find the area of this region.) Obviously the “AP” in the title gave me this license; but the truth is that this goes hand-in-hand with the fact that in that class, there was much less of a reason for me to communicate the message that all you have to do is the work.
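For the curious, here is a brute-force numerical sketch of that problem (my own illustration, not part of the original assignment). A point $(x,y)$ with $x, y > 0$ lies on the line through $(0,b)$ and $(6-b,0)$ exactly when $x/(6-b) + y/b = 1$; since that expression blows up as $b$ approaches 0 or 6, the point lies on some segment of the family exactly when its minimum over $b$ is at most 1, and setting the $b$-derivative to zero gives the minimizing $b$ in closed form.

```python
from math import sqrt

# The family of segments joins (0, b) to (6 - b, 0) for b in [0, 6].
# A point (x, y) with x, y > 0 is in the swept region exactly when
#     min over b of  x/(6-b) + y/b  <=  1.
# Setting d/db [x/(6-b) + y/b] = x/(6-b)^2 - y/b^2 = 0 and solving
# gives the minimizing b below.

def in_region(x, y):
    b = 6 * sqrt(y) / (sqrt(x) + sqrt(y))   # argmin of x/(6-b) + y/b
    return x / (6 - b) + y / b <= 1

def estimate_area(n=800):
    """Midpoint-grid estimate of the swept region's area inside [0,6]^2
    (grid midpoints stay off the axes, so in_region never divides by zero)."""
    hits = 0
    for i in range(n):
        for j in range(n):
            x = 6 * (i + 0.5) / n
            y = 6 * (j + 0.5) / n
            if in_region(x, y):
                hits += 1
    return 36 * hits / (n * n)

print(round(estimate_area(), 1))   # -> 6.0
```

This only estimates the area, of course; finding it exactly (and identifying the “very attractive curve”) is the problem.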

Anyway, retrospectively I think this contractual agreement (in all classes but AP Calc) cost me more than it bought me. It put a cap on the amount of creativity I could ask of my students, and ultimately, engaging with math creatively is what makes it rewarding. For example, it was hard to ask students to prove something subtle. I now believe that the tasks that I avoided because they felt unfair are actually central to kids achieving the type of learning I want for them. So what ended up happening was that I violated my promise anyway. Even if you did the work, it wasn’t a guarantee that you’d learn what I wanted you to learn.

I think the issues I faced are pretty general, but my big point here is not the specifics of these issues, but just the question – if a task feels unfair, what is the unspoken agreement (the Contract) making it unfair? And is this contract worth it?

Addendum, Saturday April 10, 7pm:

I didn’t mean to end on such a down note. I actually think this reflecting-on-the-contract thing creates some really powerful and exciting opportunities for us. Actually, a cool project that I invite any of you, and myself, to take on, is to write down the ideal contract between us and our students, and then make it explicit with our students. Some inspiration:

*As Kate E says in the comments, Sam Shah describes in great detail an awesome occasion where he explicitly revised the contract he had with his calculus class, and then the new contract took effect, and the instruction felt powerful and new.

*JD2718 is constantly writing about his teaching in a way that leaves me impressed with how much intention and attention he has given to the contract between him and his students. For example, his students don’t necessarily expect him to tell them the truth:

Also, for those of you who like this sort of thing, two groups finished in what I considered too short a time, so I lied and told them I thought that their answer was too small. Now, they know I lie, but they also know that I know a lot, so they have become more used to responding, “we think we are done because….” which I consider a good thing. I don’t want them to stop because I say enough, but rather because the mathematics suggests that they have finished.

## What’s In the Way of Making Students Prove, part I Wednesday, Apr 7 2010

In response to my last post, Kate Nowak raised the most important set of issues I can think of:

I will absolutely stipulate to all of this:

This is why insisting on too much formality too early is bad for people who are learning how to prove… If someone is insisting on formality from you when you don’t have any reason to doubt something less carefully argued, you will get the idea that proof has nothing to do with what makes sense to you, what you find convincing. But you can’t produce a proof without being guided by this.

All of this adds up to the case I’ve made before, that saying “prove that such-and-such is true” is the wrong problem… The “contract” says we are supposed to give them a chance to work on “proof” as opposed to something else. If they also have to figure out what is even true, that could feel like we’re asking them to do more than just prove something. The problem is that they will never learn how to prove something if we don’t ask them for more.

But I need some SERIOUS training in how exactly one goes about teaching that way. I wasn’t taught that way and none of my colleagues teaches that way. Sometimes I feel like I get close, because I make the kids investigate and measure and conjecture (today, for example: median of a trapezoid), but then I stop before asking them to prove it. Or I do say something like “Why should that have to be true? Can we come up with some kind of explanation?” But they have no idea how to even start and it feels unfair and scary to ask them to. It would not occur to them to draw a picture and extend the legs and think about similar triangles, in a zillion years.

Thank you Kate for getting this conversation started. There are 2 very big questions here:

1) Why don’t they know how to start?
and
2) Why does it feel unfair and scary to ask them to?

These questions are bigger than me, but here are my 2 cents. I’m posting my thoughts on question (1) now because I wanted to get this up. I’ll post on question (2) tomorrow or Friday.

I have 3 thoughts to offer about why they don’t know how to start.

a) Inexperience on the students’ part.
b) Failure of the question to hook into the natural processes students need to use to actually prove things.
c) The vicious cycle.

(a) A big reason they don’t know how to start is that in most cases, no one has asked them for this type of thing before (at least not with any follow-through – more in (c)). If they’re in high school, they’ve had a long time to formulate an idea of what is going to be asked of them in math class, and typically this isn’t it. The content for which they’re being asked to seek justification has undergone 10 years of increases in sophistication since kindergarten, but most students’ development in terms of the creative act of seeking justification is still at the kindergarten level. The later the first occasion when the students are asked to find justification, the harder it will be for them.

People who have been consistently made to justify their mathematical beliefs for a long time know how to start. I know this from private tutoring. If I have sufficient time with a kid, I have the luxury of requiring them to find a reason for every piece of content they learn in school. (I acknowledge readily that this is a luxury, and I only have it if the parents and student have made a sufficient investment of time in tutoring.) The task quickly ceases to be disorienting when it’s required every time they learn any fact. In a recent comment JD2718 wrote about the need to “play math” (meaning, in context, seeking explanations and counterexamples for patterns in numbers) well before a proof course. This is the same point. Seeking explanations and counterexamples is the main activity of research mathematicians. You can’t go from 0 to 60 on this practice; you have to start out slow and ease into it.

Consequently I think it’s totally essential that justification infuse math learning from K on up. (As I said elsewhere, I think that the commutativity of multiplication should be treated as a major theorem needing a thoughtful proof.) This is going to require some major PD for elementary teachers; I actually would love to run some of this PD. Anyway, the fact that this is not the current state of the art adds up to big problems for high school teachers who try to do something more authentic and creative with proof after at least 9 years of schooling during which a typical student has never or hardly ever been asked to come up with a reason to back up their belief. (At least, not with follow-through – again, more in (c).) Why would it ever occur to them to extend the legs? Think about similar triangles? The act of coming up with a proof is essentially creative. You don’t just get creative on the spot in a domain in which you’ve never created before.

(b) Another variable is the way the question is posed and the expectation for what will count as an answer. Especially when you’re first learning about proof, a request for justification has to hook into some natural processes in order for you to respond to it effectively. If you’re an experienced prover, you can hook a problem into these processes intentionally, but when you’re starting out, you can’t. The problem has to be posed just-so to make your natural processes connect to it.

Here’s what I’m getting at: seeking an explanation for something that puzzles you is a natural act. Accounting for your belief is another one. Outside of math class, if you assert a belief and somebody says, “why do you think so?” you’ll probably answer the question fluently. Coherent or not, you won’t be disoriented by the question – you’ll have something to say. And if something is actually puzzling you, you are actually slightly irritated until you’ve made the whole thing resolve itself in one way or another. (Some of us are in the habit of shrugging and going “well, that’s just a mystery of life.” This is one way to make it resolve. Even so, I don’t believe this is anyone’s first response to puzzlement, at least in non-academic contexts, unless there is a lot of weed involved.)

The act of mathematical proof is supposed to plug into these two natural processes. They are what give you both motivation for sticking with the question, and direction in searching for an answer. But in math class, justification questions often fail to hook into these processes. The big example is what I keep harping on: if the problem is “prove X,” and you’re new to this game, you already know X because the question implicitly told you it. Your “this is bothering me” process is missed entirely since there’s no uncertainty anywhere, and your “why do you think so?” process can only be honestly answered by “you told me to prove it,” which doesn’t count as a proof. It’s better if you didn’t start out knowing the right answer, but even here, the question can fail to connect to these processes. For example, suppose you are measuring something, and you notice a pattern, and you make a conjecture about it. If you’ve never seen a pattern in math class hold for 5 cases and fail later, then all you have to do is notice the pattern and you’re satisfied. Again, nothing is bothering you because you don’t have any uncertainty. Meanwhile, the honest answer to “why do you think so?” is “it worked 5 times.” This is one reason why it’s important for teachers at all levels not to act like something is established fact once it has been noticed as a pattern, and why I’m still hoping people contribute to my earlier call for problems that involve a pattern holding for the first few cases and failing later. (Eventually if I get enough stuff I’ll put the brainstorm in a single post. Thanks to all who have contributed so far.)
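One classic specimen of a pattern that holds for exactly 5 cases and then fails (a standard example, offered here as an illustration rather than drawn from that brainstorm): put $n$ points on a circle in general position, connect every pair with a chord, and count the regions inside the circle. The count is known to be $1 + \binom{n}{2} + \binom{n}{4}$, which matches the doubling pattern $2^{n-1}$ for $n = 1$ through $5$ and then breaks.

```python
from math import comb

# Regions formed inside a circle by all chords among n points on the
# circle in general position (no three chords meeting at an interior
# point). The known closed form:
#     regions(n) = 1 + C(n, 2) + C(n, 4)

def regions(n):
    return 1 + comb(n, 2) + comb(n, 4)

for n in range(1, 8):
    print(n, regions(n), 2 ** (n - 1))
# n = 1..5 give 1, 2, 4, 8, 16 -- matching 2^(n-1) --
# then n = 6 gives 31 (not 32) and n = 7 gives 57 (not 64).
```

A student who has noticed the 1, 2, 4, 8, 16 pattern and confidently conjectured $2^{n-1}$ is set up for exactly the cognitive dissonance described above when the sixth case comes out to 31.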

The question is most effective if it taps these processes. One way to do it: pose a problem – not a proof problem, just a figure-it-out problem – that the students don’t already know a process for, but that’s easy enough that they can find a solution. Then pose another problem that can be solved the same way. Then another one. Pretty soon the students have developed an algorithm. Now, the question “how do you know that works?” taps the “why do you think so?” process. The students do think so, for some sort of mathematical reasons, because they themselves devised the algorithm, in response to the original problem. They can come up with a good answer. I had a great time doing this very thing with a tutoring client two weeks ago. She’d just learned how to FOIL. I posed to her some simple factoring problems, such as
$x^{2}+5x+6$
They were all reducible monic quadratics. She came up with one of the standard methods, totally on her own: “This really isn’t that hard. You just think of all the numbers that multiply to the last number, and you see which ones add to the middle number.” I asked her why that would work and she didn’t miss a beat, since the whole thing was her idea in the first place. She was a total Clever Hans when I started working with her two years ago. Yes, I’m proud of both of us. I botched the very next move, though – more in the forthcoming followup post.
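Her method can be sketched in a few lines of code – my own illustration of the procedure she described, not anything from the session: for x² + bx + c, look through the integer pairs that multiply to c for one that adds to b.

```python
def factor_monic(b, c):
    """Factor x^2 + bx + c as (x + p)(x + q), if possible, by checking
    every integer pair with p * q == c for one with p + q == b.
    (A sketch; assumes c != 0.)"""
    for p in range(-abs(c), abs(c) + 1):
        if p != 0 and c % p == 0:
            q = c // p
            if p + q == b:
                return (p, q)
    return None  # irreducible over the integers

print(factor_monic(5, 6))   # x^2 + 5x + 6 = (x + 2)(x + 3)
# (2, 3)
```

The point of asking her “why does that work?” is exactly that she could narrate the loop above herself: every candidate pair multiplies to the last number, so the one that also adds to the middle number must be the factorization.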

To tap the other process you have to generate some sort of cognitive dissonance for the students. Ideally, I would like my students to experience cognitive dissonance the minute they see a pattern that is not yet explained – why (the hell) is this happening? In my experience it helps to have this attitude myself. (E.g. sometimes I act kind of paranoid when something happens 2 times without being proven, and increasingly agitated if it happens a 3rd time.) But this is only cultivated over time. To generate cognitive dissonance in students who don’t already care about justification, they have to see something happen that contradicts their intuition. JD2718 recently made a suggestion that strikes me as along these lines (I’ve never tried this one myself):

“9 has 3 factors (1, 3, 9). 6 has four (1, 2, 3, 6). So, even numbers have even numbers of factors, and odd numbers have odd numbers of factors. Right?”

This conclusion would appeal to the intuition of just about every student I’ve taught. I’d probably have to prod them to even get them to test another example because they’d already be convinced. If I do get them to try another one, I’ll make sure the first one I ask about fits the pattern (say 10, or 25) – so they’ll be even more convinced. In that context, the first counterexample they come across is going to bother them. Then the “what (the hell) is going on here?” process is engaged.
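A quick script (mine, not JD2718’s) shows why this setup is so well chosen: the quoted examples, plus 10 and 25, all fit the conjecture, but in fact only perfect squares have an odd number of factors – divisors pair up as d and n/d except when d equals n/d – so counterexamples are sitting everywhere nearby, waiting to bother anyone who tests one more case.

```python
def num_factors(n):
    """Count the divisors of n by trial division."""
    return sum(1 for d in range(1, n + 1) if n % d == 0)

# The hand-picked examples all fit the conjecture:
for n in (9, 6, 10, 25):
    print(n, num_factors(n))   # 9->3, 6->4, 10->4, 25->3

# But only perfect squares have an odd divisor count, so the
# conjecture fails for every odd non-square and every even square:
print([n for n in range(1, 20) if n % 2 != num_factors(n) % 2])
# [3, 4, 5, 7, 11, 13, 15, 16, 17, 19]
```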

At any rate, these two processes, or the like, are needed to orient students as they try to prove things. I think a big part of why students flop when we ask them to justify is that the question fails to hook students into these processes.

(c) One final thought about why they have a hard time proving: there’s a vicious cycle at work here. They already do have a hard time. That means we all know that if we try to get them to do it, it’s going to take a really long time, or they’re going to fail miserably, or both. In that context, it’s very hard to take this on. It’s even harder to follow through and really make sure it happens, and not to cop out in some way (for example, as in the article I wrote about last time). But this just exacerbates the situation I described in (a) above. Often the students in front of us are so inexperienced not because nobody has ever contemplated trying to get them to prove anything before but because many people have contemplated it and then opted not to, or tried it and then given up. Ultimately the task felt unfair in some way. This is the perfect segue into question (2), which I’ll engage in a followup post in the next day or two.

## Annotated Bibliography on Proof, part I – The Double Bind Friday, Mar 19 2010

In an earlier post I made a brief reference to an article about proving that I had just begun to read. I’ve now read it.

Patricio Herbst, “Engaging Students in Proving: A Double-Bind on the Teacher” in Journal for Research in Mathematics Education, Vol. 33, Number 3, May 2002

It’s available from JSTOR but I couldn’t find it for free.

This is a very thought-provoking article. I have to confess I also found it quite frustrating stylistically. I had to read the whole thing, underlining and scribbling notes in the margins, reread several sections multiple times, and then reread the whole thing start to finish to feel like I fully understood what the man is saying.

I am going to attempt to distill the arguments in a concise and readable way. Afterward, I’ll provide some commentary.

Herbst’s argument

Herbst analyzes a classroom episode he observed and videotaped, of an advanced 9th grade integrated math class, in which the class tried to create a two-column proof of a simple proposition about angles. He sees the episode as “a typical case of a teacher engaging students in proving.” (p. 185) The proof had been assigned as an exercise for homework the previous night and the students had struggled with it. The teacher asked the class to construct a proof together; they made a little progress but then floundered. They began, alternately, to offer unproductive variants on the premises, and to assert the conclusion without adequate justification. The teacher intervened and suggested the key ideas to make progress. She framed her interventions as advice for future situations. Afterward, the students complained that the proof had been too hard, and the teacher said something to the effect that she wouldn’t assign too many problems like that on a test.

Herbst argues that:
a) When the teacher and students undertook the proof exercise as a class, they entered into a sort of “contract” where they were each expected to do a certain kind of thing.
b) The specifics of this contract placed conflicting demands on the teacher, and also prevented the students from taking certain steps that were needed to complete the problem.
c) This explains why the teacher intervened to give the key ideas in the proof, why she framed her interventions as she did, and why afterward, the students saw the proof as particularly hard, and why the teacher granted this.

More broadly, Herbst is making a case that two-column proof exercises tend to be doomed to lead to this sort of situation. Even more broadly, that any task that attempts to isolate proof as the skill being learned is similarly doomed. In other words, that in learning proof, form can’t be separated from substance. Learning about proof has to be integrated with learning the actual mathematical content.

Here is how his argument goes:

1) In the class, “proof” was understood as “transforming premises step-by-step into a conclusion by a sequence of logical deductions, with a reason cited for each step in the sequence.” With this understanding, the teacher’s job on the given occasion was to provide the students an opportunity to “do a proof,” and the students’ job was to “do a proof.” Thus the teacher and students entered into a “contract” with the following clauses:

Teacher’s Job: I’ll give you a task in which the hard part, the part that requires your thinking, is organizing the information into a chain of logical deductions. The task will be fair, i.e. you should be able to do it.

Students’ Job: We’ll take the givens and transform them step-by-step into the conclusion.

2) In order to do her job according to the contract, the teacher picked a task with the following features:

*it explicitly asked for a proof
*the proposition to be proved was stated at the outset
*the “givens” were identified at the outset
*the number of concepts involved was small
*a diagram was provided with a labeling that supported reading it a certain way

(Here is the task: the problem; the diagram. This is on p. 183; the problem came from Integrated Mathematics 2 by Rubinstein et al., published 1995 by McDougal Littell.)

The task was designed to try to keep the students’ work limited to formally organizing ideas into a proof. Its design overtly (e.g. via the statement of what was to be proved) and covertly (e.g. via the labeling of the diagram) suggested the ideas that the students were supposed to use, so they wouldn’t be responsible for coming up with these ideas and could concentrate on the formal task of organizing them into a proof.

3) In spite of the way the task was set up, the students floundered when they tried to do their job. One way to explain this is that the intended proof had two points where it required students to do something other than transform the givens step-by-step. One was to glean information from the diagram that was not explicitly stated in the givens (specifically, that angle ABC is composed of, so its measure is the sum of the measures of, angles CBD, DBE, EBF, and FBA); another was to write down an equation based on this information and manipulate it algebraically. Students did not produce these moves; this could be a product of the fact that, adhering to the letter of the contract, they thought they were supposed to just keep looking at the givens and transforming them, so these more flexible, broader-scope moves didn’t occur to them.

4) When these difficulties emerged, the teacher was in a double bind. On the one hand, if she intervened and said “you may need to look at these broader types of moves,” she would be giving away too much information; if the students then succeeded in completing the proof, it wouldn’t feel like they really did it themselves (more or less, this is what actually happened). On the other hand, if she didn’t intervene and allowed the students to keep working as they had been, the proof might never get completed. They would probably be left with frustration and the impression that the task was too hard. It would seem as though the teacher hadn’t done her job according to the contract.

5) The teacher dealt with this situation by intervening and framing her intervention as “advice for future proving endeavors.” By doing this she transformed the activity from “an opportunity for students’ joint production of a proof” to “an opportunity to learn a strategy for future use in proving.” (p. 199) This validated the time spent, but the students still didn’t really do the proving.

Herbst makes the case that the dynamics at play here are going to be at play any time the formal skill of “doing proofs” becomes its own curricular goal. Here is his general argument:

6) As long as the goal is the formal skill of “doing proofs,” the teacher needs to create proof tasks where the only thing the student is asked to do is organize ideas into a logical proof. If the task also requires the student to creatively generate the ideas for the proof, then the task isn’t fair – a kid might be perfectly able to organize a logical argument, but still fail, because they couldn’t think of these ideas. So the teacher needs to relieve the student of the burden of generating ideas for the proof by overtly and covertly giving these ideas to the student ahead of time. (For example, by stating the proposition to be proved at the outset, and/or by providing a diagram labeled a certain way.)

7) But if the task is organized in this way, then students are likely to fail to pick up on the covertly-suggested ideas, or correctly use the overtly-suggested ideas, because they are under the impression that what they are supposed to do is logically transform the premises step-by-step into the conclusion, but really they are also supposed to pick up on all these secret cues the problem is giving them. The form of a proof inevitably depends too much on the specific content for the students to be able to produce it just by trying to transform premises. When this breakdown happens, the teacher is in a double bind: s/he can’t intervene in the middle of the process to direct students to the hidden ideas without making them feel like they didn’t really do the work; but s/he can’t let them just continue barking up the wrong tree till kingdom come.

Herbst concludes that learning the skill of “proof” can’t be separated out from the mathematical content being proven. “Any sort of tools and norms that teachers can use to engage students in proving must allow room for the teacher to negotiate with the class what counts as proof in the context of the investigation of specific, substantive questions.” (p. 200)

My thoughts

A) The classroom excerpts in the article felt extremely familiar (both from classes I’ve taught and watched). In particular, the way the initial progress ground to a halt and then students either just transformed premises unhelpfully or stated the conclusion without justification.

Also familiar from my own practice, teaching a “proof” unit in an Algebra I class in 2001-2004, was the dynamic where you search specifically for problems where the ideas in the proof aren’t too hard to come up with, so students can concentrate on the task of organizing the proof and not get tripped up on generating these ideas. And how problematic this becomes if the students don’t come up with the ideas even when you tried to make them easy. At a few points while reading I felt confused by Herbst’s contention that choosing an exercise that “made the substance of the expected proof available to students” (p. 191) was something the teacher was doing in order to fulfill her “contract” – I mean, it didn’t work, right? The kids flopped anyway! So what good is it doing the teacher? But then I realized that I knew exactly what he was talking about from my own experience trying to teach a “proof unit.” I remember thinking thoughts like, “this problem doesn’t require too many difficult insights, so it’ll be good for getting them to focus on the logical structure.” Of course they flopped anyway with me too, and so it didn’t do me any good. But the important thing is that I felt compelled by that thought while choosing the problem. The fact that I turned out to be totally wrong doesn’t mean that I hadn’t used that consideration. It amounted to me choosing problems specifically to avoid asking my kids to have to demonstrate any creative insight.

In hindsight, this seems like an insane way to teach proof. Talk about playing small. Coming up with creative ideas is what makes proving things fun. How dare I claim to be teaching proof and deny kids the opportunity to touch the most exciting part of it? Of course, the alternative, to ask kids to actually develop creative justifications, will never work either if proof is ghettoized into a unit of Algebra I or Geometry. (I’ve been doing math for 9 years without being asked to generate a single original idea, and now all of a sudden you want me to be creative? WTF?) Looking for proof should be a daily or weekly part of every single math class, all the way down. (But it shouldn’t be too formal too soon. More on this in (B).)

B) I buy Herbst’s case that the kids’ overly rigid understanding of what they were supposed to be doing (“transforming the premises step-by-step into the conclusion”) probably contributed to their difficulty. But I have a lot more to say about what was in their way. The real problem the kids were having is what I’ve talked about before: in the class session described in the article, there is absolutely no connection between the kids’ attempts at proof and their actual, real-live sense of what they know and don’t know. In fact, these two things have been forcibly, violently separated. That’s why they can’t do it. To get concrete, here’s an excerpt (Andie is the teacher):

“Andie then asked, ‘What are we trying to prove?’ – a question that students could answer but then were unable to offer further ideas. So Andie asked, ‘How are we going to get to FBD? Do we know anything about FBD?’ A student’s response, ‘Maybe it’s a right angle,’ led Andie to ask whether the students knew ‘what makes up FBD.’ The answer, ‘FBE plus EBD,’ justified by the ‘whole and parts postulate,’ started a small discussion as to whether they were then entitled to say ‘ABF plus EBD equals ninety degrees’ or whether they ‘[didn't] know it’ yet.” (p. 184)

ARE YOU KIDDING ME? In this classroom, as described, “we don’t know that yet” clearly doesn’t mean we don’t know it yet. Everybody in the room in fact does know that ABF plus EBD is 90, because they were told to prove it (well, the equivalent), so it must be true. Not only that but it looks in the diagram like it’s true. When the students say “we don’t know that yet,” what they mean is “we don’t know if the mysterious authority that certifies that proofs are correct is going to allow us to claim that yet.” What they don’t realize (why should they? it’s been forcibly hidden from them) is that the only mysterious authority that should be involved is their own sense of conviction.

After all, the ultimate source of mathematical authority in the world is the collective conviction of mathematicians. That’s why our standards of rigor have changed so much over time. Calculus was practiced for well over a hundred years before Cauchy bothered to prove its central theorems from a (fairly) precise definition of limits. It wasn’t until the late nineteenth century that Dedekind saw a need to provide a rigorous construction of the reals from the rationals, in order to be able to prove theorems about the reals (such as the fact that they are complete).

Historically, what has pushed the mathematical community’s sense of rigor forward is not an insistence on greater rigor from an outside source, but the encounter with new ideas and examples that caused a crisis of knowledge. Hippasus’ proof that the diagonal of a square is incommensurate with its side, when all the Pythagoreans thought all lengths were commensurate, led to Eudoxus’ theory of proportions. The rigorization of calculus (beginning with Cauchy) was driven by the recognition that Fourier series didn’t behave the way anybody expected them to. (At least, according to this awesome book by David Bressoud.)

How people learn about proof and rigor is the same whether we’re talking about a class or about human civilization. We learn to prove by being challenged to convince ourselves of things. We learn rigor by encountering examples and ideas that throw our assumptions into question. If you believe something, but then somebody (like your teacher) whose authority you trust says to you “you don’t know that yet,” you are stuck. Your mathematical soul is at an impasse. You believe it, but you also believe somebody telling you you don’t know it yet. You get the message that you can’t trust your own reasoning. And in this state you will never, ever produce a coherent proof. If there’s a gap or a flaw in your reasoning, then what you need in order to grow is to be shown this flaw on your own terms. To see an example that contradicts your assertion, or to hear a counterargument that debunks yours – but not from the mouth of an authority you would trust without thinking it through.

This is why insisting on too much formality too early is bad for people who are learning how to prove. The need for formal rigor has to be earned through crises of knowledge. If we want our students to develop an appreciation for a formal proof we have to show them counterexamples to arguments they produced less carefully. If someone is insisting on formality from you when you don’t have any reason to doubt something less carefully argued, you will get the idea that proof has nothing to do with what makes sense to you, what you find convincing. But you can’t produce a proof without being guided by this.

All of this adds up to the case I’ve made before, that saying “prove that such-and-such is true” is the wrong problem for students first encountering proof. The minute you say it, they know it’s true; and this gets in the way of their natural mathematical reasoning process giving them a readout on what’s true. Herbst is arguing that we go for problems like this because of the “contract” that says we are supposed to give them a chance to work on “proof” as opposed to something else. If they also have to figure out what is even true, that could feel like we’re asking them to do more than just prove something. The problem is that they will never learn how to prove something if we don’t ask them for more.

C) Herbst writes that, in the episode he analyzed, the “contract” forced the teacher to make sure the proof was completed. If the class had not been able to complete the proof, then the students might have interpreted their difficulty as meaning that the problem was too hard for them. (Of course this happened anyway.) Herbst says that the fear of this possibility would have made the teacher feel pressed to intervene. (p. 195)

I’m not sure that the teacher’s sense that she needed to intervene to move the process along was based on this consideration, but I suspect he’s right that it was forced by the implicit “contract.” Most math classes that I have observed or taught make sure any question the class works on jointly is resolved that period. In this way we create the expectation that if a class undertakes a proof (or any other juicy math task) together, the teacher will make sure it gets finished. This trains the kiddies to be uncomfortable with irresolution, and quite possibly to experience a class segment that ends on a note of irresolution as the teacher’s failure to do her “job.”

Of course, the discipline of mathematics is rife with irresolution. Questions posed, worked on, and not yet answered, are its lifeblood. They are fertile. Recently I’ve been reading about the history of algebraic number theory, and I learned that it was Ernst Kummer’s work on Fermat’s Last Theorem that led not only to the entire field of algebraic number theory, but to the notion of an ideal that is totally fundamental to ring theory.

I don’t have a straightforward conclusion to draw but given the fertility of irresolution in the history of mathematics, doesn’t it make sense to think about how to make our students more comfortable with it?

* * * * *

This is the first of what I hope will be a series of posts that engage with provocative pieces about proof. But I have to put this off for a month or two and switch gears. For the group theory class I’ve been teaching, I have been researching the history of the theory of equation solving, from the ancient Babylonians thru the birth of Galois theory. So my next post is going to be about two landmark algebra textbooks: Muhammad ibn Musa’s Compendium on Calculating by Completion and Reduction (the original Arabic name for which is totally the origin of the word ‘algebra’ – how awesome is that?), and Girolamo Cardano’s The Great Art which is the first published discussion of the general solution of cubic equations. I’ve read the first of these, just begun the second, and am totally excited to tell you about them. (In a week or hopefully not more than two.)
