Problems and Textbooks

If my students only solve a narrow range of problems, they will not be able to apply their knowledge in new contexts. If they solve a broad range of problems, they are more likely to develop deep knowledge. All my clever demonstrations, cute explanations, and bad jokes matter a lot less than I like to think. What matters is students thinking about mathematical ideas, and the best way to get students thinking is to have them solve problems.

I’ve spent some time over the last few weeks trying to expand my knowledge of problems. Mostly, this has involved finding interesting problems in my textbook and working through them. I use the Larson, Hostetler & Edwards Precalculus textbook sporadically but most of my curriculum is a homebrew. While I enjoy the freedom to teach what I want to teach, I also run the risk of teaching topics in narrow ways based on my knowledge and biases.

There’s a lot of drudgery in textbooks, and some are better than others. I didn’t work through every problem. But flipping to the last page of each section led me to a surprising variety of interesting problems. I found plenty of challenges and new perspectives and saved lots of problems to use in my curriculum. I think textbooks can get an unfair reputation. For the most part they are collections of examples and problems. Examples and problems are the backbone of any math curriculum. Teaching straight from the textbook can be incredibly uninspiring if I just parrot the examples and assign 1-33 odd, but if I insist on inventing everything myself I’m missing the opportunity to save myself effort and expand my knowledge of many topics.

The largest danger of textbooks, in my opinion, is the structure of the text and not the problems themselves. There’s an implied pedagogy in typical textbook design. Start with examples, then assign students some repetitive practice. Fast students might get to some harder problems at the end, but probably not. I want to use some of the new and challenging problems I’m finding in my examples. Rather than only teaching the basics and hoping a few students can figure out hard problems, I can raise expectations by making challenging and unusual problems an explicit part of my teaching, providing students with support, leading discussions, sharing perspectives, and summarizing takeaways. The drudgery of textbooks to me is in the repetition of paint-by-numbers mathematics. I can enrich my curriculum by incorporating variety. It’s all there, it just needs to be structured in the right way.

Cognitive Biases as a Teacher

I love learning about cognitive science and psychology, and applying that learning to my classroom. But I’ve noticed that research often focuses more on understanding the minds of my students than understanding my own mind as a teacher. I’m not sure why this is, but there are plenty of things that I have learned about how my own mind works that influence my teaching. Here are three that I try to think about on a regular basis:

Confirmation bias. Humans don’t like to change our minds or be wrong. But more than that, our minds filter information around us in ways that confirm our previous beliefs. I think about this a lot on two levels. First, I have certain beliefs about what effective teaching looks like. I’m likely to focus on evidence that my pedagogy is working and ignore cases where it isn’t. Second, I make assumptions about my students, and I’m likely to reinforce those assumptions without effort. I need to consciously seek out evidence that disconfirms my beliefs — in this case, evidence that my teaching isn’t working as well as I want to believe, and evidence that my students are not who I assumed them to be.

The curse of knowledge. Humans tend to assume that others have the same knowledge we do, and struggle to recognize the ways our own knowledge allows us to do things. I like to think I know a fair amount about math. But that knowledge prevents me from understanding what it is like to be a student in a math class. This influences my ability to empathize with students, and also to break down content to help it make sense. Having proved lots of trigonometric identities in my life, I am fluent in lots of little skills that I forget I’ve even learned, and when I forget what those skills are I can’t set my students up for success. This means I need to take time to better understand what I already know, and stay open to learning familiar ideas from a new perspective.

Fundamental attribution error. Humans are biased to assume that the behavior of people in front of us represents who those people are. We assume that if a person says something mean they are a mean person, or if they struggle to explain something they are inarticulate. But the math classroom is only one context, and humans change between contexts. I need to resist the urge to categorize and judge my students, and give them the chance to break those boundaries. I’m often surprised by my students’ passions and interests outside of math. Many young people don’t like math very much, and for good reasons. I need to recognize when I am extrapolating based on a limited sample size, and seek out opportunities to broaden what I know about my students.

I’m far from perfect, and understanding some of my biases helps me to recognize when I make mistakes and correct them. These three psychological phenomena are always operating in the background of my mind. Better understanding each bias helps me to recognize their impact and correct for their negative consequences.

(Not) Learning From Problems

I love Catriona Agg’s geometry puzzles. I enjoyed playing with this one from two weeks ago:

I had a hard time solving it. I floundered for a while, then used a convoluted strategy involving four equations and four variables. The answer popped out, but when I looked in the replies I found many more concise ways to do it.

Truth is I’m not that great at solving geometry puzzles like this one. I often struggle with Catriona’s problems. I’m not great with common strategies like using similar triangles, or drawing a circle, or lots of other things. And here’s the tough part — as I’ve spent time exploring these puzzles over the last year, I haven’t gotten much better.

I think this is a useful example of the difference between solving a problem and learning math.

Catriona writes great problems, but the problems aren’t designed to teach me new things. They aren’t sequenced to build on each other, there aren’t opportunities to practice key ideas, there’s no summary or discussion of takeaways for a learner like me. It’s hard to learn solely from problems, especially on a topic I’m not already good at. I might come across an interesting principle in some problem, but I’m cognitively overloaded while trying to figure it out. By the time I figure out whatever the idea is I’m ready to move on to something else and forget what I might have learned. It’s not a recipe for durable learning.

If I want students to learn from problems I need to embed time to process and codify learning, practice opportunities, and opportunities to transfer the mathematical ideas to new contexts. The role of a teacher is to do all of that — to figure out what we want students to learn and get students thinking about those ideas. That’s different from having students solve a single problem and hope the learning sticks.

I don’t mean to be critical. I don’t think Catriona is trying to teach things on Twitter, only to have fun and share the beauty of math. But exploring her problems has been eye-opening for me as a novice. I love moments like this because they help me better empathize with my students’ experiences in math class. Learning is hard, and it’s easy to forget that when I’m teaching the same topics and same problems I’ve used for years.

What I’ve Learned From #DisruptTexts

I’m hesitant to wade into the #DisruptTexts debate. I’m a math teacher, what do I know about choosing texts to read in English classes? But I’ve learned a lot observing the important work of questioning the traditional literary canon. I am sad to see bad-faith attacks that mischaracterize what #DisruptTexts is about, and I’d like to offer my perspective on the movement.

First, I want to engage with a legitimate argument for the canon. I read The Crucible in school. It explores a literal witch hunt and was written during the era of McCarthyism in the United States. The play gives context to a historical era that I hope we can learn from. The play also gives context to the phrase “witch hunt.” I have a better understanding of that phrase and its implications because I’ve read The Crucible. One important aspect of this type of learning is that it’s often implicit. When I hear the phrase “witch hunt” I don’t immediately think of John Proctor, but my knowledge still helps me to better understand the world around me. This is only one example. There are other places in the traditional canon that build useful knowledge. And that’s the argument for the canon: these texts have stood the test of time, form a foundation for what it means to be educated, and provide access to cultural references.

There’s a caricature going around that #DisruptTexts is about banning books and throwing out the canon entirely. That’s not what I’ve observed. Instead, #DisruptTexts is about interrogating what students learn from the texts they read, and making informed decisions about what they should read and how they should read it. For instance, the website has a great article discussing The Crucible. Reading that article I learned a lot that I didn’t notice when I read the play in school. The play deals in stereotypes and elevates some perspectives at the expense of others. So while I learned useful lessons about witch hunts, the text also reinforced stereotypes and tired narratives about good intentions that also impact how I see the world today. A lot of that learning is implicit, but it still shapes what young people learn in school. #DisruptTexts isn’t about banning or censoring. It’s about unpacking the lessons students learn from texts, teaching traditional texts with a critical eye toward those lessons, and replacing others with new, valuable perspectives.

The Crucible offered me one useful lesson, but those lessons aren’t unique to the traditional canon. In the last few years I have developed a deeper understanding of police violence by reading The Hate U Give, a deeper understanding of the complexities of immigration by reading Exit West, and a deeper understanding of prejudice by reading the Broken Earth trilogy. The canon doesn’t have a monopoly on important knowledge or important learning. That’s why I support the work of #DisruptTexts. I read far too many white authors and traditional narratives when I was in school. More diverse voices and perspectives would have enriched my education and broadened my world, and I’m still doing work to play catch-up.

So where does math come into this? I think that the #DisruptTexts folks are way ahead of any comparable efforts in the math community. The closest thing to a canon in math class is probably our race through algebra to calculus. There are lots of types of mathematical thinking, and we choose to value complicated symbol-pushing and abstraction as the end goal of high school math education. That’s a choice — there are lots of other directions we could head. We could choose statistics, probability, mathematical modeling, number theory, computer science, data science, and more.

Why do we teach algebra? There’s an argument for it, definitely. It’s the foundation of the math we use in disciplines like engineering. But there are also arguments against it. What I love about #DisruptTexts is the dialogue and community. They create space to have hard conversations around what texts to read, how best to read them, and what they want students to learn. Those are conversations I wish we had more often in the math education community. Too often math educators see curriculum and standards as static, taking what we teach for granted and trying to figure out the best way to teach within those constraints.

To be fair, there are plenty of efforts heading in this direction. NCTM released Catalyzing Change, which addressed many of these themes, and plenty of schools are having similar conversations. But they haven’t percolated to the surface the same way #DisruptTexts has. And I admire the depth and sophistication of the conversations I see around the texts teachers use in English class. We need to ask some of those same questions. What do students actually learn from algebra – not what we wish they would learn, but what they actually learn? What knowledge do students use implicitly, without realizing they are using it? When do they use that knowledge? Which pieces are useful, and which are worth scrapping? What do we most want students to be able to do with math outside of the math classroom? To what extent do we teach students that they are bad at math? Would that change if we changed what we taught? What other implicit lessons do we teach without realizing it? What is the hidden curriculum of math class?

These questions and more are worth asking. And again, I know many teachers ask them every day. But #DisruptTexts provides a model for what it looks like to build a community around asking hard questions, and engaging in dialogue about those questions. I think we have a lot to learn, and I’m very grateful to the #DisruptTexts folks for offering a model that we can learn from.

A Small Change

When I teach rational functions, I always use this task after students have gained some fluency in graphing simple rational functions:

Last basketball season, [student name] made 21 of her first 30 free throws, and then went on a hot streak and made every single free throw after that. Write a function for her free throw shooting percentage as a function of shots taken (after the first 30).

What do the horizontal asymptote, y-intercept, x-intercept, and vertical asymptote represent in this situation? On what domain does this function make sense?

(credit to Rachel, who I originally stole the problem from)

The problem has a few other fun extensions. I can give students new functions to interpret and describe what they say about a basketball player’s shooting skills, get into the weeds of domain and range, throw in a problem about field goal percentage and average points per shot, or more. The problems aren’t anything that special. I’m sure many teachers use similar problems in their classes. But I’ve found this sequence useful for starting interesting discussions, getting students to engage with applications of a topic that doesn’t have very many applications, and interpreting a complicated graph in context.

This year when I taught this problem I made one small change. I gave them the initial problem, to write free throw percentage as a function of shots taken. But I also offered a hint: shooting percentage = shots made / shots taken. In the past when I’ve used this problem, some groups figure out the function quickly and others get stuck because they don’t know where to start. And that might be fine if my goal was for students to learn about percentages and writing functions based on proportions. But the goal of this sequence of questions is to connect representations of rational functions. While I’d love students to be proficient at writing functions like this, it’s not a skill that comes up very often and it’s not my main focus. I’d rather help students write their function successfully, and have them spend more time trying to figure out what the different parts of the function represent in context.
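
To give a concrete picture of where that hint leads, here is the function it produces (one natural way to write it, with x counting shots taken after the first 30):

f(x) = (21 + x) / (30 + x)

With the function in hand, the interpretation questions have concrete anchors: the y-intercept is 21/30 = 0.7, her percentage before the streak starts; the horizontal asymptote is y = 1, the 100% her percentage creeps toward but never reaches; and the x-intercept (x = -21) and vertical asymptote (x = -30) fall outside the domain where the situation makes sense, which is part of what makes the discussion interesting.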

It’s a small change, but I’ve found myself making changes like this more and more often. I wrote about this idea earlier this year under the title “More Explicit.” A lot of teachers pushed back. At times there’s an orthodoxy in math teaching that struggle is good and the teacher’s job is to be less helpful. I think this lesson is a good example of what I mean when I say that my teaching has become more explicit. I don’t mean that I’m doing everything for students. But struggle is best in small doses, under the right circumstances. Too much struggle leaves students feeling dumb. I’m making a lot of small changes like this one to focus struggle on the most important parts of a problem and to focus student thinking on the most important mathematics. There are a lot of fun math tasks in the world, but some take students on long and frustrating detours. That can be fine. But it’s also fine to be a little more explicit to help more students get where they’re going.

Clever Problems

Here’s a clever problem:

I love clever problems. I spend time trying to write new clever problems to use with my students. But I’ve stopped putting clever problems on summative assessments.

Here’s why. I think there’s a healthy instinct among teachers to use clever problems on assessments. If we ask the same predictable problems we always do, students are just regurgitating formulas. Asking about concepts in novel ways requires students to apply what they know in a new context, and assesses whether they actually understand the math they’ve been learning. But it’s easy to cross the line from just clever enough to too clever. Summative assessments are stressful for students, and encountering an unfamiliar problem only increases that stress. I’m not a great judge of difficulty; plenty of problems I think will be easy are hard, and many more I think will be hard turn out to be easy. By giving students clever problems on summative assessments I introduce all kinds of subjectivity and uncertainty — and that same subjectivity and uncertainty means I’m not getting nearly as much evidence out as I would like.

Assessment questions should have two characteristics. First, a student who knows the concepts should reliably do well. Second, a student who does not know the concepts should reliably do poorly. I don’t think the question above meets either of those criteria. I’m sure there are teachers out there saying, “well that’s not a great question because of this and that.” And you’d be right, it’s not that great. But how sure can you ever be that your clever questions are actually assessing what you think they’re assessing? Every clever question I write is like a little child I bring into the world, and I struggle to see its flaws.

I’m not saying teachers shouldn’t use clever questions. Only that clever questions don’t belong on summative assessments. I use them as tools for discussion and to extend student thinking in class. I use them in no-stakes formative assessments as a window into student thinking. But these are very different contexts from an assessment at the end of a unit that has a large influence on students’ grades.

This means my summative assessments often feel boring and predictable, and get a little more boring and predictable each year. I’m happy with that. While I might like to pretend otherwise, grades matter. Students deserve not to be tricked when their grade is on the line. I do lots of weird stuff in my class, I promise. I’m not saying math class should be boring. But giving a regular old boring unit assessment isn’t a big deal. If my students were getting every question on my assessments right I might change my tune. But they’re not. I have a lot of work to do as a teacher; finding clever ways to assess students is not the best place to spend my energy.

One final note. I have had students appreciate my clever assessment questions in the past, and I’m sure other teachers have had the same experience. Some might defend their assessments because students seem to like them. But we need to ask ourselves: which students like these questions? In my experience it’s the loud students who like our classes no matter what we do. They’ll be fine. The students who don’t appreciate clever assessment questions are the ones who are less likely to speak up, who are already frustrated, and who already feel like math isn’t for them. There’s nothing wrong with writing a boring, predictable summative assessment, and boring predictable summative assessments might be just what struggling students need.

Ambiguities and Arbitrarities

I’m introducing integration to my calc students right now. It’s my sixth year teaching calculus, and I’m convinced that the notation we use for integration is terrible for beginners. I’ve only convinced myself I understand it because I have spent so much time using it. I’ve forgotten what it’s like to be new to integration, and I ignore all the confusing bits.

Seriously. What does dx mean? Really, what is it? A little bit of x? How much? How can I multiply a function by dx? Where does the dx go when I integrate? How can plugging in two numbers account for all the area between them? Wait, what is negative area? But negative area becomes positive when those numbers are in the other order? How is this the same as a Riemann sum?
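
For anyone who hasn’t looked at it in a while, all of that notation is compressing something like the limit definition

\int_a^b f(x)\, dx = \lim_{n \to \infty} \sum_{i=1}^{n} f(x_i)\, \Delta x, \qquad \Delta x = \frac{b - a}{n}

with dx standing in as the ghost of that Δx. Writing it out doesn’t answer the questions above; if anything, it shows how much the symbols are hiding.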

I understand that calculus teachers reading this post have some way of explaining what integration is and why it makes sense. I’m not looking for better explanations. I’m trying to find better ways to validate my students when they ask these questions, and when they feel dissatisfied with my answers.

Here’s the thing. The most successful students in this part of a calculus course are often the students who can avoid asking what all that notation means and just plug through the algebra. And I think that sucks. I want students to ask questions and to be curious. I want to validate that curiosity. I have a hard time validating curiosity about integration.

And all this trouble reflects the history of calculus. Ben Blum-Smith’s brief but hilarious play “Honor Your Dissatisfaction” gets at many of these themes. For centuries — yes, literally centuries — mathematicians used the machinery of calculus without really understanding why it worked. It just worked, and that was good enough.

I see this same phenomenon, though to a lesser degree, elsewhere in the math curriculum. We like to pretend sometimes that math is purely logical and is some bastion of reason and rationality. But the reality is that there are many more cracks in that artifice than we like to talk about. Notation is ambiguous. Convention is arbitrary. I paper over the inconsistencies so students can compute correct answers, and I worry that I squash curiosity for expediency.

I want to find a balance here. Math is ambiguous and arbitrary in lots of ways. Spending too much time emphasizing those ambiguities and arbitrarities risks sending a message to students that math is worthless and designed to confuse teenagers. But engaging honestly with these tensions validates the student experience. I know I can do more to help students see their struggles as a legitimate part of the learning process, rather than a reflection of their own shortcomings.

Followup: On Student Feedback

A few folks responded to my last post arguing that teachers need to balance listening to students’ feedback with pushing students outside their comfort zone. I definitely agree! In general, novices aren’t great at knowing when they are learning or what is best for their learning. I can’t listen blindly to what my students want.

But I also think it’s important to be careful here. That logic can take me to an extreme where my pedagogy is unfalsifiable. If students like it, I’m doing it right. If students don’t like it, they just don’t know better.

I also think that, on close reading, my students’ feedback is really insightful and specific. “I know that I don’t fully understand most of the topics we have touched on.” “Can you explain the process of getting to answers a bit more.” They are recognizing what they don’t understand and asking for more help, and I want to validate that as an effective learning strategy.

I framed my original piece around the idea of being “less helpful.” That’s still a value of mine in my teaching. I don’t want my students to be dependent learners who can’t use resources around them. I don’t want my students to be passive in math class. But there are lots of ways to reach those goals. I don’t have to throw out all of my teaching. There are lots of small changes I can make. I make small changes in my teaching all the time. And listening to students when they tell me something isn’t working for them is a great time to tinker a little bit.

The final point a few folks made in response to my post is the value of transparency. In teaching we often ask students to do counterintuitive things to help them learn. Learning is much more complex than students often realize. The more I can be transparent with students about why I’m asking them to do what we do in math class, the more buy-in I’m likely to have.

NCTM Virtual Conference: Beyond Consensus

I presented at the NCTM Virtual Conference last week, a session called “Beyond Consensus: Rhetoric and Reality in the Classroom.” I won’t try to recreate the whole presentation here, but I want to share one story and one idea I tried to communicate.

I often try to be “less helpful” in my teaching. It’s an idea I see advocated in different forms at conferences and in the online math-ed-world. For me, it means that I want students to be able to look to each other as resources, believe in their own ability to figure things out, and become more independent learners.

I’m teaching in person this fall, and I gave my students a short survey asking them how the fall was going and their feelings about math class. Here are two excerpts from two different students’ responses:

AHHHHHHHHHHHH. Can you explain the process of getting to answers a bit more.

I think that it would be really useful to have some times where you are the one demonstrating and doing problems on the board and not just entirely us figuring it out because I think you do very well at catering towards one way of learning but not everyone learns that way and I know that I don’t fully understand most of the topics we have touched on.

These are both insightful pieces of feedback for me. My takeaway is that I need to rethink some of my less helpful classroom structures to support these students and make sure I’m teaching the learning skills I want them to develop. It would be easy for me to blame the students here — to tell myself that they just aren’t used to this type of teaching, or they need to be willing to take more risks, or something else. But I want to hold myself accountable for listening to students and valuing their ideas in class. That means taking their feedback seriously even when it goes against some of my values as a teacher.

The idea I tried to share in my presentation is being student-centered, rather than pedagogy-centered. I try to be a student-centered teacher. By that I mean I want to value student ideas, respond to student thinking, and adjust my teaching based on how students respond. But in trying to be student-centered, sometimes I get stuck on a pedagogy that I think is student-centered — like being “less helpful” — and lose sight of the students themselves. And when I lose sight of the students, it’s a certain group of students who loses out. The quieter students, the students whose parents don’t advocate for them as loudly, the students with less social capital.

I think the way we talk about teaching can lend itself to being pedagogy-centered, rather than student-centered. Conference presentations give you an hour to get your idea across. Tweets and blogs need to be short and pithy. And conversations about the complexities of students are messy. It’s easier to advocate for a pedagogy, and leave the students out of the equation.

I want to be student-centered in my teaching. That means creating space to listen to student thinking, taking that thinking seriously, and using it to guide my teaching. I’ve often left conferences convinced that some new pedagogy is “the thing” only to have that pedagogy collide with the reality of my students and my classroom. There’s nothing wrong with sharing ideas and trying new things. I still want to be a less helpful teacher. But I want to grow my teaching as much by listening to my students as by trying to find the perfect pedagogy. To return to the title of my presentation, I worry that when teachers talk about teaching far away from students, we create a false consensus about what great teaching looks like. It’s easy to get excited about some abstract pedagogical idea. It’s much harder to make that idea work for every student in my classroom.

Types of Problems

Here are four different problems that I’ve enjoyed exploring:

A country road is 27 miles long and goes all the way around a lake, connecting the six cottages that are next to the lake. Two of the cottages are 1 mile apart (along the road). Two cottages are 2 miles apart, two are 3 miles apart, two are 4 miles apart,…,two are 25 miles apart, and two are 26 miles apart. How are the cottages distributed along the road?

Liljedahl, in Building Thinking Classrooms

A palindrome is something that is the same forwards as backwards–like mom, dad, race car, I prefer pi, et cetera. Numbers can also be palindromes–like 141, 88, 1221, et cetera. Now, consider the number 75. 75 is not a palindrome. So, reverse it and add it to itself: 75 + 57 = 132. 132 is also not a palindrome, so do it again: 132 + 231 = 363. 363 is a palindrome. So, we stop, and we say that 75 is a depth-2 palindrome (because I had to do the process twice to get to a palindrome). Find the palindrome depth of all two-digit numbers.

Liljedahl, in Building Thinking Classrooms

There is a mysterious 10-digit decimal number, abcdefghij. Each of the digits is different, and they have the following properties:

a is divisible by 1
ab is divisible by 2
abc is divisible by 3
abcd is divisible by 4
abcde is divisible by 5
abcdef is divisible by 6
abcdefg is divisible by 7
abcdefgh is divisible by 8
abcdefghi is divisible by 9
abcdefghij is divisible by 10

What’s the number?

Conway, via Quanta
[The fourth problem appeared here as an image.]

Singh

I love these problems. In each case, I can start experimenting using only my knowledge of arithmetic to understand how the problem works. They remind me of a distinction between “move problems” and “insight problems” from this paper by William Batchelder and Gregory Alexander. A move problem is clearly defined with a finite number of solution paths that can be systematically explored to find a solution, while an insight problem requires the solver to reconceptualize the problem or look at it from a new perspective in order to find a solution. Move problems lend themselves to trial and error and can typically be solved by computers, while insight problems require a uniquely human approach.
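
To make the “solvable by computers” point concrete, here is a quick brute-force sketch of the reverse-and-add process from the palindrome problem above. This is my own throwaway Python, not anything from Liljedahl, and the step cap is just a safety valve for the sketch:

def palindrome_depth(n, max_steps=100):
    """Count reverse-and-add steps until n becomes a palindrome.

    Returns 0 if n is already a palindrome, or None if no palindrome
    shows up within max_steps (a safety cap for this sketch).
    """
    depth = 0
    while str(n) != str(n)[::-1]:
        if depth >= max_steps:
            return None
        n += int(str(n)[::-1])  # reverse the digits and add them on
        depth += 1
    return depth

print(palindrome_depth(75))  # the worked example: 75 -> 132 -> 363, so depth 2

# The same trial-and-error loop grinds through every two-digit number.
depths = {n: palindrome_depth(n) for n in range(10, 100)}

None of that requires insight, which is exactly the point: a loop can do the grinding, and solving a problem this way doesn’t necessarily mean I learned anything from it.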

All four are what I would call move problems. Each can be solved using some variation on trial and error. But exploring these four problems helped me to see that the distinction between move problems and insight problems can be artificial. I’ve always tried to give students move problems rather than insight problems, because insight problems can be frustrating for students who don’t “get it” and often seem like they are designed to trick the problem solver. Move problems provide an entry point for each student to get started.

Each problem above has a different relationship with insight. One feels like a move problem while you’re working, but requires the solver to look back at the end and make several insights in order to be confident in the solution. A second can be solved like a move problem, but a few early insights make the process much more efficient. A third is unmanageable as a pure move problem, but a few insights can reduce it to a manageable size. And a fourth is (at least from my perspective) a move problem in which insight wasn’t very helpful. I’ll leave it to the reader to explore the problems and decide which is which. Have fun!

I’m not sure what the implications are for my teaching. But I love problems, and I love finding ways for students to enjoy solving problems. Better understanding this distinction seems like an important way for me to better understand the problems I give students.