I wrote a few days ago about Desmos and task propensity. I’m interested in critically exploring the pedagogy and sequencing behind using Desmos activities effectively. One challenge is the idea of task propensity: when presented with a conceptually oriented task, teachers and students often focus too much on the thinking that will help them solve the specific task, rather than thinking that will help them connect what they are doing to new problems in the future. If students are laser-focused on finding the solution to a problem, they are likely to lose the forest for the trees and miss opportunities to generalize their thinking.

Before I dive deeper into this idea, I want to explore the opposite perspective. Humans inevitably focus on the task at hand and push thinking about future applications to the background. This isn’t a problem unique to Desmos activities. And if it’s in some ways inevitable, we might as well make the tasks that students are focusing on as rich, engaging, and meaningful as possible.

We talked a bit during the Desmos fellows weekend about ways that technology can help and hinder learning. There are lots of ways it can hinder learning — by isolating students, by presenting new distractions, and more. There are also lots of ways it can help learning — it can give immediate feedback, and it can differentiate by providing multiple entry points and tiered challenges. But the heart of what I find most valuable in this technology is that it lets me give students richer tasks, and elicit richer thinking, than I could without it.

The phrase “rich task” is often thrown around but under-specified. I’m not sure I have a great definition, but I’d like to offer a few examples of what can make a Desmos activity a rich task.

Match My Parabola

Here is a screen from the Match My Parabola activity:

I’m not sure I can even offer an alternative that is equivalent. Here, students can experiment and instantly see what graphs their equations produce. At the same time, once they figure out how to transform the quadratic function vertically or horizontally, they practice that transformation once more in a slightly different location. That sequencing — experimentation until students find success, and then immediate practice — is pretty hard to replicate elsewhere. Desmos does it smoothly and seamlessly.
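For readers who haven’t tried the activity, the transformations students are discovering can be written out explicitly (my notation, not Desmos’s):

```latex
% Transformations of the parent parabola y = x^2
y = x^2 + k        % vertical shift: up by k (down if k < 0)
y = (x - h)^2      % horizontal shift: right by h (left if h < 0)
y = (x - h)^2 + k  % combined: the vertex moves from (0, 0) to (h, k)
```

The activity never hands students these rules; they emerge from the experimentation.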

Game, Set, Flat

Here is a screen from the Game, Set, Flat activity:

Students have been introduced to the challenge: they need to be able to tell good tennis balls from bad tennis balls. They just saw a few examples of balls bouncing, with varying levels of bounciness, to help illustrate the difference. Before students work formally with equations, they have a chance to get a firmer grasp on the principles involved by choosing how high a ball bounces after each bounce. When they click the button, they see their model animated. It animates whether they create a close approximation of a bouncing ball or something silly and unreasonable. But this intermediate step helps students to visualize the problem and sets them up to make connections between equations and the objects they represent. Most importantly, this is something that is impossible to do without digital technology; this whole step is skipped in a pencil-and-paper lesson, and students miss out on the chance to do this thinking.
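The mathematics underneath, as I understand it (a sketch with made-up rebound ratios, not the activity’s actual model): a ball with consistent bounciness rebounds to a fixed fraction of its previous peak height, so the peak heights form a geometric sequence.

```python
# Hedged sketch: successive bounce heights for a ball with a constant
# rebound ratio r form a geometric sequence h, h*r, h*r^2, ...

def bounce_heights(initial_height, rebound_ratio, num_bounces):
    """Return the peak height after each of num_bounces bounces."""
    heights = []
    h = initial_height
    for _ in range(num_bounces):
        h *= rebound_ratio
        heights.append(h)
    return heights

# A "good" ball might rebound to roughly half its drop height; a "flat"
# ball noticeably less. (Both ratios here are illustrative guesses.)
good = bounce_heights(100, 0.5, 4)
flat = bounce_heights(100, 0.3, 4)
print(good)  # heights decay geometrically
print(flat)  # flatter ball: faster decay
```

Choosing bounce heights by hand in the activity amounts to students intuiting this decay before they ever see an exponential equation.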

Burning Daylight

Here are two consecutive screens in the activity Burning Daylight:

It’s easy to stay focused on the abstractions and equations when engaging with mathematical modeling. The first question would, in many paper-and-pencil lessons, be one that students rush past on the way to writing a quick equation and moving on. In this case, the medium allows immediate feedback contextualizing the student’s thinking on the previous screen and reinforcing a connection between model and world that might otherwise be lost.

Summary

These are great tasks. And they’re great tasks because they have the potential to create rich thinking for students, in ways that aren’t possible in other formats.

One challenge that I’m interested in exploring is this idea of task propensity: to what extent do students, while working on these activities, focus on the tasks themselves without stepping back to consider how they can use what they’ve learned in new contexts in the future? That said, even if students have a hard time slowing down to make the connections I would like them to make, the thinking that they need to do to complete these tasks is richer, deeper, and more varied than the best replacements I could offer.

I want to offer this mostly as a check for myself. If I want to take a critical perspective on Desmos activities, I want to make sure I’m clear on exactly what they offer, what they are being compared to, and what my alternatives are.

This task propensity entices teachers and textbook authors to capitalize on procedures that can quickly generate correct answers, instead of investing in the underlying mathematics while accepting that fluency may come later.

(source)

The article linked above is a thought-provoking perspective on why some conceptually-focused math reforms have been unsuccessful. The authors explore the idea of task propensity, or the tendency of teachers and curriculum writers to focus on features of specific tasks rather than the underlying mathematics that may be used in new tasks in the future. Teachers may have great, conceptually oriented tasks that can elicit mathematical thinking, yet if they only focus on teaching students how to solve those specific tasks, that thinking is unlikely to transfer to new problems down the road.

I’m hanging out with some great folks at the Desmos fellows weekend, and I’d like to share two contrasting cases:

Case 1
We spent some time yesterday mingling and doing math together. I spent much of it working on this problem from Play With Your Math with a great group of teachers.

I won’t spoil it; this is absolutely worth exploring, and after what was probably an hour of work I have plenty more to learn. The most important feature of my learning was that, in a relatively short period of time, the group I was working with established the answer to the question as it was posed. We then went further, and explored different conjectures and directions to extend the problem. The vast majority of our learning came after we had solved the problem, and depended on our interest in creating new problems to further our thinking. In other words, we avoided the temptation of task propensity to fixate on the problem at the expense of additional learning.

Case 2
I have really enjoyed both playing and watching students play Marbleslides lessons like this one. Students have to transform various functions in order for the marbles to get every star when they are launched.

This is one of my students’ favorite things to do in class, and is far more engaging for them than any other lesson I have on rational functions. At the same time, I find that students often learn less than I would like from the activity. They spend most of their time focused on the task at hand — getting all of the stars — and less on what I want them to learn — general rules for transforming rational functions. This is not to say that no learning happens, just that students can fall victim to task propensity and lose the forest for the trees.

I am looking forward to my Desmos fellowship and what I will learn from a great group of teachers and the stellar folks at Desmos. One of the important questions I have is around when Desmos is the appropriate tool to use, and when other tools will work just as well or better. One challenge I have with many activities is task propensity: while Desmos is a powerful tool for generalizing thinking, that generalization does not happen if students are too focused on the specific features of a task to make connections to broader mathematical ideas. I hope to do some writing over the next few months to explore this idea and try to better understand when Desmos is the right tool, and how to use it effectively.

Understanding Abstractions

This doesn’t feel true about mathematics. Much of the math I teach I would enjoy going down a similar rabbit hole with students, though it hopefully wouldn’t take as long.

But this comic also made me think about calculus. There are plenty of gaps in my calculus understanding — I’m not sure I could prove the product rule without some significant help, for instance. I’ve worked through proofs of Lagrange error before but I’m a long way from really understanding how that whole thing works. Not to mention the Fundamental Theorem of Calculus, which I can use pretty fluently yet don’t particularly understand why it’s true.

Maybe this is a reminder to deepen my own content knowledge. At the same time, my instinct is that there are times when it’s appropriate for a tool to remain an abstraction. I would like to verify that abstractions work — for instance, use Desmos to verify that a few product rule applications do, in fact, produce appropriate derivatives. I wonder if I could come up with criteria for when an abstraction is far more useful than understanding why that abstraction is mathematically correct.
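As a sketch of what that kind of verification could look like outside of Desmos (my own illustrative check, not anything from the activities above): compare a numerical derivative of f(x)g(x) against f'(x)g(x) + f(x)g'(x) at a few sample points.

```python
import math

def numerical_derivative(func, x, h=1e-6):
    # Central-difference approximation to func'(x)
    return (func(x + h) - func(x - h)) / (2 * h)

# Sample pair with known derivatives: f(x) = sin(x), g(x) = e^x
f, f_prime = math.sin, math.cos
g, g_prime = math.exp, math.exp

def product(x):
    return f(x) * g(x)

for x in [0.0, 0.5, 1.3]:
    lhs = numerical_derivative(product, x)       # d/dx [f(x) g(x)], computed numerically
    rhs = f_prime(x) * g(x) + f(x) * g_prime(x)  # what the product rule predicts
    print(f"x = {x}: |difference| = {abs(lhs - rhs):.2e}")
```

This doesn’t prove the product rule, of course; it only confirms that the abstraction behaves as claimed at a few points, which is exactly the kind of verification I have in mind.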

On Teaching Collaboration

I’ve seen the phrase “The four Cs” thrown around more and more recently — critical thinking, communication, collaboration and creativity. 21st century skills, etc etc. I’m going to zoom in here on collaboration.

I want my students to be more thoughtful and effective collaborators when they leave my class. What I know less about is how to structure experiences that will teach students to do so.

Students should collaborate, sure. I find purposeful partner and group work to help students better learn math. But does it also help teach them to collaborate?

I find it interesting that teachers, who are seemingly charged with teaching collaboration, work in a profession where, in many schools, there is little sustained collaboration between colleagues. In reflecting on my experiences working in groups, it seems I have learned much less than I would like to think about collaboration as a general, all-purpose skill, and much more about collaborating with specific people in a specific context.

This person has great ideas, but when I ask him to write something up it will take two weeks and three reminders; I should just do it myself. This person is great at taking the student perspective and thinking through how a decision will impact students’ experience; make space for her to share before we make any decisions. This person gives excellent, honest feedback; even when it stings, I know that it comes from the right place and is on the mark. Those are the types of lessons I think I’ve learned from my experiences working collaboratively.

I’m sure I’ve learned broader skills of collaboration along the way. My point is just that my practice of putting students in groups “because they need to learn how to collaborate” is probably insufficient to meet the goal. Seems likely that humans learn collaboration like any other skill: practice and reflection. Plenty of practice, spaced over time, and reflection that is mindful of how lessons learned may apply in new contexts in the future.

Now to figure out how to do that.

Doing Math This Summer

Summer is here (apologies to those final folks who are still in school). Every summer I try to spend some time doing math, focused on challenging myself and learning in ways that will support my teaching. I’ve got four different ways I’m doing that this summer. No big commitments for me, but instead a few different avenues to explore and learn when I have the time and inclination.

Exeter Problem Sets
The Exeter curriculum is online and free. It starts with the upper-middle school math that leads into Algebra I, and continues through calculus while exploring some lovely math along the way. Everything is problem-based, and the curriculum is really just problem sets that build high school mathematics piece by piece. Every time I have dug into the Exeter sets I have learned new math, made new connections, filed away different problems for future use, or had some inspiration about how to better sequence and teach the ideas in my curriculum.

Park City Math Institute Problem Sets
The PCMI problem sets are similar, but are designed for math educators. They require little prior knowledge to dive into and explore some fascinating math topics, while also providing a great opportunity to play with ideas, make connections, and discover. This summer’s sets are being posted one day at a time on this website, and prior years can be found on the same site.

Brilliant 100 Day Challenge
I had never heard of the Brilliant website before, but they are posting one challenging problem a day over the summer for 100 days. The problems are excellent, and I’ve had fun exploring some other ideas on their site as well.

GDay Math
James Tanton’s GDay Math site has a few different courses to work through. I have previously explored Exploding Dots and the quadratics course; in both cases I learned far more than I expected about topics I thought I already knew. He also offers courses on fractions, combinations and permutations, and area.

Happy mathing!

Reflecting On Writing

The school year has ended and I’m on to summer vacation. I’m working on a few projects, some related to teaching and some not. I’m also thinking about next year — my priorities, my goals, and my commitments.

In that reflection, I realized that one commitment I haven’t even considered ending was this blog. Writing has become a central part of my identity as a teacher. I think things through in writing. I encounter a challenge in the classroom and start thinking about how I can write about it. I set goals for what I want to learn, and writing about that process holds me accountable and helps me cut through to the essential takeaways.

At the same time, I’ve built a written record of how my ideas have evolved over time. I’ve become someone who can sit down and write when I need to — I no longer put off writing tasks as long as possible. Writing has opened doors and created relationships in my professional life that I never thought would be possible.

Using Feedback From Student Surveys

Since grad school, I’ve been using the same set of student survey questions to assess my teaching. The questions are linked here, though I have since moved them into a Google Form.

These questions are drawn from the Measures of Effective Teaching project, funded by the Gates Foundation. They write in their preliminary report about the design of the student survey:

The goal is not to conduct a popularity contest for teachers. Rather, students are asked to give feedback on specific aspects of a teacher’s practice, so that teachers can improve their use of class time, the quality of the comments they give on homework, their pedagogical practices, or their relationships with their students.

I think this is a great premise. If I ask students whether they liked my class or how good my teaching is, their answers are likely to be heavily influenced by whether or not they like me and how they are feeling that day. If I ask more specific questions about my instruction, I’m more likely to get useful and objective information about my teaching.

The survey above uses a subset of questions from the original study, adjusted to use Likert scales. I’ve stuck with them because they are the questions I used first and I’ve found it helpful to gather comparative data over time.

It’s often a bit of a blow to my ego to hear what students have to say. At the same time, comparing different groups of students has led to valuable insights. For instance, at my current school, I initially did poorly on the question, “Our class stays busy and doesn’t waste time.” I got a 3 from my first cohort, which is an average of “Sometimes”. I then bumped up to a 3.4, then a 3.7, then a 3.9, which is nearly an average of “Usually”. This slow but measurable improvement in one aspect of my classroom management has been gratifying. This is not to say that one survey question should define my teaching. There’s no reason I should expect a perfect score on that question, and I could do well on it and still not do much to support student learning. But I haven’t changed the fundamentals of my pedagogy in that time, and my students’ perception is that I have used their time more purposefully. For another perspective on the relationship between using student time effectively and classroom management, check out Matt Vaudrey’s thoughts here.

A frustration from the survey has been the question, “How clearly does this teacher explain things?” I have barely been able to budge this one, and after my improvements on two classroom management questions it has hovered at the bottom for my last two cohorts, somewhere just below “Usually”. Not that this is a terrible result, or that explaining things clearly is the only thing that matters in my teaching, but I think it does matter, and I would love to find ways to practice that skill and improve students’ perception that I explain things clearly.

I think that this survey is useful, but it also has limitations. I think it could be complemented effectively by some open-response questions that ask students to talk about one area I’m doing particularly poorly in, and to solicit broader feedback than is possible using Likert scales. A survey is just one way of assessing my strengths and areas for improvement. But it only takes a few minutes to have students fill out a Google Form, and I’m most happy that I’ve stuck with the same questions over time so that I can compare cohorts and try to measure my own improvement.