Can we measure the efficiency of math practice?

Mike Preiner
4 min read · Aug 19, 2022


We recently wrapped up our 2021–22 school year — you can read the summary here. One of the items the summary called out for future discussion was what drove differences in student growth rates: in other words, what causes some students to learn faster than others?

From a very practical standpoint, there are only two fundamental ways to increase how quickly students learn:

  1. Increase how much time they spend learning (quantity)
  2. Increase how efficiently they spend their time learning (quality)

I want to highlight that quantity in this context is not necessarily the type of homework that will determine your grade in a class. Instead, we’re talking about practice and learning: the stuff that you do to get really good at something, whether it is soccer, playing the violin, or doing math. If you don’t spend time practicing these skills, you’ll never become good, much less great, at them. And of course, how you spend your time matters! As any parent or coach knows, kids can be pretty inefficient with their time. :) With that in mind, let’s look at what drives how students learn math.

To understand the relationship between quantity and quality in our program, we’ve found it useful to plot student growth rates versus the amount of time spent actively practicing. Our data for the 2021–22 school year is shown below. Note that we did a similar analysis in 2020–21 with a much smaller amount of data; the results were quite similar.

What do we see?

Annualized growth rate (assuming a 9-month school year) vs. practice for students in three different schools. Practice time is defined as time spent actively using digital learning tools and is discussed further below. The asterisks denote two students from School 2 discussed in the text.

The first thing to note is the relationship between practice and growth rate: the more students practice, the more they learn. A simple linear model (the dashed line) shows that on average, an extra 60 minutes of practice per week corresponds to an extra 1.6 grades of growth over the course of a school year. I’d put this relationship into the “seems obvious but is still important to measure” category.
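If you want to play with this kind of fit on your own data, here is a minimal sketch in Python. The per-student numbers are made up purely for illustration (chosen to roughly match the slope and intercept quoted above); they are not our actual data, and the variable names are just placeholders.

```python
# Minimal sketch of a linear fit of growth rate vs. practice time.
# The data points below are hypothetical, not our actual student data.
import numpy as np

practice_min_per_week = np.array([10, 30, 45, 60, 75, 90, 120])
growth_grades_per_year = np.array([0.9, 1.6, 1.8, 2.4, 2.6, 3.2, 3.8])

# Ordinary least-squares fit: growth ~ slope * practice + intercept
slope, intercept = np.polyfit(practice_min_per_week, growth_grades_per_year, deg=1)

print(f"extra growth per extra hour of weekly practice: {60 * slope:.1f} grades/year")
print(f"baseline growth with no extra practice:         {intercept:.1f} grades/year")
```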

Next, our simple model still shows a growth rate of ~0.7 grades/year even for students who essentially didn’t do any extra practice at all: these are generally students who had very low attendance, or weren’t particularly engaged in the program when they did show up. This baseline growth rate is pretty similar to what we observed last year, and is almost identical to the average historical growth rate of our students (before they entered our program). In other words: “business as usual”. It is encouraging that these two independent estimates of business-as-usual growth rates give such similar results.

Finally, we can use this analysis to start to learn about the efficiency of learning. Some students learned much faster than others, even with the same amount of practice time. For example, the students in School 3 were all above average (the fit line) in terms of how much they learned given how much they practiced. We suspect this was due to a combination of two factors: the school had a lower student/coach ratio, and its selection process favored more motivated students (they had to return a permission slip).
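One simple way to put a number on this kind of per-student efficiency is to look at residuals from the fit line: students above the line learned more than the model predicts for their practice time, students below it learned less. A rough, self-contained sketch, again using hypothetical numbers rather than our actual data:

```python
# Sketch of ranking students by "learning efficiency": the residual growth
# above or below the fitted line for a given amount of practice.
# All numbers are hypothetical.
import numpy as np

practice_min_per_week = np.array([10, 30, 45, 60, 75, 90, 120])
growth_grades_per_year = np.array([0.9, 1.6, 1.8, 2.4, 2.6, 3.2, 3.8])

slope, intercept = np.polyfit(practice_min_per_week, growth_grades_per_year, deg=1)
residuals = growth_grades_per_year - (slope * practice_min_per_week + intercept)

# Positive residual: more growth than expected for that amount of practice.
most_to_least_efficient = np.argsort(residuals)[::-1]
print("students ordered from most to least efficient:", most_to_least_efficient)
print("residuals (grades/year):", residuals.round(2))
```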

Similarly, when we dug into a couple of the less efficient students at School 2 (marked with asterisks), we were able to identify specific inefficient practices. One of the students had a tendency to practice on material that was too easy for her: she didn’t like working out of her comfort zone. The other student had a tendency to practice material that was too hard for her: she wanted to work on stuff at her grade level, even though she wasn’t ready for it yet. Both cases led to inefficient learning.

Most importantly, this type of analysis has shaped our thinking on how to increase the effectiveness of our program in the future. We’re actively experimenting with a few ways to improve the learning efficiency. These include:

  • Identify low-efficiency behaviors earlier. We’re creating weekly reports that can help us catch when students are working on material that is either too easy or too hard (see the sketch after this list).
  • Teach students skills to improve their efficiency. For example, teaching them to learn from videos or ask a coach when stuck. We don’t want them wasting time spinning their wheels.
  • Improve our understanding of efficiency. It is clear that there are several different types of efficiency that can be relevant. For example, we think there are cases where our quantity variable should be direct dosage: the time spent “in class” in our program. In other cases, we may want the definition of quantity to be the total time spent practicing (both in class and at home).
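As a concrete illustration of the first item in the list, a weekly report could flag students whose recent practice accuracy is suspiciously high (the material is probably too easy) or suspiciously low (probably too hard). The thresholds and names below are hypothetical, not the rules we actually use; they are just a sketch of the idea.

```python
# Hypothetical weekly check: flag practice that looks too easy or too hard.
# Thresholds are illustrative; the right cutoffs depend on the curriculum.
TOO_EASY_ACCURACY = 0.95
TOO_HARD_ACCURACY = 0.50

def flag_practice_efficiency(weekly_accuracy_by_student: dict[str, float]) -> dict[str, str]:
    """Map each student to a flag based on last week's practice accuracy."""
    flags = {}
    for student, accuracy in weekly_accuracy_by_student.items():
        if accuracy >= TOO_EASY_ACCURACY:
            flags[student] = "material may be too easy"
        elif accuracy <= TOO_HARD_ACCURACY:
            flags[student] = "material may be too hard"
        else:
            flags[student] = "looks reasonable"
    return flags

print(flag_practice_efficiency({"A": 0.98, "B": 0.45, "C": 0.80}))
```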

Of course, we’re also looking at ways to improve the quantity of practice. We got a good reminder of this opportunity when a student recently said: “Instead of 4 hours of video games this weekend, I’m going to practice math!”

To summarize: we’re now beginning to use data to separate out the two main components of faster learning: quantity and quality. This is letting us experiment with specific tactics to improve our program in both dimensions…something we’re very excited about!


Mike Preiner

PhD in Applied Physics from Stanford. Data scientist and entrepreneur. Working to close education gaps in public schools.