Can competitions increase how much students practice math?
When it comes to increasing how much students learn, there are two fundamental paths:
- Increase the time spent learning.
- Increase the efficiency of learning (amount learned per unit time).
Just about any approach to increasing student academic growth falls into one of these two categories. In this post, we’ll examine the impact of a very specific strategy to increase the time spent learning: motivating students via competitions.
This strategy is becoming more common in education software: Duolingo (a very popular language-learning app) introduced its “Leagues” functionality several years ago and has been regularly adding features to it ever since. IXL (math-learning software) recently added a “Leaderboards” feature, which uses a similar concept.
The general idea is fairly simple: learners compete against their peers on effort to win some sort of prize (which could be material, digital, or simply recognition). Of course, the details of how any specific competition program is implemented presumably influence its impact on students. To help us better understand this impact, we’ve been running experiments to see how competitions influence students participating in our Math Agency program.
The results of our first major experiment are shown below. During our 2021 summer program, we ran competitions every other week: the students who practiced the most (1st, 2nd, and 3rd place) won tickets that could be exchanged for prizes. The results are pretty clear: students practiced an average of 1.7x more during competition weeks than during non-competition weeks.
We ran a similar experiment at a different school (Northgate Elementary) in the fall of 2021 and got similar results: in the second experiment, competitions increased practice by 1.8x.
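For readers curious about the mechanics, here is a minimal sketch (in Python) of how a comparison like this can be computed. The numbers and the aggregation choice, averaging weekly minutes per condition and taking the ratio, are illustrative assumptions rather than our actual analysis code or student data.

```python
from statistics import mean

# Illustrative data only: minutes practiced per student per week,
# split by whether a competition was running that week.
practice_minutes = {
    "student_a": {"competition": [55, 60], "non_competition": [30, 35]},
    "student_b": {"competition": [20, 25], "non_competition": [15, 10]},
    "student_c": {"competition": [40, 45], "non_competition": [25, 30]},
}

# One plausible summary: average weekly minutes across all students in each
# condition, then take the ratio. A result of 1.7 would read as
# "students practiced 1.7x more during competition weeks."
avg_competition = mean(
    mean(weeks["competition"]) for weeks in practice_minutes.values()
)
avg_non_competition = mean(
    mean(weeks["non_competition"]) for weeks in practice_minutes.values()
)
print(f"Competition-week lift: {avg_competition / avg_non_competition:.1f}x")
```

Other summaries (for example, averaging each student’s individual ratio instead of taking a ratio of averages) are equally reasonable and can give slightly different numbers.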
Of course, the average impact can hide a lot of variability. We know anecdotally that some students are very motivated by competitions, while others couldn’t care less…and some may actively dislike them. To help us understand how different students respond, we break out the Fall 2021 results by level of impact, shown below.
As usual, we need to keep in mind that we have a relatively small number of students in our sample (n=26). We can see that some students (15%) practiced far more (>30 minutes/week) during competitions, while for others the effect was much smaller. Interestingly, of the four students (15%) who didn’t show any increase in practice during competitions, three had limited attendance, making it to less than 50% of the scheduled classes.
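To make the “level of impact” breakdown concrete, here is a hedged sketch of one way to bucket students by how much more they practiced during competition weeks. The thresholds, bucket labels, and per-student numbers are illustrative assumptions, not the exact cutoffs or data behind the chart above.

```python
from collections import Counter

# Illustrative per-student changes in average weekly practice minutes
# (competition weeks minus non-competition weeks); not real student data.
practice_change = {"student_a": 35, "student_b": 22, "student_c": 8, "student_d": 0}

def impact_bucket(delta_minutes: float) -> str:
    """Assign a student to a coarse impact band based on how much more
    they practiced during competition weeks."""
    if delta_minutes > 30:
        return "large increase (>30 min/week)"
    if delta_minutes > 0:
        return "some increase"
    return "no increase"

counts = Counter(impact_bucket(d) for d in practice_change.values())
for bucket, n in counts.items():
    share = n / len(practice_change)
    print(f"{bucket}: {n} students ({share:.0%})")
```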
Finally, it is worth noting that an increase in practice of 20–30 min/week is a material improvement. Based on our previous data, an additional 20 min/week of practice corresponds to gaining an extra 1/3 of a grade level in math skills over the course of an academic year.
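As a back-of-the-envelope illustration, and assuming the relationship is roughly linear over this range (a simplification, not something our data establishes), that rule of thumb can be turned into a simple estimate:

```python
# Back-of-the-envelope extrapolation of the rule of thumb above:
# roughly 1/3 of a grade level per academic year for each extra 20 min/week
# of practice. The linearity here is an assumption, not an established law.
GRADE_LEVELS_PER_EXTRA_20_MIN_PER_YEAR = 1 / 3

def estimated_grade_level_gain(extra_minutes_per_week: float) -> float:
    """Estimated extra grade-level growth over one academic year."""
    return (extra_minutes_per_week / 20) * GRADE_LEVELS_PER_EXTRA_20_MIN_PER_YEAR

for extra in (20, 30):
    gain = estimated_grade_level_gain(extra)
    print(f"+{extra} min/week -> ~{gain:.2f} extra grade levels per year")
```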
In summary, introducing competitions into our program has materially increased practice time for a significant fraction of our students.
What’s Next?
We fully expect different students to have different types of motivations, and there is no a priori reason why we’d expect competitions to be exciting for everyone. With that in mind, we’re beginning to experiment with ways to tap into other types of student motivation. For example, we’re now collecting data on how creating shared goals can impact practice time. The more we understand how to motivate individual students, the more impact we’ll be able to have on their academic growth.
Finally, we haven’t yet discussed the other critical pathway for increasing growth: improving the efficiency of practice time. It is easy to imagine a 5th grader spending a lot of time practicing kindergarten-level skills so they can earn a prize in class, but not learning anything. :) The good news is that we’re also developing measurement tools to help us experiment with ways to improve learning efficiency…but we’ll save that for a future post!