How effective can we make a summer math program?

Mike Preiner
6 min read · Aug 11, 2021


From an outsider's perspective, the summer months (equivalent to ~1/3 of the academic year) seem like a huge opportunity to close educational gaps for at-risk students. The research seems to bear this out: national assessment data shows that elementary school students lose, on average, over 2 months' worth of math skills over the summer. The true cost is larger than that: if we assume students could instead have been gaining 2–3 months of skills over that time, the real cost is 4–5 months' worth of learning.
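
To make that arithmetic explicit, here is a quick sketch; the 2 months of observed loss and 2–3 months of foregone gains are the figures cited above, and the calculation is just adding them together.

```python
# Back-of-the-envelope estimate of the true cost of summer learning loss.
# The inputs are the figures cited above; the calculation is just addition.
observed_loss_months = 2.0          # average math skills lost over the summer
foregone_gain_months = (2.0, 3.0)   # skills students could have gained instead

true_cost_months = tuple(observed_loss_months + gain for gain in foregone_gain_months)
print(f"Real cost of an idle summer: {true_cost_months[0]:.0f}-{true_cost_months[1]:.0f} months of learning")
# -> Real cost of an idle summer: 4-5 months of learning
```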

We wanted to see if we could flip the script on summer learning loss, and so we ran a summer version of our spring math program. We made a few changes: we upgraded our tutor training, simplified the curriculum, and added a few new components that we’ll discuss below. We also wanted to answer a few key questions:

1. Could we keep students at Lowell (a high-poverty, diverse school with a large number of homeless students) engaged and excited through the summer?

2. Could we increase family engagement with timely, personalized information about their students?

3. If the answers to 1 and 2 were "yes", how much academic progress could we make?

A final bit of context: the summer program had ~20 students from grades 2–5. All of the students had been identified as needing additional math support and all were significantly behind grade level.

Can summer math be engaging, rewarding, and *fun* for students?

The short answer appears to be yes. Our summer math program was completely voluntary (and remote!), but nevertheless, we had attendance numbers similar to or higher than what we saw during the school year. After accounting for schedule conflicts (for example, some students had competing summer programs for a portion of the summer), we had an attendance rate of 79%.
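
For the curious, here is a rough sketch of how an attendance rate like that can be computed once known schedule conflicts are excluded; the exact bookkeeping and the numbers below are illustrative assumptions, not the program's actual records.

```python
# Hypothetical attendance bookkeeping: exclude sessions a student could not
# attend because of a known scheduling conflict, then compute the rate.
def attendance_rate(attended: int, scheduled: int, excused_conflicts: int) -> float:
    """Share of attendable sessions the student actually attended."""
    attendable = scheduled - excused_conflicts
    return attended / attendable if attendable else 0.0

# Illustrative numbers only (not real program data).
print(f"{attendance_rate(attended=19, scheduled=27, excused_conflicts=3):.0%}")  # -> 79%
```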

In terms of excitement and fun, we learned a lot about how competition can increase student motivation. We incorporated weeklong “sprints” into our program: during a sprint, students competed to see who could practice the most; prizes were awarded as part of the fun. The results were clear: for many students, competition is a big lever to influence their motivation to practice. On average, students practiced ~65% more during competition weeks. The effects can be clearly seen in the chart below.

Average practice time per student over the course of our summer program. It is worth noting that students practiced a lot: over 60 minutes per week on average! We’ve normalized the data from Week 8 to account for the fact that it was a short week (only 4 days).
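For readers who want to see the bookkeeping behind a chart like that, here is a minimal sketch of the two adjustments involved: scaling the 4-day Week 8 up to a 5-day equivalent, and comparing average practice time in sprint weeks against non-sprint weeks. The weekly minutes and the choice of which weeks were sprints are invented placeholders, not our real data.

```python
# Sketch of the two adjustments described above (illustrative numbers only).
from statistics import mean

# Hypothetical average practice minutes per student, per week.
weekly_minutes = {1: 55, 2: 95, 3: 58, 4: 100, 5: 60, 6: 102, 7: 62, 8: 52}
sprint_weeks = {2, 4, 6}      # weeks with a competition "sprint"
short_weeks = {8: 4}          # Week 8 only had 4 school days

# Normalize short weeks up to a 5-day equivalent.
normalized = {
    week: minutes * (5 / short_weeks.get(week, 5))
    for week, minutes in weekly_minutes.items()
}

sprint_avg = mean(normalized[w] for w in normalized if w in sprint_weeks)
regular_avg = mean(normalized[w] for w in normalized if w not in sprint_weeks)
print(f"Practice uplift during sprints: {sprint_avg / regular_avg - 1:.0%}")  # -> 65%
```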

This brings us to the next item in our toolbox for more effective learning…

Can we increase student learning via family engagement?

This summer we tested out a new piece of our program: we sent parents weekly, personalized updates about their students. These updates combined hand-written notes from our tutors with automated data on very specific math skills that each student was working on.
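
Conceptually, each update merges two sources: a free-form note from the tutor and automatically generated data on the specific skills the student practiced. Here is a hypothetical sketch of that assembly step; the data model, names, and wording are invented for illustration and aren't our actual implementation.

```python
# Hypothetical sketch of assembling a weekly family update; the fields and
# wording are invented for illustration, not taken from the actual program.
from dataclasses import dataclass

@dataclass
class SkillUpdate:
    skill: str          # e.g. "two-digit subtraction with regrouping"
    problems_done: int
    accuracy: float     # fraction correct

def build_update(student: str, tutor_note: str, skills: list[SkillUpdate]) -> str:
    lines = [f"Weekly update for {student}", "", tutor_note, "", "Skills practiced this week:"]
    for s in skills:
        lines.append(f"  - {s.skill}: {s.problems_done} problems, {s.accuracy:.0%} correct")
    return "\n".join(lines)

print(build_update(
    "Alex",
    "Alex stayed focused all week and asked great questions about borrowing.",
    [SkillUpdate("two-digit subtraction with regrouping", 24, 0.83)],
))
```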

The results were clear: over the course of the program, over 80% of our emails were read by parents. For anyone familiar with typical open rates for email campaigns, this is an impressive number: our parents are very interested in getting information about their students' progress!

Email open rate (emails were sent to 1–2 family members per student) over the course of the summer math program.

In terms of connecting this part of our program to improved outcomes for our students, we aren't yet in a place to make strong quantitative conclusions, but we have some pretty telling individual stories. For example, in addition to a large number of grateful replies from parents, one week we received a note from an upset parent. She was dismayed to see her son was currently working on subtraction when she thought he should be working on division. She didn't want him doing "remedial" work and was considering removing him from the program.

However, we were able to show her exactly which subtraction problems her son had gotten wrong that week, and how that influenced his personalized lesson plan. This convinced her that we really understood her son's skill level. After the following week's email update, she sent us a video of herself practicing subtraction with him! Even better, several weeks later he had moved on from subtraction and was making clear progress on multiplication and division. This sort of response gives me a fair amount of confidence that strong family engagement offers a path to improving student learning.

This naturally raises the fundamental question for our summer program…

What was the impact on student learning?

Our summer program focused on two specific areas of math: Numbers and Operations, and Algebraic Thinking. Essentially, this means a focus on addition, subtraction, multiplication, and division. As we demonstrated with our 2020–21 spring cohort, we've built a pretty robust set of student growth measurements into our program, and we could clearly see the results of our focused summer effort. The results (broken out by topic) are shown below. In addition to our focus areas, we see growth across almost every skill area. This is likely due to the fact that when practicing independently, students often chose to work on skills outside of Numbers and Operations and Algebraic Thinking.

Average student skill level over the course of the summer program.
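
As a rough illustration of how a chart like the one above can be produced, here is a minimal sketch that averages per-student skill estimates (expressed as grade-level equivalents) by topic at each measurement point; the records and numbers are invented, not our actual pipeline.

```python
# Minimal sketch: average grade-level-equivalent skill by topic for each week.
# Records and numbers are invented for illustration.
from collections import defaultdict
from statistics import mean

# (week, student, topic, grade-level-equivalent skill estimate)
records = [
    (1, "s1", "Numbers and Operations", 1.8), (1, "s2", "Numbers and Operations", 2.1),
    (1, "s1", "Algebraic Thinking", 1.6),     (1, "s2", "Algebraic Thinking", 1.9),
    (8, "s1", "Numbers and Operations", 2.1), (8, "s2", "Numbers and Operations", 2.4),
    (8, "s1", "Algebraic Thinking", 1.9),     (8, "s2", "Algebraic Thinking", 2.2),
]

by_week_topic = defaultdict(list)
for week, _student, topic, level in records:
    by_week_topic[(week, topic)].append(level)

for (week, topic), levels in sorted(by_week_topic.items()):
    print(f"Week {week}, {topic}: {mean(levels):.2f} grade-level equivalent")
```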

Over the course of the 8-week program, students averaged a skill gain of over 0.26 grade levels. To better understand the impact of the program on our students, we performed an additional comparison: we compared the Spring 2021 and Fall 2021 scores of our students to those of a group of similar students who didn't participate in our program. The results are shown below. Participating students gained over 0.4 grade levels overall (we know that many students continued practicing after our program ended), while non-participating students experienced learning loss. The students in our program had previously averaged 0.7 grade levels/year of growth, which means that over the course of our summer program their skill growth was equivalent to over 1/2 of their typical academic year. That sure beats learning loss!

Math growth between the end of Spring 2021 and the beginning of Fall 2021 for both participating and non-participating students. Both groups of students were academically similar before the program.
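
To spell out the comparison, here is the back-of-the-envelope arithmetic; the only inputs are the figures already cited above (a 0.4 grade-level summer gain against a typical 0.7 grade levels of growth per academic year).

```python
# Back-of-the-envelope: how much of a typical school year's growth did the
# summer gain represent? Inputs are the figures cited above.
summer_gain_grade_levels = 0.4   # participants, Spring 2021 -> Fall 2021
typical_yearly_growth = 0.7      # these students' historical growth per academic year

fraction_of_school_year = summer_gain_grade_levels / typical_yearly_growth
print(f"Summer growth = {fraction_of_school_year:.0%} of a typical academic year")
# -> roughly 57%, i.e. more than half a school year's worth of growth
```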

Given all we’ve learned, what are the takeaways?

Our spring program highlighted how much tutor-to-tutor variability affected student growth. This summer our tutors were high school students, definitely on the "untrained" side of the spectrum! Our summer results show that with the right training and structure, it is possible to be very successful with a wide range of tutors.

The initial results with family engagement are also very encouraging. We’ve only scratched the surface, and I’m confident that we’ll soon be connecting increased family engagement to improved student learning.

We saw very promising growth numbers this summer, but it is clear that we still have a lot of room to improve our program. I won't go into the gory details here, but we have clear pathways to improve almost every aspect of our program, which is one reason why I'm super excited for our next cohort of students this fall.

Finally, and most importantly, we've now shown that it is possible to both a) dramatically boost students' growth rates during the school year and b) keep a high growth rate through the summer, a time when students usually regress. All of this happened while operating fully remotely and online. The data makes it clear that the tools already exist to fully close the math gaps for students at even the most disadvantaged schools, if schools are willing to invest in using them.
