Attempting to close educational gaps: an initial estimate of program impact.

Mike Preiner
Mar 11, 2021

2021-03-23: I’ve updated our estimates of growth rates based on our latest data, which is shown in more detail here.

A quick recap:

  • We previously described our experiment to close math gaps at Lowell Elementary in downtown Seattle.
  • Our last post described the system we’ve built to measure our progress, and I showed what results look like at the individual student level.

In this post I’d like to talk about some early estimates of our program’s impact. As a reminder, our student cohort consists of ten 4th and 5th graders who began our program in late October.

For the folks who just want the summary: our initial results show that our students historically averaged about 0.6 grade levels of growth per year. Since starting our program, their average math growth rate is currently ~2.0 grade levels per year. Now for the details!

The bad news: these kids are way behind in math.

We now have much more detailed data that supports my early (six week) assessment: the students in our program are significantly behind where they should be. More specifically, the 4th graders are an average of 1.6 grade levels behind, and the 5th graders are 2.5 grade levels behind.

How did they get there? Well, we know that on average they must be learning less than one grade level of math each year. Based on our first assessment, the students learned an average of 0.62 grade levels of math each year across their entire elementary education. At this rate, their math skills will be at an 8th grade level when they graduate high school.
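To make the arithmetic behind that projection explicit, here is a minimal sketch (my own illustration, not the actual model behind the post): with 13 years of schooling from kindergarten through 12th grade, a constant growth rate of 0.62 grade levels per year lands a student at roughly an 8th grade math level at graduation.

```python
def projected_level(growth_rate, years_of_school=13):
    """Grade-level math skill at the end of `years_of_school` years,
    assuming a constant growth rate in grade levels per year."""
    return growth_rate * years_of_school

# 0.62 grade levels/year over a K-12 education:
print(projected_level(0.62))  # ~8.1, i.e. an 8th grade math level at graduation
```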

Plot of average diagnosed math grade level versus actual student grade level. The 1:1 line is shown in gray. We can see that both the 4th and 5th graders are significantly below grade level, and the gaps get worse over time.

Those numbers are pretty disheartening, but they match our previous work showing math gaps tend to increase over time. They’re also in line with what we’d expect based on state assessment data.

The good news: we’ve drastically reversed the trend.

We’ve been assessing our students pretty regularly since late January. At this point we’re starting to have enough data to make a reasonable estimate of their current growth rates. The news is good: the average growth rate across our entire cohort is now ~2.0 grade levels per year.

There are a couple of ways to put this number into context. The first is to note that we are well above 1.0; this means they are now catching up instead of falling behind. To show this more clearly, we’ll start by plotting the students’ previous trajectory of 0.62 grade levels per year: this is shown in the chart below as the dashed gray line. Now let’s look at what happens if we maintain our new growth rate: the orange line in the chart shows the projection for the 4th grade students. We can see that in roughly two years they’ll have completely caught up, and after that they’ll be above grade level.
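The catch-up time in that projection follows from a simple observation: grade-level expectations advance 1.0 grade levels per year, so the gap shrinks by (growth rate − 1.0) levels per year. A quick sketch of that calculation (again my own illustration, using the gap figures from above):

```python
def years_to_catch_up(gap, growth_rate):
    """Years until a skill gap closes, assuming a constant growth rate.
    Expectations advance 1.0 grade levels/year, so the gap shrinks
    by (growth_rate - 1.0) levels per year."""
    if growth_rate <= 1.0:
        raise ValueError("The gap never closes at this growth rate.")
    return gap / (growth_rate - 1.0)

print(years_to_catch_up(1.6, 2.0))  # 1.6 years for the 4th graders
print(years_to_catch_up(2.5, 2.0))  # 2.5 years for the 5th graders
```

This is also why being "well above 1.0" is the key threshold: at exactly 1.0 the students hold steady relative to grade level, and below it the gap keeps widening.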

Plot of average diagnosed math grade level versus actual student grade level. We’ve also included the students’ historical trajectory (dashed gray line), and a projection of their future trajectory if they maintain their new growth rate (orange line).

Of course, it’s reasonable to ask what happens if we start our program earlier, when there is less of a gap and we can influence the students over a longer period of time. We’re currently testing this idea with 2nd graders! Based on what we’re seeing so far, I suspect we’ll find a similar impact on trajectory.

These initial results are pretty encouraging, but there are a fair number of assumptions and caveats that I’ll lay out in more detail. Most of these caveats will be cautionary. However, there is one really positive piece I’ll call out first: our initial growth number is for our entire cohort. In other words, we’re counting every single student that began the trial, and we’ve had zero students leave the cohort. This means we don’t have any issues with survivorship bias. I also think it means we have a pretty engaging program, even in a remote learning environment.

Important notes and caveats:

  • My biggest concern is that we are looking at a small (n=10) group of students and we’re still pretty early in our measurement cycle. I’ll do a deeper analysis of the noise level of our measurements in future posts, but it’s safe to assume our estimate of our students’ growth rate will change over time.
  • To calculate an annual growth rate, I’ve taken our monthly growth rate and assumed a 12-month year, as our program should translate really well to Lowell’s summer school. If we assume a 9-month year we get a growth rate of 1.5 grade levels per year, which is still enough to be closing the gap, albeit more slowly.
  • I’m comparing students to their historical averages. Obviously this isn’t an average year! It seems likely that remote learning started these kids with an even lower than normal growth rate this year. On the other hand, I know they have some great teachers and interventionists this year. To get a better understanding of these effects we’d need to compare the growth rates of our cohort to other students at Lowell this year, which may be possible later in the year with more data.
  • We are making assumptions about future progress. Is it possible to maintain high growth rates like this? I’m pretty confident the answer is yes, but this is another topic I’ll address in a future post.
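The annualization in the second caveat is just a rescaling of the same monthly rate; a quick sketch (my own illustration of the stated numbers) shows how the 12-month and 9-month assumptions produce the 2.0 and 1.5 figures:

```python
def annualize(monthly_rate, months_per_year):
    """Convert a monthly growth rate (grade levels/month) to an annual rate."""
    return monthly_rate * months_per_year

# Observed monthly rate, backed out from the 12-month annualized figure of 2.0:
monthly = 2.0 / 12

print(annualize(monthly, 12))  # ~2.0 if the program runs year-round (summer school)
print(annualize(monthly, 9))   # ~1.5 with a standard 9-month school year
```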

What does it all mean?

Overall, our initial results are very encouraging; they also seem reasonably close to some of our previous estimates of what we’d see with intensive tutoring. They reinforce the idea that we already have the tools to fully close our educational gaps; it is mostly a matter of putting our resources in the right places. It’s also worth noting that we’re just beginning this program, and we’ve got lots of ideas on how to further increase the impact.

To put a student perspective on this, I’d like to tell the story of one of our students; let’s call him Shem. His family is from Eritrea, and he is still working on his English skills. He’s currently in 4th grade, and lives with his mother and younger brother at Mary’s Place, a local homeless shelter. At the beginning of the year, Lowell ran a math assessment, and Shem had the lowest score in the entire grade. And remember, Lowell has a lot of really disadvantaged students. His assessment matched what I saw: many 2nd grade math exercises were a struggle for him. He was clearly nervous about doing math, and hitting “submit” on any math question was super stressful for him.

However, Shem really wanted to learn, and was willing to practice, just like his idol, the anime character Naruto. He worked almost every day, and often on weekends (something our digital program allowed him to do). Both his skills and scores have grown dramatically over the year, and he is currently on pace to be above grade level in about 9 months. As you may imagine, his confidence has shot up along with his skill level.

Not every student will be like Shem. In fact, most won’t be. However, our data suggests that it’s feasible to get the vast majority of currently struggling students to grade level or above before they are done with elementary school. I’m super excited to help make that happen.



Mike Preiner

PhD in Applied Physics from Stanford. Data scientist and entrepreneur. Working to close education gaps in public schools.