How our partner became one of the top performing elementary schools in Washington state.

Mike Preiner
Sep 29, 2022


One underutilized tool in education is the large amount of public data on student outcomes. In theory, it should serve as a natural bridge between academic research, which is usually tightly controlled but not always relevant to “ground conditions”, and practical decision making, which is less controlled and always has to deal with the realities on the ground.

In a lot of our previous posts we’ve analyzed the “small data” that we gather while running our programs at The Math Agency. For example, we recently discussed data showing students in our program doubled how quickly they learn math. However, there were some caveats, including the fact that we only had data for students in our program, which meant we couldn’t rule out the effects of a more general “post-COVID bounce”.

The good news is that Washington state recently released assessment results for the 2021–22 school year, and we now have the ability to compare our partner schools to the rest of the state.

All data used in this post is available from the Washington Office of Superintendent of Public Instruction (OSPI); we use the “Assessment” and “Enrollment” Report Card datasets.

Before we dig into the results, let’s consider the conditions required to see the impact of a program in public data, which is always aggregated. In this post, we’ll be focusing on school-grade level data: e.g. “Northgate 5th graders”, and we’ll be analyzing the fraction of students meeting standards in math. To see an effect at this level, we’d need the following:

  • A program impact big enough to materially change the odds that a single student meets standards. This requires us to be working with individual students for a significant fraction of the school year. We won’t have a large cumulative impact if we work with any given student for only a month or two.
  • Coverage of a large enough fraction of “at-risk” students to make a difference when our impact is averaged across the entire grade. Working with just a handful of students won’t move the needle.

Of our 3 pilot schools, one met both of these criteria: Northgate Elementary. We’ll begin with their data, and simply look at how the fraction of students meeting standards (in other words, at grade-level) has changed over time for different grades. The results are shown below.

A few things immediately stand out. The 3rd grade scores in 2021–22 are still showing clear signs of COVID learning loss (though data doesn’t exist for them in 2020–21), while both the 4th and 5th grade scores show big increases from the prior year.

Figure 1. Fraction of students meeting math standards at Northgate as a function of time. The Math Agency worked with the 4th and 5th grade classes in 2021–22.

However, one issue with this type of analysis is that the students in each grade change every year: maybe the 2021–22 5th graders were just a more gifted bunch than the 2020–21 5th graders! To account for that, we really want to look at changes in a single cohort as it gets older. In other words, we should look at the change between the 2020–21 4th graders (less than 20% passing) and the 2021–22 5th graders (more than 60% passing). The difference, a little over 45 percentage points, seems impressive. But again, what if it was just part of a general post-COVID bounce?
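The cohort comparison boils down to a simple calculation. Here is a minimal sketch using made-up proficiency numbers (the real values come from the OSPI “Assessment” dataset):

```python
import pandas as pd

# Hypothetical school-grade proficiency rates (fraction meeting math standards).
# These are illustrative numbers, not the actual OSPI figures.
scores = pd.DataFrame({
    "school": ["Northgate", "Northgate"],
    "year": ["2020-21", "2021-22"],
    "grade": [4, 5],
    "pct_meeting_standards": [0.18, 0.63],
})

# Follow a single cohort: the 2020-21 4th graders become the 2021-22 5th graders.
start = scores.query("year == '2020-21' and grade == 4")["pct_meeting_standards"].iloc[0]
end = scores.query("year == '2021-22' and grade == 5")["pct_meeting_standards"].iloc[0]

# Growth is the change in the fraction meeting standards (i.e. percentage points).
cohort_growth = end - start
print(f"Cohort growth: {cohort_growth:+.0%}")
```

Tracking the same cohort across years is what lets us rule out the “different kids each year” explanation.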

The really nice thing about public datasets is that they allow us to put these types of gains in context. Next, we’ll compare the growth at Northgate to the growth of every other school in the state. The results are shown in the histogram below. We can see that Northgate had the 2nd-highest math growth rate in the state.

Figure 2. Histogram of changes in the fraction of students meeting standards between the 2020–21 4th grade class and 2021–22 5th grade class for every school in the OSPI dataset.
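The statewide comparison is just the same cohort calculation repeated for every school, followed by a ranking. A minimal sketch, again with hypothetical numbers standing in for the full OSPI table:

```python
import pandas as pd

# Hypothetical cohort growth (2020-21 grade 4 -> 2021-22 grade 5), one row per
# school; the real table covers every school in the OSPI dataset.
growth = pd.DataFrame({
    "school": ["Northgate", "School A", "School B", "School C"],
    "cohort_growth": [0.45, 0.05, -0.02, 0.48],
})

# Rank schools by growth, highest first.
growth["rank"] = growth["cohort_growth"].rank(ascending=False, method="min").astype(int)
northgate_rank = growth.loc[growth["school"] == "Northgate", "rank"].iloc[0]
print(f"Northgate growth rank: {northgate_rank}")
```

Plotting the `cohort_growth` column as a histogram and marking Northgate’s position reproduces the comparison in Figure 2.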

We suspect the 2021–22 4th graders would also see a relatively large jump compared to their 2020–21 3rd grade scores. Unfortunately, as we mentioned earlier, we are missing that data due to COVID.

Of course, growth is good, but we really want to close educational gaps for students from disadvantaged backgrounds. With that in mind, let’s look closer at how Northgate is doing on an absolute scale. From Figure 1, we can see that 60% of 5th graders are leaving Northgate proficient in math. What does that mean in terms of education gaps? To answer that, we’ll plot the fraction of students meeting math standards for every school in the state as a function of the fraction of low-income students in the school. We do that because low-income status is by far the most powerful demographic predictor of test scores. The results are shown below.

In addition to the raw data, we also fit the results (the dashed line), and highlight a range of “similar” schools with the orange band. The size of each data point is proportional to the number of students in 5th grade at each school. What can we see? Northgate is now the #1 performing school in its demographic group in math.

That is a great first step, but we still have work to do. Getting Northgate to 80% or more of students meeting standards would make them one of the best-performing schools in the entire state, regardless of demographics. This seems like a natural definition of “closing the gap”. Is this too ambitious? For perspective, this year we’ll be working with last year’s 4th graders. If we repeat last year’s impact and boost the fraction at grade level by another 45 percentage points, more than 85% of them would meet standards. It also matches work we did a couple of years ago simulating what is feasible in fully closing gaps. In other words, we think the data shows that it isn’t crazy to fully close the educational gaps in math at Northgate.

Figure 3. Fraction of 5th graders meeting math standards as a function of the low-income fraction for every school in Washington state. The dashed line shows a quadratic fit, and the orange band highlights schools with low-income fractions within 5% of Northgate Elementary’s.
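The demographic comparison in Figure 3 amounts to fitting a quadratic to proficiency versus low-income fraction and asking how far a school sits above or below the curve. A minimal sketch with invented data points (Northgate’s low-income fraction here is assumed purely for illustration):

```python
import numpy as np

# Hypothetical (low-income fraction, proficiency) pairs for a handful of schools;
# the real analysis uses every school in the state.
low_income = np.array([0.10, 0.30, 0.50, 0.70, 0.90])
proficiency = np.array([0.80, 0.65, 0.50, 0.38, 0.30])

# Quadratic fit of proficiency vs. low-income fraction (the dashed line in Fig. 3).
coeffs = np.polyfit(low_income, proficiency, deg=2)
fit = np.poly1d(coeffs)

# A school "beats expectations" when it sits above the fitted curve.
northgate_low_income = 0.70    # assumed value for illustration
northgate_proficiency = 0.60
above_curve = northgate_proficiency - fit(northgate_low_income)
print(f"Points above demographic expectation: {above_curve:+.0%}")
```

Ranking schools within the orange band by `proficiency` is how we identify the top performer in a demographic group.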

At this point, we should also mention some caveats with the analyses we present here. The biggest include:

  • Causality: school-grade level data is impacted by a large number of factors, including school-wide attributes (building leadership, family engagement, etc.) and classroom effects (teachers). There are a lot of awesome educators and staff at both Northgate and the after-school program run by Seattle Parks and Recreation, and they obviously helped make this happen! Northgate also saw real gains in English scores, which seem unlikely to be due to our program. In a recent post we discussed how in the future we may be able to isolate The Math Agency’s impact from these other factors.
  • Student enrollment: we operate as part of an after-school Community Learning Center run by Seattle Parks and Recreation. Ideally, we’d work with every single at-risk student, but not all of them attend the after-school program. As we discussed earlier, that limits how much of the gap-closing work we can do.
  • Is proficiency too low a bar? It’s safe to assume that the top schools in the state have many students far above grade level. In that sense, comparing proficiency numbers may not be the right metric. While we do have some reason to believe that our program can take students quite a bit above grade level, we’ll focus on proficiency as an initial indicator of closing gaps.

In summary, last year Northgate Elementary became the #2 school in the state for math skill growth, and the #1 school in its demographic group when it comes to overall math achievement. They are showing that it is possible to make huge progress in closing educational gaps. Congratulations to both Northgate and the team from Seattle Parks and Rec!

Shameless plug: our partner schools need volunteers to help make this program happen. If you are interested in personally making a difference as a coach, you can learn more here.



Mike Preiner

PhD in Applied Physics from Stanford. Data scientist and entrepreneur. Working to close education gaps in public schools.