Public data shows we have the tools to fully close educational gaps

Mike Preiner
5 min read · Sep 8, 2023


At The Math Agency, our mission is to close educational gaps in public schools. In this post, we’ll use public data to a) clearly define how we’ll measure what it means to “close gaps” and b) show examples demonstrating that we already have the tools and programs needed to fully close gaps at most of our schools today.

Let’s start with our definition of the education gap. For the rest of this post, we’ll use publicly available data from the Office of the Superintendent of Public Instruction (OSPI) in Washington state. Below we show the percent of fifth grade students meeting math standards on the Smarter Balanced Assessment (SBA) for every elementary school in the state. We’ve plotted it as a function of the fraction of low-income students, which is by far the largest predictor of school performance. What do we see?

At a typical “no poverty” school, about 80% of 5th graders meet basic math standards. At the other extreme, at very high-poverty schools, only about 20% do, meaning 80% of students there don’t meet standards. We’ll use this as our definition of the education gap. Based on this, if we can get high-poverty schools to 80% of students meeting standards (i.e., on par with the no-poverty schools), we’ll have fully closed the educational gap. Of course, one could argue this is too low a bar, and that our goal should also be parity on other metrics, like the percent of students in advanced learning programs. We’d agree with that, but we have to start somewhere, and we think hitting the 80% number is a good initial goal.

Figure 1. Plot of the percent of students meeting state math standards versus the fraction of low-income students for each school in Washington state. Each dot represents a single elementary school, and the dot size is proportional to the number of students at the school. The dashed line shows a polynomial fit to the data.
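For readers who want to build this kind of plot themselves, here is a minimal sketch using the publicly available OSPI report-card data. The file name and column names below are assumptions about how an exported extract might be labeled, not the actual OSPI schema.

```python
# Sketch: percent of 5th graders meeting math standards vs. school poverty rate.
# Assumes a CSV extract with (hypothetical) columns:
#   "school", "enrollment", "pct_low_income", "pct_meeting_math_standard"
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt

df = pd.read_csv("ospi_grade5_math.csv")  # hypothetical OSPI export

x = df["pct_low_income"]
y = df["pct_meeting_math_standard"]

# One dot per school, sized by enrollment, as in Figure 1.
plt.scatter(x, y, s=df["enrollment"] / 5, alpha=0.4)

# Low-order polynomial fit, analogous to the dashed trend line.
coeffs = np.polyfit(x, y, deg=2)
xs = np.linspace(x.min(), x.max(), 200)
plt.plot(xs, np.polyval(coeffs, xs), "--", color="black")

plt.xlabel("Fraction of low-income students")
plt.ylabel("Percent meeting math standards (SBA, grade 5)")
plt.show()
```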

It is worth noting that the education gaps in our public school system have been remarkably persistent: they have barely changed over the last 25 years, despite large amounts of attention (and funding) directed at them, especially recently. With that in mind, it is reasonable to be skeptical of the claim that it’s possible to close them, so let’s look at the evidence.

Next we’ll focus on the OSPI data from schools that have partnered with The Math Agency to close educational gaps for their students. We’ve collected a lot of data on our program, but we certainly don’t claim to be the only way to close gaps. Instead, we view our data as demonstrating the feasibility of closing gaps; there are surely other effective tools and programs.

We’ve already spent a lot of time analyzing the academic growth of the students enrolled in our program across a wide set of metrics. For example, we found that, on average, students in our program doubled their academic growth rates and are on trajectories to be at (or above) grade level within two years of programming.

Now we’ll use the public OSPI data to look at school-wide results. This includes both students enrolled in our program and their peers who are not working with us. We’ll focus on the growth of specific school-grade cohorts over time. For example, if only 20% of a 3rd grade class is meeting math standards, we’d need to increase that share by 60 percentage points over two years to have 80% of them meeting standards by 5th grade. It’s important to note that due to capacity constraints, The Math Agency typically works with 15–50% of the students in a particular cohort. Our capacity (and which students we work with) clearly impacts our ability to achieve our goal of getting 80% of students to meet standards, an issue we analyzed in a previous post.
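To make that arithmetic concrete, here is a minimal worked example assuming a hypothetical 100-student cohort that starts at 20% meeting standards:

```python
# Hypothetical worked example of the cohort arithmetic described above.
cohort_size = 100     # students in the 3rd grade cohort (assumed)
start_pct = 0.20      # 20% meeting standards in 3rd grade
target_pct = 0.80     # parity with low-poverty schools by 5th grade

students_meeting_now = int(cohort_size * start_pct)   # 20 students
students_needed = int(cohort_size * target_pct)       # 80 students
gain_needed = target_pct - start_pct                  # 0.60 -> 60 percentage points

print(f"Need {students_needed - students_meeting_now} more students "
      f"({gain_needed:.0%} of the cohort) to reach standard over two years, "
      f"i.e. {gain_needed / 2:.0%} of the cohort per year.")
```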

We show the change in the percent of students meeting standards on the Smarter Balanced Assessment for six of our cohorts below. These results include all of our full-year cohorts from the 2022–23 school year. What do we see? Our partner schools’ cohorts have shown increases in the percent of students meeting standards ranging from 13 to 43 percentage points. These changes are measured over a single year, so running the program for two consecutive years should double the impact. As we can see from Figure 1, increases of 26–86 percentage points in the share of students meeting standards are enough to fully close educational gaps at almost any school.

Table 1. Changes in the percentage of students meeting standards in math on the SBA compared to the previous year. For example, the change for a 2022–23 4th grade cohort is calculated as the difference between its 4th grade scores in 2022–23 and the same students’ 3rd grade scores in 2021–22.
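The cohort changes in Table 1 follow the recipe in the caption: track a school-grade cohort across years and subtract the prior year’s prior-grade result. Below is a minimal sketch of that calculation, again with assumed file and column names for the OSPI proficiency data (the school and year in the example call are illustrative only).

```python
# Sketch of the cohort-change calculation described in the Table 1 caption.
# Assumes per-year OSPI extracts combined into one CSV with (hypothetical) columns:
#   "school", "grade", "year", "pct_meeting_math_standard"
import pandas as pd

df = pd.read_csv("ospi_math_by_grade_year.csv")  # hypothetical combined extract

def cohort_change(school: str, grade: int, year: int) -> float:
    """Change for the cohort in `grade` during `year`, relative to the
    same students' grade - 1 results the previous year."""
    current = df.query("school == @school and grade == @grade and year == @year")
    prior = df.query("school == @school and grade == @grade - 1 and year == @year - 1")
    return (current["pct_meeting_math_standard"].iloc[0]
            - prior["pct_meeting_math_standard"].iloc[0])

# e.g. a 2022-23 4th grade cohort vs. its 3rd grade scores in 2021-22:
print(cohort_change("Example Elementary", grade=4, year=2023))
```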

While we are encouraged by these results, we also see two clear ways to significantly increase our impact. The first is to increase our capacity: we aren’t yet able to work with all of the at-risk students at our partner schools. At Leschi, for example, we were able to work with fewer than half of the at-risk students. Shameless plug: if you are interested in helping close educational gaps, we’re looking for coaches to help support our partner schools. The second way to increase our impact is to leverage summer programming. Our data shows that the right summer program can greatly accelerate how quickly students reach grade-level standards, but we haven’t yet fully integrated summer programming with our school-year cohorts.

Finally, it is worth mentioning an important caveat about the data discussed above. Students’ academic growth is affected by many things over the course of the year: parents, teachers, and school staff can all have a big impact on learning, and the data presented here doesn’t allow us to distinguish between the different factors. Some of our future posts will dig into these factors in more detail, but in all of the cases in Table 1 we should be applauding the hard work of everybody in the school. In any case, the data makes it clear that under the right conditions, we (as a community) already have the tools to fully close educational gaps.

One natural question is cost. We’ll go deep on the cost-effectiveness of our program in an upcoming post, but in short, our program costs roughly one-third as much per student as the most common interventions that public schools currently use to address educational gaps.

All of this points to one big takeaway: the public data in Washington state shows that we have the tools to fully close educational gaps today.
