# Can we repeatably close education gaps?

This question is core to our mission at The Math Agency, so while we’ve asked it before, we are asking it again here with some new data from 2022–23.

Based on what we learned in the 2021–22 school year, we made some changes to our program. Most notably, we’re now:

- Working with more focused cohorts. We now try to work with all of the at-risk students in targeted grades at our partner schools. This will make it easier to see our impact in public datasets.
- Increasing our dosage. Our data is clear: the more we work with students, the more they learn. So we’re now spending more time with them each week.
- Leveraging community volunteers. Increasing our dosage while also increasing the number of students requires more coaches. We’ve doubled down on helping our partner schools find engaged volunteers from the community. Now almost all of the coaches in our partner schools are local volunteers.

**Shameless plug:** if you are interested in personally helping to close educational gaps, consider coaching for one of our partner schools!

These changes have made a measurable difference in our program impact: we’re seeing student growth rates that are ~50% higher than last year’s. Last year students went from learning 0.6 grades/year of math before our program to learning 1.2 grades/year of math after they enrolled. That’s an additional 0.6 grades/year of learning over their historic growth rates. **This year our students are averaging an additional 0.9 grades/year of learning over their historic growth rates.**

To visualize this impact, we like to plot student skill trajectories, as shown below. In these plots we show skill level (*measured* math skills) versus the student’s actual grade level over time, averaged over the students we work with in each school. The plots have four key pieces:

- The “skill level = grade level” line: this shows where a student *should* be if they were learning exactly one grade’s worth of content each year.
- Student trajectories before enrolling in The Math Agency: this is their *historical* trajectory. Because we work with struggling students, this is always below the “skill level = grade level” line. As an example, if a 4th grade student comes into our program with 2nd grade math skills, their historical growth rate (i.e. the slope of the line) would be 2/4 = 0.5 grades/year.
- The students’ *current* trajectories in The Math Agency: this shows their average growth rate since they enrolled in our program.
- The students’ projected trajectory *if* they *additionally* participated in our summer program: we previously found this adds an additional 0.4 grades’ worth of math skills. This part is more hypothetical, since the students in our current cohorts haven’t participated in our summer program yet.
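The arithmetic behind these trajectories is simple enough to sketch in a few lines of Python. The rates below come from the numbers in this post (a 4th grader with 2nd grade skills, a 0.4 grades/year summer boost); the function names and the catch-up formula are our own illustration, not part of our measurement pipeline.

```python
def growth_rate(skill_level, grade_level):
    """Average grades of math learned per year of school so far."""
    return skill_level / grade_level

def years_to_close(gap, rate_in_program, summer_boost=0.0):
    """Estimate years until skill level catches up to grade level.

    Grade level itself advances 1.0 grades/year, so the gap closes at
    (rate_in_program - 1.0) grades/year, plus any summer boost.
    """
    closing_speed = (rate_in_program - 1.0) + summer_boost
    if closing_speed <= 0:
        return float("inf")  # at this rate the gap never closes
    return gap / closing_speed

# Example from the post: a 4th grader with 2nd-grade math skills.
print(growth_rate(2, 4))              # 0.5 grades/year historically

# Hypothetical cohort: a 1.5-grade gap, now learning 1.5 grades/year,
# with the summer program adding 0.4 grades/year on top.
print(years_to_close(1.5, 1.5, 0.4)) # ~1.7 years, i.e. under two
```

The same function also shows why the timeline stretches without summer work: dropping the boost for that hypothetical cohort leaves a closing speed of only 0.5 grades/year, or three years to catch up.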

Finally, there are a few other pieces to note:

- At School 1, we serve both 4th and 5th grades. For simplicity, we just show the trajectory for 4th graders. At School 2 and School 3 we are working with 3rd graders.
- We began our program at School 3 in January, hence our new trajectories start midway through 3rd grade.

# Let’s look at the data!

The skill trajectories at the different schools are shown below:

The trajectories show a couple of things quite clearly:

**The first is that our students were initially on track (on average) to be anywhere from 1.5 to 3 grades behind in math by the time they start 6th grade.** Since we are explicitly targeting the *most* struggling students in each school, this isn’t surprising.

**The second is that the students are now on track (if we assume they participate in summer programming) to be above grade level in less than two years at all of our schools.** The exact time varies by school, and takes longer if we don’t assume any summer work. This is very encouraging: it is actually shorter than we estimated a few years ago using academic research data. However, there are a few important caveats to keep in mind with this analysis:

- For simplicity, we’re plotting the *average* student trajectories in each of our schools: there is quite a bit of variation across individual students.
- Our trajectories *assume* that students would have continued on their “low-growth” trajectory. However, there is a lot of academic research that supports this assumption.
- We also *assume* that we are able to maintain our high growth rates over time. We don’t have much longitudinal data yet, but we are seeing that students in the second year of our program are showing larger growth rates than in their first year, which suggests it is possible to maintain high growth rates.
- We’ve seen good correlation of our internal measurements with standardized state assessments in the past. However, we’ll know a lot more about our current impact after state assessments this spring.

# What is the takeaway?

**The data suggests that we have the ability to help our partner schools close educational gaps within two years.** We are currently seeing this across three different schools and with three different versions of our program format. This is very encouraging for our mission, and naturally makes us wonder: does our public school system have the resources to sustain this, or even make it happen more broadly? We’ll take a look at that question in an upcoming post.