2023–24 Program Summary

Mike Preiner
Jul 26, 2024


Our mission at the Math Agency is to close educational gaps in public schools. Shameless plug: if you are interested in helping support our partner schools as an academic coach, we’d love to hear from you!

During the 2023–24 academic year, the Math Agency worked with six public schools in Seattle and Bellevue, supporting eleven distinct student cohorts. On average, students in our program doubled their academic growth rates, a result visible in both standardized test data and our internal measurements. This is consistent with what we saw in 2022–23 and 2021–22, and is an important step toward our mission of fully closing educational gaps in public schools.

For those of you interested in the details, keep reading!

This year we ran our program in four public schools in Seattle: James Baldwin, TOPS, Whittier, and West Woodland. We also served two Title I schools in Bellevue: Stevenson and Lake Hills. Four of the six schools were new partners, so we learned a lot about getting several new programs up and running simultaneously. The biggest discovery was that our data infrastructure lets us tell quickly (within ~4 weeks) whether a program is running effectively or needs changes; a sketch of that kind of check follows below. That capability will be important as we look to scale our impact across more schools in the future.
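To make that concrete, here is a minimal sketch of the kind of early check our data supports: annualize a cohort's observed growth after a few weeks and flag programs pacing below target. The on_track helper, the 36-week school year, and all of the numbers are hypothetical illustrations, not our actual pipeline.

```python
# Hypothetical early check: annualize a cohort's growth-to-date and compare
# it to a target pace. Names and numbers are illustrative only.

WEEKS_PER_SCHOOL_YEAR = 36

def on_track(start_level: float, current_level: float, weeks_elapsed: float,
             target_rate: float = 1.0) -> bool:
    """True if the cohort's annualized growth pace (grades/year) meets the target."""
    annualized = (current_level - start_level) / weeks_elapsed * WEEKS_PER_SCHOOL_YEAR
    return annualized >= target_rate

# Example: a cohort moving from an average 3.20 to 3.35 grade level in its
# first 4 weeks is annualizing at ~1.35 grades/year, above a 1.0 target.
print(on_track(3.20, 3.35, 4))  # True
```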

By the end of the year, every single program was running effectively. The table below shows the year-end results. For each cohort (we denote the grade after the name), we show the number of students, their average historical growth, two measures of their growth during the 2023–24 school year (using Khan Academy and IXL), and the results of the NWEA Measures of Academic Progress (MAP) year-end assessment. The logistics varied quite a bit across cohorts: we ran in-school, after-school, in-person, and remote (via Zoom) programs.

Students entered our full-year programs having historically learned 0.7 grades/year of math on average. In other words, they had been falling 0.3 grade levels further behind each year. During their time with us, they averaged ~1.4 grades/year of growth (measured via IXL).
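For readers who want the arithmetic, here is a small sketch of how grade-level-equivalent scores translate into grades/year; the function and the numbers are illustrative, not real student data.

```python
# Hypothetical illustration of historical vs. in-program growth rates,
# computed from grade-level-equivalent scores.

def grades_per_year(start_level: float, end_level: float, years: float) -> float:
    """Average growth in grade levels per year over the given span."""
    return (end_level - start_level) / years

# Example: a 5th grader testing at a 2.8 grade level after 4 years of school
# has historically grown 0.7 grades/year, i.e., fallen 0.3 grade levels
# further behind each year. Growing from 2.8 to 4.2 in one program year is a
# 1.4 grades/year pace.
print(grades_per_year(0.0, 2.8, 4.0))  # 0.7
print(grades_per_year(2.8, 4.2, 1.0))  # ~1.4
```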

Both Seattle and Bellevue schools also administer district-specific standardized assessments. Seattle uses the MAP, which measures year-on-year student growth and reports a “% of Expected Growth” metric that compares each student’s growth to that of similar students across the country. A value of 100% means a student grew the “expected” amount for their grade level. According to the MAP assessment, the average growth for students in our program was 165% of expected.
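A rough sketch of the idea behind that metric follows; the point values below are made up for illustration, and real expected-growth norms come from NWEA’s published tables.

```python
# Hypothetical illustration of "% of Expected Growth": actual growth divided
# by the expected growth for similar students, times 100. MAP scores are on
# the RIT scale; the specific point values here are invented.

def pct_of_expected_growth(actual_growth: float, expected_growth: float) -> float:
    """Actual growth as a percentage of the normed expectation."""
    return 100 * actual_growth / expected_growth

# Example: a student who gained 13 RIT points when the norm for similar
# students is 8 points grew 162.5% of the expected amount.
print(pct_of_expected_growth(13, 8))  # 162.5
```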

Figure 1. Summary statistics for our 2023–24 academic year programs; the grade of each cohort is indicated in the name. We estimate academic growth rates using data from both Khan Academy (KA) and IXL. Cohorts marked with an asterisk (*) were partial-year programs that started in the middle of the school year. In School 2 we ran a remote after-school program via Zoom, and in School 3 we piloted a program supporting advanced learners (HCC = Highly Capable Cohort), in contrast to the other programs that supported students behind grade level. Schools 5 and 6 were in Bellevue School District, which uses the STAR assessment instead of the MAP.

While we are thrilled to see such high growth from our students, we also want to understand how much of that growth is attributable to our program. Many things can have a large, positive impact on student growth: school leadership, great teachers, other intervention programs, etc., so it is important to control for as many of these effects as possible. The simplest way is to compare growth for students enrolled in the Math Agency to growth for students who were not enrolled. Doing this for each grade controls for both school-level and classroom-level effects. The results are shown below for all of our full-year programs. To compare across district assessments (as mentioned above, Seattle and Bellevue use different tests), we plot excess growth (actual growth minus the average growth of similar students), normalized by the standard deviation (SD) of each assessment’s growth metric.
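Here is a small sketch of that normalization, assuming each assessment publishes an expected growth and a standard deviation of growth for similar students; all numbers are invented for illustration.

```python
import statistics

# Hypothetical illustration of the normalization used in Figure 2: excess
# growth expressed in standard-deviation units so different assessments can
# be compared on one axis.

def normalized_excess_growth(actual: float, expected: float, sd: float) -> float:
    """Excess growth (actual minus expected) in SD units."""
    return (actual - expected) / sd

# Example: enrolled vs. non-enrolled students on an assessment where similar
# students are expected to grow 8 points with an SD of 6 points.
enrolled = [normalized_excess_growth(g, 8, 6) for g in [14, 13, 15]]
non_enrolled = [normalized_excess_growth(g, 8, 6) for g in [10, 11, 10]]
print(statistics.mean(enrolled) - statistics.mean(non_enrolled))  # ~0.61 SD gap
```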

Across all full-year cohorts, students in our program averaged 0.53 SD more excess growth than their non-enrolled peers. That compares favorably to academic research on high-intensity tutoring. Estimates of program impact vary widely from study to study, but one recent meta-analysis found an average impact of 0.38 SD across a range of math tutoring programs. Interestingly, in almost all of our grade-level cohorts, students not enrolled in our program also grew significantly more than average, though not as much as those who were enrolled. We suspect this may be a “spillover” effect. For example, for our in-school programs, we work with a large number of struggling students. When these students are working with our coaches, their teachers are able to serve a smaller number of students who are all closer to grade level, a situation clearly conducive to effective group instruction.

Figure 2. Plot of excess growth (actual growth minus average growth of similar students), normalized by the standard deviation of the growth metric. Growth norms for the MAP can be found here. Non-enrolled students grew 0.43 SD more than expected, while students enrolled in our program grew 0.96 SD more than expected, a difference of 0.53 SD between enrolled and non-enrolled students. Schools 1–3 used the MAP assessment, while Schools 5 and 6 used the STAR assessment.

While we are excited about these results (and very grateful to the amazing coaches who made it happen), we still see a lot of room to improve our impact. To illustrate this, we plot a histogram of individual student growth rates below. While the average growth rate in our program was ~1.4 grades/year, we see that roughly one third of students still had a growth rate of less than 1.0 grades/year. We believe there are many ways to identify and support these students more effectively; we’ll be exploring some of these in future posts.

Figure 3. A histogram of student growth rates (measured via IXL) for the 2023–24 academic year. The average growth rate was 1.4 grades/year, and roughly 1/3 of students had a growth rate of less than 1 grade/year.
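A histogram like this takes only a few lines to generate; as a sketch, here is how it might look in matplotlib, with made-up growth rates standing in for our student data.

```python
import matplotlib.pyplot as plt

# Illustrative growth rates (grades/year); invented, not our student data.
growth_rates = [0.4, 0.7, 0.9, 1.0, 1.1, 1.2, 1.4, 1.5, 1.7, 1.9, 2.1, 2.4]

plt.hist(growth_rates, bins=8)
plt.axvline(1.0, color="gray", linestyle="--", label="1.0 grades/year")
plt.xlabel("Growth rate (grades/year, via IXL)")
plt.ylabel("Number of students")
plt.title("Student growth rates, 2023–24")
plt.legend()
plt.show()
```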

In summary, for the third year in a row, students in our program doubled their historical growth rates, learning an additional 0.7 grades/year of math on average. This is despite significant growth in our program: we nearly doubled the number of students we served and started programs in four new schools. These large growth rates are visible both in our internal growth metrics and in district-level standardized tests, and they compare favorably to academic research on high-intensity tutoring. While we are encouraged by these results, we think it is possible to further increase student growth rates via program improvements.

We would like to thank the principals and teachers at our partner schools for being great collaborators in accelerating student growth. We’d also like to acknowledge School’s Out Washington’s Best Starts for Kids Expanded Learning Program for supporting the after-school program at James Baldwin Elementary (in partnership with Seattle Parks and Recreation).
