There Is A Lot Of Room At The Top.

Mike Preiner
Apr 23, 2021


We’ve spent a lot of time in previous posts discussing education gaps. While closing gaps is clearly the short-term priority, in this post I wanted to take an optimistic look at the longer-term goal: what would it take to get the students in our program above grade level? In other words, is it possible to leverage our insights to move the most disadvantaged students from “struggling” to “gifted”?

To answer this question, we’ll build on a couple of our recent posts showing that it is possible to get pretty high student growth rates (~2 grade levels of math per year) with the program that we’re running at Lowell Elementary. With that type of growth rate, even students that are very behind would be above grade level given enough time.

Of course, this raises a couple of fundamental questions:

  • Is it possible to maintain those types of growth rates for extended periods of time?
  • If so, how much work would be required to make it happen at Lowell?

To answer the first question, let’s start by revisiting a graph from a previous post. It illustrates a couple of key points about our students at Lowell. The first is that their historical growth rate is ~0.6 grade levels per year, which means they fall further behind grade level each year. The second point is that we’ve managed to reverse this trend: if we can maintain our new growth rate of 2.0 grade levels per year, we should be able to eventually get our students at or beyond grade level.

Diagnosed math grade level versus actual grade level. We include data from 2nd graders, 4th graders, and 5th graders in our cohort. The “Current Trajectory” is a projection of what would happen if the 4th graders maintained their current growth rate of 2 grade levels per year.

Unfortunately, our data from Lowell doesn’t conclusively demonstrate that our program can even get students all of the way to grade level, much less above it! This is because we’ve only been assessing our students over a short period of time: ~3 months. In other words, we simply haven’t had enough time to actually close the gaps, given how large they are.
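To make the timeline concrete, here is a minimal sketch of the catch-up arithmetic. The 0.6 and 2.0 grade-levels-per-year figures come from the data above; the gap sizes are purely illustrative, since individual students’ gaps vary. The key assumption is that the grade-level benchmark itself advances by 1.0 grade levels per year, so the net catch-up rate is the growth rate minus 1.0.

```python
def years_to_close(gap, growth_rate):
    """Years needed to reach grade level, assuming the grade-level
    benchmark itself advances by 1.0 grade levels per year."""
    net_catch_up = growth_rate - 1.0
    if net_catch_up <= 0:
        return float("inf")  # the student falls further behind and never catches up
    return gap / net_catch_up

# Illustrative gap sizes, not measured values from our cohort.
for gap in [1.0, 2.0, 3.0]:
    print(f"gap of {gap} grade levels:")
    print(f"  at 0.6 levels/year: {years_to_close(gap, 0.6)}")
    print(f"  at 2.0 levels/year: {years_to_close(gap, 2.0):.1f} years")
```

At 2.0 grade levels of growth per year, a student gains a full grade level on the benchmark every year, so even a three-grade-level gap takes roughly three years to close. That is why three months of assessments can show the trend reversing without yet showing the gaps fully closed.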

Fortunately, we do have a couple of additional data points. It turns out that I’ve been running a pretty stripped-down version of the Lowell program with my own children for several years. It consists of using the same digital tools that we’re leveraging at Lowell, and it is very low-touch in the sense that I spend roughly 5–10 minutes a week actually discussing math concepts with them: the vast majority of content they learn on their own.

My children’s elementary school happens to use the same program for regular student diagnostics that we’re using at Lowell: IXL. You can read an in-depth discussion of how we are using it here. What are the results? Below, I show the same chart as above, but I’ve added the scores for two of my children. I’ve also added a fit for them and designated it the “Long Term High Growth Trajectory”.
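As an aside, the slope of a trajectory like this is just a least-squares line through the (actual grade level, diagnosed grade level) points. Here is a small sketch of that calculation; the data points are made up for illustration and are not my kids’ actual scores.

```python
import numpy as np

# Hypothetical (actual grade, diagnosed grade) points, for illustration only.
actual_grade = np.array([2.0, 3.0, 4.0, 5.0])
diagnosed_grade = np.array([2.5, 4.1, 5.9, 7.6])

# Straight-line least-squares fit; the slope is the growth rate
# in diagnosed grade levels per school year.
slope, intercept = np.polyfit(actual_grade, diagnosed_grade, 1)
print(f"fitted growth rate: {slope:.2f} grade levels per year")
```

A slope above 1.0 means a student is pulling ahead of grade level over time; a slope below 1.0 means they are slipping behind, even if their diagnosed level is still rising.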

A couple of things immediately stand out:

  • The bar of “at grade level” for math is actually a pretty low one. Students are clearly capable of far exceeding it with the right conditions. In other words, there is a lot of room at the top.
  • The Long Term High Growth Trajectory has a slope of ~1.7 grade levels per year. This is actually pretty close to what we are seeing with our “Current Trajectory” growth rates at Lowell.

There are a couple of natural objections to this comparison. The first is that my own kids are in a pretty different environment than most of the students at Lowell, both at school and at home. For example, they could be receiving very different Tier 1 (standard classroom) instruction than the students at Lowell. That is almost certainly true, given that the two schools use different math curricula, despite both being part of Seattle Public Schools.

However, it seems likely that most of the growth our own kids are seeing is not related to their classroom, simply because of the content. For example, our 2nd grader is testing at almost a 5th grade level with fractions, and she hasn’t even seen fractions in school yet. Similarly, when our 5th grader tests at a 9th grade level for Algebra, it isn’t because he saw it in class. In other words, it is pretty clear that our kids are learning most of their new math content from the digital tools themselves, not from their classroom environment. This matches our experience at Lowell: the students there are clearly able to learn new content very quickly on their own using digital tools.

This suggests that, despite the differences in classroom environments, the answer to our first question (is it possible to maintain the high math growth rates from our program?) is likely yes.

That brings us to our second question: What would it take to maintain high growth rates long-term for students at Lowell and similar schools? The short answer is that I don’t think we know quite yet. However, it seems safe to assume that it would involve some combination of the digital tools we’ve been using for our math program, along with support from parents, teachers (and maybe tutors) to provide the motivation for the kids to actively use them. Encouragingly, the experiments with my own kids show that it is possible to do this in a pretty low-touch (and therefore low-cost) fashion. I suspect that it’s possible to gradually move from our current high-touch program at Lowell to a much lower-touch intervention as the students get familiar with the tools we are using and build strong learning habits.

Of course, we still have to figure out how we can replicate our long-term impact at schools like Lowell in a practical way. However, we do have reason to believe that with the right tools, support, and enough time, it should be possible to help even the most disadvantaged kids go from “far behind” to “ahead”.


Mike Preiner

PhD in Applied Physics from Stanford. Data scientist and entrepreneur. Working to close education gaps in public schools.