Can family engagement *improve* student performance?

Mike Preiner
4 min read · Sep 19, 2022


Schools believe that family engagement is important. For proof, a parent doesn’t have to look any further than their email inbox overflowing with messages from teachers, principals, and district administrators. I chose a random month (October) from last school year and counted over 120 messages from my children’s schools…more than 4 per day!

While some of the messages are necessary, like schedule updates, a significant number of them seem to come from the belief that the more emails, the better. This is likely due to the well-documented correlation between family engagement (among other family characteristics) and student performance. The research extends back to the 1960s. This Johns Hopkins article has some nice background on the famous 1966 “Coleman Report” that kicked off the use of data to shed light on these types of relationships. Today, almost every education leader I talk to mentions the importance of family engagement.

However, there is a pretty big gap between “family life influences a student’s academic outcomes” and “sending this particular email improves student outcomes”. And unfortunately, the research at that level of detail isn’t particularly robust. In this post we’ll pick apart the key steps needed to actually measure how much a specific family engagement strategy improves outcomes for students.

We can use a pretty simple model to understand how we’d use family engagement to improve student learning.

A model for how a family engagement program (utilizing emails or texts) would improve student learning. It can be generalized to describe other types of engagement, such as phone calls, in-person meetings, etc.

This simple model can immediately give us some guidelines to quickly estimate how effective an intervention like this could be. As an example, assume we’re going to prompt parents to take one of the most effective actions we’ve measured at The Math Agency: getting students to actively practice math. Data from our program shows that on average, an extra 60 minutes of practice per week corresponds to learning an extra 1.6 grade levels of math. In the following example, we’ll assume a “successful” action consists of motivating a student to actively practice for 10 minutes.

We can now estimate the impact of the program using our simple model and 4 corresponding metrics, shown below. The first is the fraction of parents who enroll: if none of the parents join the program, we’ll have a tough time having an impact! Next, we have the frequency of engagement and the % of messages that are read. Typically, these are inversely related: my children’s schools are on the “spammy” side of the spectrum, and a lot of their messages go straight to my junk folder. Finally, even if a family member reads a message, it only makes a difference if it causes them to do something useful; this is our Action % metric.
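
To make the model concrete, here’s a rough back-of-envelope version of the calculation in Python. The function and parameter names are just for illustration, and it leans on the two assumptions above: a “successful” action adds about 10 minutes of active practice that week, and 60 extra minutes of weekly practice corresponds to roughly 1.6 extra grade levels (which I’ll treat here as an annual rate).

```python
def estimated_gain(enroll_frac, msgs_per_week, read_rate, action_rate,
                   minutes_per_action=10, levels_per_60_weekly_min=1.6):
    """Rough estimate of extra grade levels learned per year, per student.

    Assumes each successful action adds `minutes_per_action` of active
    practice that week, and that 60 extra minutes of weekly practice
    corresponds to ~1.6 extra grade levels (treated as an annual rate).
    """
    # Average extra practice minutes per week for an enrolled family
    extra_min_per_week = msgs_per_week * read_rate * action_rate * minutes_per_action
    # Convert weekly practice minutes into grade levels, then average
    # over all families, enrolled or not
    return enroll_frac * (extra_min_per_week / 60) * levels_per_60_weekly_min
```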

Now we can analyze two different but reasonable scenarios: a realistic (pessimistic) one and a more optimistic one. The first thing we notice is that the two sets of metric estimates give average impacts that differ by almost a factor of 30. In other words, the details of implementation really matter! To put these impacts in perspective, a typical student in our program has a historical growth rate of 0.6 grade levels per year, so an increase of 0.27 grade levels is a very big impact (almost a 50% improvement!). On the other hand, 0.01 grade levels isn’t even noticeable.
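
As a sanity check, plugging hypothetical metric values into the sketch above lands in the same ballpark. These particular numbers are illustrative picks, not the exact values behind the two scenarios above:

```python
pessimistic = estimated_gain(enroll_frac=0.6, msgs_per_week=1,
                             read_rate=0.4, action_rate=0.15)
optimistic = estimated_gain(enroll_frac=0.95, msgs_per_week=2,
                            read_rate=0.8, action_rate=0.65)

print(f"pessimistic: {pessimistic:.3f} grade levels/year")  # ~0.010
print(f"optimistic:  {optimistic:.3f} grade levels/year")   # ~0.263
print(f"ratio: {optimistic / pessimistic:.0f}x")            # almost a factor of 30
```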

How reasonable are these scenarios? We have some data on most of the key pieces. For example, during the 2021–22 school year we sent families regular email updates on their students’ progress, and we saw read rates of 55–75%. You can see the detailed metrics in this program summary.

To improve on that, this summer we asked parents about their preferred methods of communication. The survey results are shown below, and text is clearly the most popular way to receive updates.

Based on that data, we added the option of text communication and immediately saw a huge jump in family action; in this case, how often families wrote unprompted responses to us.

Knowing that over 30% of parents responded, unprompted, to each message we sent is a good starting point. Of course, getting students to practice is a different action than replying to a text, so we’d expect a different Action %. The good news is that we have the infrastructure in place to measure exactly how often (and how long!) students practice after their family receives a text. This is an experiment we’ll be running soon.
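
As a sketch of what that measurement could look like, the basic idea is to join message timestamps to practice sessions within a fixed attribution window. The data below is made up and the column names are just illustrative, not our actual schema:

```python
import pandas as pd

# Made-up example data: texts sent to families, and students' practice sessions.
messages = pd.DataFrame({
    "message_id": [101, 102],
    "student_id": [1, 2],
    "sent_at": pd.to_datetime(["2022-09-12 17:00", "2022-09-12 17:00"]),
})
practice = pd.DataFrame({
    "student_id": [1, 1, 2],
    "started_at": pd.to_datetime(["2022-09-13 16:00", "2022-09-20 16:00",
                                  "2022-09-14 18:30"]),
    "minutes": [12, 15, 8],
})

WINDOW = pd.Timedelta("48h")  # how long after a text we credit practice to it

joined = messages.merge(practice, on="student_id")
in_window = joined[
    (joined["started_at"] >= joined["sent_at"])
    & (joined["started_at"] <= joined["sent_at"] + WINDOW)
]

# Action %: fraction of messages followed by any practice inside the window
action_rate = in_window["message_id"].nunique() / len(messages)
# Average practice minutes credited to each "successful" message
minutes_per_action = in_window.groupby("message_id")["minutes"].sum().mean()

print(f"Action %: {action_rate:.0%}, minutes per action: {minutes_per_action:.1f}")
```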

In summary, family engagement seems to have the potential to increase student learning. We’ve built the infrastructure to actively monitor all of the key measurement points needed to figure out its actual impact in our program. We’ll report on the results in a future post!


Mike Preiner

PhD in Applied Physics from Stanford. Data scientist and entrepreneur. Working to close education gaps in public schools.