We were looking for our equivalent of Facebook’s “7 Friends in 10 days” as a proxy for long-term engagement.

To our surprise, we found an interesting indicator that was not a threshold like “more than X edits is good”.

Rather, the sum of interesting actions in the first three days predicts long-term behavior well. Every single edit in the first three days adds to the likelihood of long-term engagement. So does every single login.

At SimpleSite, we now use the “Activation Index” – a sum of all edits and logins – as an invaluable tool in our upcoming Customer Success Project.

Facebook has a metric saying that a user is very likely to be retained if he or she connects with seven friends within the first ten days after signing up. Facebook’s growth hackers then optimize the service to make it as likely as possible that new users cross that 7-in-10 threshold. It’s great to have such a tangible metric. But it can be hard to find, and without it, it’s really hard to systematically improve engagement in your service – as discussed in this post.

SimpleSite is a SaaS “Create and Maintain Your Own Website”-service. So in line with this, we would look for metrics like “adding more than four subpages”, “adding three pictures”, “performing more than ten edits” etc., all in the first few hours or days. The assumption was that such a threshold existed and that, once passed, it would indicate a “good” user with a high likelihood of converting and staying with us for a long time.

Improving engagement: What is your version of Facebook’s seven friends? #growthhacking #engagement

The surprise: No threshold

We started looking, but simply could not find a threshold that correlated well with the user being highly engaged. Instead, it seemed that the more edits during the first three days, the better, with no evident upper limit. The same went for logins during the first three days. That was kind of surprising. Specifically, we looked at all users that entered the system during January and February 2014. In August 2014 we analyzed:

  • Retention: The probability that they returned at some time more than two months later.
  • Revenue: The average total paid lifetime we had received from those users so far.

Here is our graph of the average retention as a function of the number of edits in the first three days. There is a very strong linear relation between the number of edits and the probability of retention, continuing up to a very high number of edits:

And here is our graph of the average paid lifetime as a function of the number of edits in the first three days. It has the same strong linear relationship:

It’s the same when analysing what the number of logins in the first three days means for retention and paid lifetime. So our conclusion is:

Every single login and every single edit in the first three days increases engagement.

There is no meaningful threshold.

More edits and logins are always of value.

The sum function is a wonderful result!

Because all users can contribute to future A/B experiments. If we, for example, counted only the users that made it past 10 edits, we would get no signal from the user who goes from 5 edits to 6, or from 15 edits to 16. We would only get data from the users that actually cross the 10-edit line. And that is measuring just a tiny fraction of the data telling you that you improved engagement.
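The point above can be shown with a toy example. The numbers here are made up purely for illustration, but they capture the mechanics: if every user in a cohort makes one more edit, a threshold metric can register no change at all, while a sum metric registers every single one of them.

```python
# Toy illustration (invented numbers): a sum metric registers an
# improvement that a threshold metric misses entirely.
before = [5, 8, 15, 22]   # edits per user in a control cohort
after  = [6, 9, 16, 23]   # same users, each making one more edit

def past_threshold(edits, limit=10):
    """Count users with more than `limit` edits (a threshold metric)."""
    return sum(1 for e in edits if e > limit)

# Threshold metric: 2 users pass in both cohorts -> improvement invisible.
assert past_threshold(before) == past_threshold(after) == 2

# Sum metric: all four extra edits show up in the measurement.
assert sum(after) - sum(before) == 4
```

This is why the sum indicator makes experiments so much more sensitive: every user's behavior contributes to the measured difference, not only the few who happen to cross a line.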

Six days to significance is fast

It turns out that summing the number of edits carries so much information that, doing the statistics carefully, we can reach significance for a 10% engagement improvement in about three days. So a three-day window for measuring users’ activity, plus three more days to accumulate enough data for significance – six days in total – is enough to significantly measure a 10% engagement improvement. And not just on some random engagement indicator picked on a hunch: this is an indicator that we have shown to correlate well with long-term engagement.
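The post does not say which statistical test SimpleSite uses, so as one common choice for comparing a sum metric between two experiment groups, here is a minimal sketch of a Welch two-sample t statistic (the unequal-variance variant, since edit counts in control and variant need not have the same spread):

```python
# A minimal sketch of comparing mean edits-per-user between a control
# group and a variant using Welch's two-sample t statistic. The choice
# of test is an assumption for illustration, not SimpleSite's method.
from statistics import mean, variance
from math import sqrt

def welch_t(control, variant):
    """Welch's t statistic for the difference in means of two samples
    with possibly unequal variances. A large |t| suggests a real
    difference; in practice you would compare it against a t
    distribution to get a p-value."""
    vc, vv = variance(control), variance(variant)
    return (mean(variant) - mean(control)) / sqrt(vc / len(control) + vv / len(variant))
```

With real cohorts of thousands of users, this kind of test is what lets a per-user sum metric reach significance within days rather than weeks.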

A combined indicator: The Activation Index

We saw that both the users’ edits and logins predicted engagement well. Doing a multiple linear regression, we get an even better predictor of long-term engagement:

Activation Index = 5 x logins + edits

(where the number of edits and logins are counted in the first three days of the user’s life). In plain terms: a login is worth 5 times as much as an edit – but both are important.
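The formula above is straightforward to compute from an event log. A minimal sketch follows; the event-tuple layout, the event-type names ("login", "edit") and the signup-time lookup are assumptions for illustration, since SimpleSite's actual schema is not described in the post:

```python
# Sketch: compute the Activation Index (5 x logins + edits in the first
# three days) per user from a raw event log. Field layout is assumed.
from datetime import datetime, timedelta
from collections import defaultdict

LOGIN_WEIGHT = 5              # from the regression: a login counts 5x an edit
WINDOW = timedelta(days=3)    # only the first three days of a user's life count

def activation_index(events, signup_times):
    """events: iterable of (user_id, event_type, timestamp) tuples;
    signup_times: dict mapping user_id to signup timestamp.
    Returns {user_id: 5 * logins + edits within the first three days}."""
    index = defaultdict(int)
    for user, etype, ts in events:
        if ts - signup_times[user] > WINDOW:
            continue  # event falls outside the three-day window
        if etype == "login":
            index[user] += LOGIN_WEIGHT
        elif etype == "edit":
            index[user] += 1
    return dict(index)
```

For example, a user who logs in once and makes one edit on day two, then edits again on day ten, scores 5 + 1 = 6: the late edit is outside the window and does not count.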

Improving engagement: First find out what to measure #growthhacking #engagement

Next: Experiments against our Activation Index

So now the road ahead is clear. We can think up hypotheses for engaging users better, build quick experiments, and with a turnaround of about a week, let the results of one experiment inform the next on a path towards higher engagement. What we do in the experiments is simply measure the improvement (or deterioration) in the Activation Index as defined above.

One more thing lies ahead, to be thorough. As we all know, correlation does not imply causation. So we need to test a number of the experiments independently and make sure that, when looking back at them in a couple of months’ time, an increase in the Activation Index really did cause an increase in long-term engagement. If not, we need to go back to the drawing board and find ourselves a better indicator.

Feels like “next level”

We know that we have previously done great things with Conversion Rate Optimisation (CRO) based on a solid methodology of structured experiments and knowing what to measure. We believe that we are now ready to do the same thing for systematically improving engagement and that is going to be a pretty exciting journey to embark upon. Stay posted for the results.

A sum of all actions works better than thresholds when measuring engagement in a service. #growthhacking #engagement


We were surprised to discover the “no threshold” observation at SimpleSite. One does not read many accounts of a reliable engagement proxy, and most of what you find out there are threshold indicators (like the now famous seven Facebook friends). The sum indicator was a surprise. And the wonderful thing about it is that it spans a high volume of events, which makes it even more practical than threshold indicators for running fast experiments – especially if you have a lot of data, but not Facebook- or Google-scale volume, so that time to significance actually matters.

So, if you have a service in which the user seems to create more and more value through continued interactions, take a look at whether you also have a good sum indicator to use as a proxy and testing ground for long-term engagement. Have you measured your engagement factors? Share your experience here at Nordic Growth Hackers.

Morten Elk

CEO at SimpleSite