At SimpleSite we use long-term metrics to re-evaluate old growth hacks. That paid off well for a video tutorial we implemented in our signup wizard.

We wanted to engage users who landed on our site after signing up for a trial, and our idea was that a video showing how to use the site would be a good way to do this.
Everything we read on the subject seemed to back up this idea (a lot of resources make this claim), so we decided to build and test a video experiment on our site. The first version had two modes: one that showed the video during the sign-up process and one that was optional (and slightly hidden) on the user’s dashboard.

[Screenshot: intro video lightbox]

At the time, we were sure this was the right approach, but when we checked the stats, we found out that it really wasn’t.
The experiment tanked: the video that played during sign-up made users close their browsers (most likely in anger or confusion), and the optional video was never played, so it had no effect on any of the parameters we wanted to influence.

After running the experiment for a short period, the first version was canned and another version was designed and built. In the new experiment, the video was shorter and shown in a popup once the user had finished the sign-up process, and we were sure that this was the version that would prove our hypothesis.

[Screenshot: new intro video lightbox]

Despite all our expectations and good intentions, this experiment did just as poorly on all parameters.
Fewer users paid for the premium version, fewer users engaged with the site and, generally, it was a bit of a flop.
Of the users who were presented with the popup, very few actually clicked play on the video, and even fewer, just 5%, finished watching it.
When looking at those stats, we decided that it was best to discard the experiment and focus our efforts elsewhere. This would most likely have been the end of this implementation, had it not been for our monthly “old experiments” analysis meeting.

Why revisit a failure?

A lot of resources on growth hacking focus on specific tweaks and experiments that can improve specific areas of your product, such as conversion and user engagement. The general idea is: “Do this and you will succeed”.
The reality is that most of the time you attempt to apply the knowledge of others to your product, the first run-through is not likely to do as well as you expected. The obvious response is to keep trying until you succeed, measuring the short-term parameters that provide the quickest conclusions.

At SimpleSite we have found that it is also very important to spend time examining old experiments through long-term parameters, which can provide insights that the short-term parameters do not.
So we revisited the available data on the users that were exposed to our video experiment.

Video viewers turned out to be more engaged

In our “old experiments” analysis meeting, every single experiment we have ever run is measured by long-term parameters; specifically, we measure the lifetime value of the users who saw a specific mode of an experiment.
We measure this as the average time on site and, for paying customers, the average total number of subscription days.
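As a rough illustration of that per-mode calculation (the field and function names here are hypothetical, not our actual schema), the idea looks something like this:

```python
from collections import defaultdict
from dataclasses import dataclass
from statistics import mean

# Hypothetical user record; field names are illustrative, not our actual schema.
@dataclass
class User:
    experiment_mode: str       # which mode of the experiment the user saw
    time_on_site_mins: float   # accumulated time on site
    subscription_days: int     # total paid subscription days (0 if the user never paid)

def long_term_metrics(users):
    """Average time on site and average paid days, grouped by experiment mode."""
    by_mode = defaultdict(list)
    for user in users:
        by_mode[user.experiment_mode].append(user)

    report = {}
    for mode, group in by_mode.items():
        payers = [u for u in group if u.subscription_days > 0]
        report[mode] = {
            "users": len(group),
            "avg_time_on_site_mins": mean(u.time_on_site_mins for u in group),
            "avg_paid_days": mean(u.subscription_days for u in payers) if payers else 0,
        }
    return report
```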

The second iteration of the video experiment stood out. Even though the total number of subscriptions for users who had seen the experiment was down, the average number of paid days for those specific users was up. In other words, the few users who had seen the video actually went on to become better customers.

Reintroducing the once-failed video

This realisation made us reintroduce the video experiment almost a year after the initial experiment had run.
It was obvious that the video actually had an effect, but not enough people had seen it. The few who did benefitted from it, but the vast majority just closed the window. We needed to make sure that more people watched the video.
The new version introduced a different skin for the video player, with a bigger play button, less text and different wording.

[Screenshot: tweaked intro video lightbox]

The result of the tweak was obvious from the beginning.
The share of users who started the video increased from 13% to 20%, and the share who finished it increased to 10%.
These numbers are still not very high, but even this small increase in viewers produced a 30% increase in user engagement and more paying customers, with a higher expected LTV than users who were not exposed to the experiment.
In the near future, we will most likely implement a brushed-up version of this video experiment, incorporating the lessons we’ve learned from the various iterations.

Considerations

We would certainly have introduced engagement videos to the site at some point, but had we not taken a step back and analysed this specific implementation, we would not have reached the conclusions that we did. Based on this, a few points can be summarised:

Log everything (that makes sense): This might seem obvious, but sometimes you get caught up in the “narrow metrics” and don’t notice the more specific issues with a particular experiment implementation. Had we not added (admittedly excessive) logging to the second iteration, we would probably not have realised that the main issue was simply that the popup was too easy to close, which meant too few users actually saw the video. A minimal sketch of this kind of event logging follows the points below.

Try, fail and try again: This should be self-explanatory and might seem like a cliché, but don’t be afraid to fail. Even failures provide valuable insight.

Use both short- and long-term metrics: Short-term metrics like clicks, trials and conversions give you a quick indication of how your experiment is doing and are obviously the primary thing you should be measuring. However, long-term metrics like LTV are important because they show how your experiment actually affects the behaviour of your users. Even though the experiment might not produce the changes you expect, analysing long-term metrics might give you different insights.

Always take the time to analyse old experiments: When you are focused on growth, you have a tendency to chase the next “big thing” that will improve a key metric for your product. This leaves little time to take a step back and analyse your previous experiments. We have learned that in many cases, very valuable insight can be gained by systematically going through old experiments and analysing them through a long-term lens. This in turn can produce even better experiments and better results, which will, in the long run, keep you on a path to sustainable growth.
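To make the logging point above concrete, here is a minimal sketch of the kind of per-event logging that could expose a problem like ours; the event names, fields and log file are illustrative assumptions, not our production setup.

```python
import json
import time

# Hypothetical event logger for the video popup; event names, fields and the
# log file are illustrative assumptions, not an actual production setup.
def log_event(user_id, event, **details):
    record = {
        "ts": time.time(),
        "user_id": user_id,
        "event": event,
        "details": details,
    }
    with open("experiment_events.log", "a") as f:
        f.write(json.dumps(record) + "\n")

# The kind of fine-grained events that can reveal a popup being closed
# before the video is ever played:
log_event(42, "popup_shown", experiment="intro_video", mode="post_signup_popup")
log_event(42, "popup_closed", seconds_open=1.8, video_started=False)
```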

Christian Stautz

Growth Hacker at SimpleSite