Leading Indicators: How we learned what to do today to prevent tomorrow's churn (and closed a Series B round)

When a customer leaves you

You have probably seen this scene before; perhaps you have even played a role in it. A customer cancels a subscription, the gym or the language classes, something tied to a goal he had wanted to achieve for some time. The clerk asks why he is leaving and whether there is anything that could be done to change his mind.

The customer answers that everything was all right and the service has been great; he's leaving just because money is tight. Or because he hasn't had enough time to show up lately. He may even say he'll come back in the future. The classic excuses for a breakup.

“It’s not you, it’s me”

I've seen this scene over and over at RD, my company, and at other companies I've worked with. Sixty percent of the customers who churned lied about their reasons.

I don't know about you, but my day has 24 hours, just like everyone else's, and I choose how to spend them. Likewise, when cash flow is low, you either go bankrupt or choose how to spend the money you have left. Naturally, you keep the employees, tools, and services you care about the most.

So when a customer says such things, what he's actually saying is that he no longer wants to spend his money or time on you, and he is politely dodging your attempts to change his mind. That's why most churn surveys are poor and largely useless.

That's because the churn actually happened months earlier. The reasons behind it ran through the customer's mind many times, and by the time he cancels, he has finally made up his mind. At that point, something has shifted psychologically, and it will be very hard to persuade him otherwise.

Cohort analysis

The reasons for churning arise months before the actual cancellation, which makes it difficult for us, the vendors, to figure out what we screwed up and when. A cohort analysis of churn helps us evaluate, faster and more accurately, whether the changes we've been making to prevent churn are working. It also helps us identify the periods in which customers most commonly churn.

The chart may look complicated, but it is actually easy to read and very helpful. Each row represents a customer cohort: the group of customers you brought in during a given month. The columns represent the number of months since the initial sale. Each cell shows the percentage of customers from that cohort (row) who were still paying for the service in that month (column).
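To make the mechanics concrete, here is a minimal sketch of how such a table can be computed, assuming a pandas DataFrame with one row per customer and two datetime columns, `signup_date` and `cancel_date` (NaT while the customer is still active). The column names are illustrative, not RD's actual schema:

```python
import pandas as pd

def cohort_retention(customers: pd.DataFrame, horizon: int = 12) -> pd.DataFrame:
    """Rows = signup-month cohorts, columns = months since the initial
    sale, cells = % of the cohort still paying in that month."""
    df = customers.copy()
    df["cohort"] = df["signup_date"].dt.to_period("M")
    # Months survived; customers with no cancel date are still active today.
    end = df["cancel_date"].fillna(pd.Timestamp.today())
    df["months_active"] = (
        (end.dt.year - df["signup_date"].dt.year) * 12
        + (end.dt.month - df["signup_date"].dt.month)
    )
    table = {}
    for month in range(horizon + 1):
        # Retained in month m = still active at least m months after signup.
        table[month] = df.groupby("cohort")["months_active"].apply(
            lambda s: (s >= month).mean() * 100
        )
    return pd.DataFrame(table).round(1)
```

A real report would also blank out the cells of cohorts younger than the horizon, since those months haven't happened yet.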

You can discover different things depending on whether you analyze the columns or the rows. In this figure, for instance, if you look down the columns you can see a drop in the retention rate of every cohort in the sixth month. Something is happening there, and you should ask yourself why. Is that when the contract period ends? Is it the end of the implementation phase? What else could be happening in the sixth month?

In the same chart, looking across the rows, you can also see a drop in the retention rates of the cohorts that joined after June. Ask yourself what you did in June that made that happen.

Something similar happened to us at RD. A few months after we introduced a minimum contract period, we saw a drop in the retention rates of new customers. It was easy to spot because we were looking at the cohorts separately, not at a single blended churn rate: the drop affected only the cohorts that entered after the change. Easy to see, easy to fix.

Now, you may be thinking that cohorts are the key. But notice that it still took us four months to spot the problem, four months to realize that a project we had rolled out to prevent churn was actually delivering the opposite result: more churn. Four long months.

Cohort analysis is really helpful, but the problem with it is that you're still assessing the same metric: the churn rate. And we already know that the reasons behind churn arise long before the actual cancellation.

“The churn rate is a lagging indicator”

What you need are leading indicators: numbers that help you predict churn, numbers that point to the reasons that might lead a customer to churn in the future.

Leading Indicators

RD is the leading marketing automation platform in Latin America. We help our customers do inbound marketing: attracting visitors, converting them into leads, and nurturing those leads. It's the same industry where you'll find Marketo, HubSpot, or Eloqua. We had been growing 3x a year and had already raised two rounds of capital in Brazil when we decided it was time to play in the big leagues. So last year we started fundraising in Silicon Valley, only to find out that the best VC funds in the Bay Area are far less forgiving of churn rates than anyone else.

At that point, we realized we wouldn't be able to reduce the churn rate much within the fundraising time frame. So we invested heavily in creating a good set of leading indicators that could help us predict future churn. Those indicators had to show whether customers were extracting value from our software and were willing to stay, and that we were on the right path to reach our target.

Below, I've outlined the whole process of creating this leading indicators report. It was praised by most of the VC funds we talked to; several said it was the best customer success report they'd ever seen. The report was crucial in the fundraising negotiations, helping us close the investment deal with TPG last quarter.

Start with correlations

Getting back to the fact that churn is a lagging indicator and that you need leading ones, one question comes to mind: is it possible to predict churn? Yes, it is. It all starts with correlations.

You can look for any piece of data that correlates with churn. It's quite simple: gather all the data you can about your customers, then split the customers, along with their data, into two buckets: those who have already churned and those who have not. Next, compare the average of each metric across the two buckets. Where you see a significant difference, you've found a correlation.
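As a minimal sketch of that bucket comparison, assume a pandas DataFrame with one row per customer, a boolean `churned` flag, and one numeric column per metric (emails sent, logins, leads generated, and so on). The names are illustrative:

```python
import pandas as pd

def churn_correlations(customers: pd.DataFrame, metrics: list[str]) -> pd.DataFrame:
    """Compare the average of each metric between churned and retained
    customers; a large gap flags a candidate leading indicator.
    This is correlation only, not causation."""
    churned = customers[customers["churned"]]
    retained = customers[~customers["churned"]]
    rows = []
    for metric in metrics:
        avg_churned = churned[metric].mean()
        avg_retained = retained[metric].mean()
        rows.append({
            "metric": metric,
            "avg_churned": avg_churned,
            "avg_retained": avg_retained,
            # Ratio > 1 means retained customers do more of this.
            "ratio": avg_retained / avg_churned if avg_churned else float("inf"),
        })
    return pd.DataFrame(rows).sort_values("ratio", ascending=False)
```

Calling `churn_correlations(df, ["emails_sent", "logins", "leads_generated"])` returns the metrics ranked by how differently the two buckets behave.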

Software Usage

The most common data point that correlates with churn is usage of the product or service. Most of the time, a customer who is about to churn will stop using the product before he actually cancels. So if you monitor your customers' usage levels frequently, you can act right after a drop happens: that's when you contact the customer to prevent the cancellation. In other words, you try to get him back on track.
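A usage-drop alert can be as simple as comparing each customer's latest week against his own recent baseline. This sketch assumes a DataFrame of weekly login counts (`customer_id`, `week`, `logins`); the 50 percent threshold and four-week baseline are illustrative choices, not RD's actual values:

```python
import pandas as pd

def usage_drop_alerts(usage: pd.DataFrame, drop_threshold: float = 0.5) -> list:
    """Flag customers whose most recent week of usage fell below
    drop_threshold times their average over the previous four weeks."""
    at_risk = []
    for customer_id, history in usage.sort_values("week").groupby("customer_id"):
        logins = history["logins"].to_list()
        if len(logins) < 5:  # need four baseline weeks plus the current one
            continue
        baseline = sum(logins[-5:-1]) / 4
        if baseline > 0 and logins[-1] < drop_threshold * baseline:
            at_risk.append(customer_id)
    return at_risk
```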

Sticky Features

Usage is good, but not just any indiscriminate use will make customers stay. You can track the usage of each individual feature of the product, which is what we did at RD. By doing so, we noticed a huge difference in email marketing usage between customers who were staying and customers who were leaving. The difference was so large that it became obvious that customers who were not sending email campaigns were about to leave. Email marketing was a sticky feature, and we had to encourage customers to use it to prevent churn.

On the other hand, there were features that both groups used with the same frequency, our SEO Panel for instance. We realized that adding more or fewer keywords to it had no relationship to the probability of churn. So a customer could be using the software a lot, but if he was only using the SEO Panel, he could still churn, because the SEO Panel wasn't a sticky feature.
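One simple way to separate sticky features from neutral ones like the SEO Panel is to compare churn between users and non-users of each feature. This sketch assumes the same customer DataFrame as before, with one usage-count column per feature (column names are illustrative):

```python
import pandas as pd

def feature_stickiness(customers: pd.DataFrame, features: list[str]) -> pd.Series:
    """For each feature, churn rate among its users divided by churn
    rate among non-users. A ratio well below 1 marks a sticky feature;
    a ratio near 1 (the SEO Panel case) means the feature's usage
    says nothing about churn."""
    ratios = {}
    for feature in features:
        users = customers[customers[feature] > 0]
        non_users = customers[customers[feature] == 0]
        churn_non_users = non_users["churned"].mean()
        ratios[feature] = (
            users["churned"].mean() / churn_non_users
            if churn_non_users else float("nan")
        )
    return pd.Series(ratios).sort_values()
```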

Customers' KPIs

It took us some time to realize the importance of the lead management feature. When we looked at usage rates, we saw no relevant correlation with churn: customers who churned and customers who stayed were using it with the same frequency, which made us think it wasn't an important feature. It was only when we started analyzing the results customers were getting with that feature that we found out we had been misled.

Customers who stayed were generating 10x more leads than the ones who left. So we had to encourage lead generation and pay close attention to customers who were not generating enough leads.

Support Tickets

Something puzzled us when we checked how often customers were filing support tickets. We assumed that the customers who were leaving had had trouble with the product and complained about it. As it turned out, 70 percent of the customers who churned had never talked to anyone on the support team. Not a single word.

The leading indicator there was silence. That's right: silence is the most agonizing leading indicator. If customers aren't willing to talk to you, they are probably not using the service, or they simply don't care.


That was it. After all the analysis and correlations, we now had a whole set of concrete leading indicators, and we knew exactly what to do. I gathered the team, all the CSMs, and gave them a clear mission: attack the leading indicators. Get customers using the product, have them send more emails, help them generate leads, encourage support tickets, and check on anyone off track. We had no doubt the retention rates had bottomed out and would now climb. Except that utter failure struck us a few months later.

You need a focus: Customer Success

The lack of focus was certainly a problem: you cannot tackle everything at once and expect decent results. But there was a bigger flaw in the whole strategy around those leading indicators: the correlations themselves.

“Correlation does not imply causation”

It's obvious that a customer will stop using a service when he is about to cancel it. If he's dropping out, he is certainly not seeing the results he'd like. And since he has decided to go, he is definitely not willing to invest time working things out through the support channels.

The problem with those leading indicators was that they were too focused on the numbers themselves, on the company's problems rather than the customers' problems. Even though they predicted churn to some degree, they were still symptoms, not the disease. So we started all over again, this time from the reasons that were leading to churn.

Yes, I said most churn surveys are useless and that customers lie. But this time we did something different: we created a survey for the CSMs to answer. They'd point out what happened based on the customer's history and the negotiations that preceded the cancellation. (Another good way to survey churned customers is to send the survey two or three months after the cancellation and make it anonymous, so the customer feels confident that no one is trying to win him back.)

Those surveys showed us that 60 percent of churn was happening either because of a lack of adoption/implementation or because of mismatched expectations. In other words, 60 percent of churn was linked to the very beginning of the customer lifecycle: either customers were expecting something the product couldn't offer, or they had failed to implement it and extract a first value from it.

Yes, the other 40 percent of customers were leaving because the product had bugs, the competition was cheaper, the service had been poor, and so on. But each of those reasons represented too small a share, and they were very different from one another. We needed focus: focus on the one thing that would address 60 percent of churn. That focus was the initial value the customers were not seeing.

Onboarding Success

When you think about the initial phase of the customer lifecycle, you think about onboarding. It's funny that a lot of people in the customer success industry say you can't accurately measure the success of an onboarding project or its impact on the churn rate. I used to think that too. But it turns out the impact is actually easy to measure: all you need is a checklist.

First, understand the purpose of any onboarding phase: to remove future barriers (configuration) and to deliver a first value. Those two objectives must be broken down into a task list to be used in the implementation project. Make sure the completion of every task is represented by an action the customer performs in the software: either the task is the action itself, or the task enables the customer to perform some action.

Since you can measure any action taken in your software, you can verify when every item on a customer's checklist has been checked off. That means the onboarding phase was successfully completed.

Now you can assess the impact of an unsuccessful onboarding on the churn rate. Separate the customers into two buckets: those who fully completed their checklist and those who did not. Then compare the churn rates of the two buckets.
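Here is a minimal sketch of that comparison, assuming each customer row carries a `completed_tasks` collection of task names and a boolean `churned` flag. The checklist items are hypothetical examples, not RD's actual tasks:

```python
import pandas as pd

# Hypothetical onboarding tasks; each maps to an action in the software.
CHECKLIST = {
    "tracking_code_installed",
    "domain_configured",
    "first_landing_page_published",
    "first_email_campaign_sent",
}

def onboarding_churn_impact(customers: pd.DataFrame) -> pd.Series:
    """Churn rate for customers who checked off every onboarding task
    vs. those who did not."""
    done = customers["completed_tasks"].apply(lambda tasks: CHECKLIST <= set(tasks))
    return pd.Series({
        "completed_checklist": customers.loc[done, "churned"].mean(),
        "incomplete_checklist": customers.loc[~done, "churned"].mean(),
    })
```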

When we did that at RD, we observed that a successful onboarding reduced the chances of churn by one third. That was a huge impact. What made it even more critical was that only 60 percent of RD customers were successfully completing the onboarding phase. "Completing the checklist" became a really important leading indicator to pursue.

The funny thing was, the RD Services Team was doing a pretty good job on the implementation projects: more than 90 percent of onboarding phases were being completed successfully within the estimated time. We dug into the numbers and discovered that the "mandatory" onboarding package was being sold to fewer than 60 percent of customers. When we asked the Sales Team why, they said some customers already knew how to implement the software and didn't need to pay for implementation. That was certainly not a fact. So completing onboardings was not enough: the percentage of customers buying the implementation package became another leading indicator to pursue.

The Customer Journey

We were dealing with the problems customers faced in the initial phase of the lifecycle, so we thought we should go beyond the implementation phase and ask what the customer's second logical step was. Thinking about logical steps in the customer lifecycle means thinking about the customer journey. There are plenty of complex methods and models for designing customer journeys; we just used our own experience and our gut to define the second step: sending email marketing campaigns.

Yes, you can bet we chose that because email marketing was a sticky feature, as I wrote before. But there was a bigger reason. RD is a marketing automation platform; the customer has a whole world of possibilities for using our software. We could have decided that the best step after onboarding was content production. After all, almost every action in an inbound marketing strategy relies on the content strategy. But content planning is hard: it takes time and demands focus. When you're designing the steps of a customer journey, you must balance the value the customer has received so far against the effort you're asking him to put into the next step. A lot of our customers would freeze in front of the content planning task, postpone it as long as they could, and months later blame our software for the lack of results.

We decided on email campaigns because they were easy to do, so customers would realize value from another part of the software sooner. That would increase lock-in and raise the customers' commitment to trying harder tasks in the future.

With the second step of the customer journey defined, we drilled down into the email campaign indicators to understand how to make customers stick with it. The numbers showed that sending a bunch of email campaigns was not enough: frequency was essential. Customers sending at least one email campaign per month had a 0.3 percent month-over-month churn rate, while customers sending one campaign every five months had a huge 10 percent month-over-month churn rate.

Email quality was also a leading indicator. Customers with poor campaign results, with high spam complaint rates, bounced messages, and low open rates, were churning faster than customers sending good campaigns.

The frequency and the quality of email campaigns became two strong leading indicators to pursue.
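For the frequency side, a sketch like the one below buckets customers by campaigns sent per month of subscription and reports churn per bucket. It assumes a campaigns DataFrame (`customer_id`, `sent_at`) and a customers DataFrame (`customer_id`, `churned`, `months_subscribed` of at least one month); the bucket edges are illustrative, not RD's:

```python
import pandas as pd

def churn_by_campaign_frequency(campaigns: pd.DataFrame,
                                customers: pd.DataFrame) -> pd.Series:
    """Churn rate by email-campaign frequency bucket."""
    per_customer = campaigns.groupby("customer_id").size().rename("campaigns")
    df = customers.set_index("customer_id").join(per_customer)
    df["campaigns"] = df["campaigns"].fillna(0)
    df["per_month"] = df["campaigns"] / df["months_subscribed"]
    # "monthly+" = at least one campaign per month; "rarely" = fewer
    # than one every five months, the high-churn group described above.
    buckets = pd.cut(df["per_month"],
                     bins=[0, 0.2, 1, float("inf")],
                     labels=["rarely", "occasionally", "monthly+"],
                     include_lowest=True)
    return df.groupby(buckets, observed=True)["churned"].mean()
```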

The Plan That Worked

Now we had leading indicators that were clearly tied to the customer's success rather than to the customer's churn. They were also all linked to a single focus: the initial phase of the customer lifecycle. We had the foundation for a structured plan to attack the problem and improve the leading indicators. And so we built one.

The best part is that the plan involved practically everyone in the company, not only the Customer Success team. Cross-functional customer success alignment is essential for any churn prevention plan: every team must know how its work affects customers' success and what it can do to increase it.

Our plan set leading indicators for almost every team and linked them to the teams' goals and OKRs.

The Sales Team had to sell paid implementation packages to 90 percent or more of new accounts. The Services Team had to complete 90 percent of implementation projects successfully within the estimated time. The CSM Team had to increase the number of customers sending good email campaigns. The Product Team had to reduce the number of steps and the time spent on software configuration and on email campaigns. The Engineering Team had to improve email campaign deliverability.

Although the plan seemed solid, the CSMs were really concerned. They asked me how they were supposed to cut the churn rate in half when it had been rising consistently for months. They weren't confident about what to expect. Here is what I told them at the time: give me the leading indicators; they matter more than the churn rate itself. Missing the churn goals again would be bad, but it would be far worse not to be confident about the future, not to be confident that the customer base was getting healthier and customers were succeeding. And that's what they did: every team hit its leading indicator goals consistently, month after month. Achieving the churn rate goal got easier and easier, until churn had dropped to half of what it had been a few months before.

The lower churn rates increased the VC funds' confidence in us, but the excellent leading indicator results increased it even more. Those indicators showed that we would keep improving retention and that our SaaS business was solid. Exactly what they needed to close a 20-million-dollar investment round.


A similar plan can be built at any kind of company. The fundamental logic for it to work is this: stop firefighting churn and start working on the leading indicators of your customers' success.


You can check out my blog manifesto here. I'll share more experiences like this one, so stay tuned. Follow me and the publication below.