Why You Should Spend Time on Analytics for App Building

This year, Truecaller had the opportunity to open DroidCon Stockholm. Our Android software engineers Sergio Cucinella and Dmitry Avchukhov held the keynote presentation about how we do product analytics at Truecaller.

The presentation covered why we started performing analytics. We presented a case study on how we improved the usage of our block screen with data-driven development, what we did first in this journey, the metrics we used and continue using, and finally our learnings.

Nowadays, everyone involved in the development of a product wants to know the current state of the project and the impact that daily changes have on the product itself. But it is not obvious which metrics can be used to fully understand these things. As a matter of fact, it is quite hard to determine a minimal, universal set of metrics that anyone could use to answer their questions.

To show even more clearly why analytics can be handy in our products, let us consider the following example:
An application is published on the store without any analytics in place. After a major update, it experiences a growth in session length, as depicted in the graph.
Without analytics, it is impossible to know whether the growth is happening because users are blocked somewhere, because they do not know what to do or how to use the app, or because they are fully enjoying the product and freely browsing its content.

Case study
One of our key metrics is the number of calls blocked. When our users block numbers, they not only make their own lives easier but also contribute to our data sets, improve the quality of our services, and help other people fight spam. Our goal is to keep the number of blocked numbers high and growing.

Since we do not want to make this decision for our users, we provide a dedicated settings screen where they can fine-tune their blocking preferences.

We observed that the number of users who changed their blocking settings was quite low: only 25 users of the selected group reached the block screen. Furthermore, only 31% of users reached the block screen after a successful search in the app.

After a short investigation, the following facts emerged:

  • blocking is disabled by default
  • it is not easy to find the block screen and turn blocking on
  • there are no reminders or calls to action, so users are probably not even aware that spam blocking exists or of the possible dangers.

We spent some time investigating the issue and came up with a few good solutions. We decided to implement the smallest change: adding a red dot (badge) on the block tab. The badge is active when the user is not protected from spam and scam calls.
It took us about 15-20 minutes to implement and test this feature, and we expected our metrics to improve immediately after release.
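The core of the change really is that small. A minimal sketch of the badge logic, with hypothetical names (the talk does not show Truecaller's actual code), could look like this:

```java
// Hypothetical sketch: the red badge on the Block tab is shown
// exactly when spam blocking is disabled in the user's settings.
public class BlockBadge {
    // Returns true when the warning badge should be visible.
    public static boolean shouldShow(boolean blockingEnabled) {
        return !blockingEnabled;
    }
}
```

In a real app, the boolean would come from the settings storage and the badge would be drawn on the tab icon; the decision itself stays this simple.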

But we could not have expected the impact to be so high. The introduction of the small red badge doubled the number of users reaching the block screen, and the conversion from users performing a search to users opening the block screen increased by 30%.

As we said before, it took a journey of some months to bring us to where we stand now. As a first step, we added some basic app usage tracking. After that, we introduced tracking of events related to the main app features; by doing so, we built our first conversion funnel. We kept adding more events incrementally and so created our tag plan (the list of all events tracked throughout the whole app).
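To make the funnel idea concrete, here is an illustrative computation: given each user's ordered event stream and the funnel's step names, count how many users reached each step in order. The event names are made up for the example, not the real tag plan:

```java
// Illustrative conversion-funnel computation. A user "reaches" a step
// only after having reached all previous steps, in order.
public class Funnel {
    public static int[] counts(String[][] userEventStreams, String[] steps) {
        int[] reached = new int[steps.length];
        for (String[] events : userEventStreams) {
            int next = 0;
            for (String e : events) {
                if (next < steps.length && e.equals(steps[next])) next++;
            }
            // A user who reached step `next` also reached all earlier ones.
            for (int i = 0; i < next; i++) reached[i]++;
        }
        return reached;
    }
}
```

With two users, one who searched and opened the block screen and one who only searched, the funnel ["search", "open_block_screen"] yields the counts [2, 1], i.e. a 50% conversion between the two steps.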

The next step was to decide where to send all our tracked data: our own solution or a 3rd party one. The choice is clearly very subjective, because it depends on the required capabilities. Some 3rd party solutions provide tools for supporting marketing campaigns, tracking user acquisition campaigns, or A/B testing; some of them are free, while others charge per data point (the number of events sent). We decided to send business-sensitive data to our own infrastructure and to run a trial period with both Localytics and Mixpanel.
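One way to implement such a split is a small router in front of the tracking SDKs. The interfaces and event names below are hypothetical, a sketch of the pattern rather than our actual integration:

```java
import java.util.Set;

// Sketch: business-sensitive events go to an in-house backend,
// everything else to a third-party analytics service.
public class EventRouter {
    public interface Sink { void send(String event); }

    private final Sink inHouse;
    private final Sink thirdParty;
    private final Set<String> sensitiveEvents;

    public EventRouter(Sink inHouse, Sink thirdParty, Set<String> sensitiveEvents) {
        this.inHouse = inHouse;
        this.thirdParty = thirdParty;
        this.sensitiveEvents = sensitiveEvents;
    }

    public void track(String event) {
        if (sensitiveEvents.contains(event)) inHouse.send(event);
        else thirdParty.send(event);
    }
}
```

Keeping the routing decision in one place makes it easy to swap providers after the trial period without touching the call sites that emit events.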

Once we had everything in place, we started creating and distributing the first builds with analytics integration internally. This helped us collect the initial data and verify its validity and reliability.

Finally, we published an update with the analytics integration to our users and started analyzing the data and stats on the dashboards. The dashboards are extremely helpful for monitoring trends and spotting possible bottlenecks and problems. We periodically review all dashboards and decide whether to introduce new tags to better interpret how our features are used.

Let’s Talk Metrics
Let’s talk about the metrics that we used and keep using, and how they help us build a great user experience and improve our product.

We use 3 main groups of metrics:

  • growth metrics
  • product metrics
  • quality metrics

Growth metrics
Growth metrics are the top-level metrics for our products. They show the inflow of new users. We track downloads (from different stores and preinstalls) and the number of active installs.

Product metrics
We use product metrics to determine the current state of our product, find directions for product improvements, and evaluate recent changes. Besides common metrics like activation rate and retention, we have metrics tied to our specific features: the number of searches, the number of calls blocked, and the hit rate (the percentage of successful searches).
We separate product-specific metrics from business-specific metrics; the two subgroups are stored in different places and used differently.
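The hit rate mentioned above is a simple ratio; a small sketch (the method name is illustrative) with a guard against the no-searches case:

```java
// Hit rate: the share of searches that returned a successful result.
public class HitRate {
    public static double of(int successfulSearches, int totalSearches) {
        // Before any search happens, define the hit rate as zero
        // rather than dividing by zero.
        if (totalSearches == 0) return 0.0;
        return (double) successfulSearches / totalSearches;
    }
}
```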

Quality metrics
Quality metrics are how we monitor the overall quality of our applications and our users' happiness. We monitor store rankings and reviews and crash/error rates, and we run surveys to find and fix major problems based on early feedback from our users. Our community and ambassadors help us keep application quality high.

Our conclusion is that data-driven development is worth adopting because it provides a lot of tools for pushing your product further. Believing that you can get the right metrics and information from the beginning is wrong: a few iterations and tweaks are required. Generally speaking, it is a never-ending process that involves periodic review.