Analytics is not a Debugger

Today's subject should be close to your heart if you're a developer, at least if you're interested in user interfaces and how people use what you build. I will frustrate you, though: dip you into cold water and laugh.

I might pull you out again and sit you down in front of a fireplace with a cup of tea or a beer, later. We'll see.


Let’s look at an example:

Your company is making this versatile tool. It caters to new users through a simple UI, but there are also power-user features, and there are obviously always several ways to do a given task. It's not a simple tool, but not a big one, either.

People like it. You have had good success selling it.

For the future of your product, resources have to be assigned, new features built and some amended or even removed.

How do we decide? We should find out what people use and what they ignore. Build up the former and sunset the latter.

Sounds like a job for analytics!


Nope, it isn’t. This is not what Analytics does.

Let me qualify that.

I used to develop for PalmOS years and years ago. When I came across a bug, I would load gdb and then have access to the complete state of my code and the Palm. I could poke around until I found what was wrong.

You're a developer now; things have surely changed a lot, but I'm sure you still debug, and you're still used to having access to the complete state of your environment and application, even if your tools are no longer as archaic as my gdb was.

That means that your way of looking at issues is fundamentally different from what a marketer would do with Analytics. Web analytics tools are not like debuggers. Not at all!

Specifically: you do not have the full state, not even close.

Too Much

You have abstract data, that’s it.

You even have a lot of it, enough that it is sometimes difficult to find real insight. Funnily enough, the fundamental problem in digital marketing is not access to data. Usually it is the opposite: too much of it, along with the challenge of finding what matters and extracting signal from the noise.

You also do not have automation, in the sense that there is no mechanism that goes through unstructured activity records and derives insight or builds clusters of interesting information. That’s just not what web analytics tools do right now.

So your idea of just tracking every move of the mouse, every click and every key press? Of using all that data later to find patterns of usage? It is unfortunately not going to work.

I will go one step further and claim that you wouldn’t be able to analyse that sort of data anyway. There is too much of it!

So we aggregate.

In digital analytics, we look at pretty high-level metrics most of the time. Sometimes, we want to look at a specific issue, and we might then dig deeper somewhere. But only temporarily, and only in that specific area.
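To make the aggregation idea concrete, here is a minimal sketch in plain Python (not any real analytics tool; the event names and records are made up): raw activity records go in, only high-level counts come out.

```python
from collections import Counter

# Hypothetical raw activity records. A real analytics tool would not
# keep this level of detail around for long.
raw_events = [
    {"visitor": "a", "action": "view_product"},
    {"visitor": "a", "action": "add_to_cart"},
    {"visitor": "b", "action": "view_product"},
    {"visitor": "a", "action": "order"},
    {"visitor": "b", "action": "view_product"},
]

# Aggregation throws the detail away and keeps only high-level metrics.
metrics = Counter(event["action"] for event in raw_events)

print(metrics["view_product"])  # 3
print(metrics["order"])         # 1
```

Once the counts are taken, the raw records can be dropped; the metric is all the analyst works with.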

We do not hoard data. We can’t afford to. Remember what I said about trust and simplicity?

Design vs life

You will also learn a lesson that your friendly marketer has learned a long time ago while analysing the web site: no matter how you plan the site or app, users will use it differently.

I once found myself in a situation where an analyst at a bank, looking at the numbers for a credit card funnel, was wondering why the data showed some 2% of visits starting on step 2 of the funnel. He asked the designer of the funnel, who swore that that was totally impossible. But visitors managed it anyway, likely via bookmarks, shared links or the back button, because they manage to do absolutely everything.

I’d wager that the flow your UX people thought of is not the only way for users to get to the goals. Users do not move in straight lines. They are much more like tired toddlers in a toy store.

And since we don’t know exactly how they will achieve their goal, we start by just measuring the goal, not the path that leads to it. Again: abstraction saves us from data overload.


Examples of high-level measurements we deal with are: number of orders, number of views on our products (probably not all variants of them), number of email sign-ups.

In order to analyse those measurements, we capture data that allows us to segment, such as product category, visitor type (new, repeat, regular), time of day, maybe visitor value (how much have they spent so far), or even visitor behaviour (searcher versus browser, or something along those lines).

Using those measurements (“metrics”) and segments, we can try to understand our users, see what kind of user they are (“regular visitor with above average order value”). This helps us develop the business, and it allows us to cater to the segments more efficiently.
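As a sketch of how metrics and segments combine (plain Python again, with made-up data): one outcome metric, broken down by one segmentation attribute.

```python
# Hypothetical order records, each carrying a segmentation attribute.
orders = [
    {"visitor_type": "new",     "value": 40.0},
    {"visitor_type": "regular", "value": 120.0},
    {"visitor_type": "regular", "value": 80.0},
    {"visitor_type": "repeat",  "value": 60.0},
]

# Break one metric (order count, revenue) down by visitor type.
by_segment = {}
for order in orders:
    seg = by_segment.setdefault(order["visitor_type"],
                                {"orders": 0, "revenue": 0.0})
    seg["orders"] += 1
    seg["revenue"] += order["value"]

print(by_segment["regular"])  # {'orders': 2, 'revenue': 200.0}
```

Note that the individual orders are only needed to build the breakdown; the analysis itself happens on the segment totals.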

The aspect that is probably alien to you here is: we do not use any detailed data at all. All we care about is the outcome.

One of the reasons for that is that if you classify your users into too many groups, each group will be so small that any improvement for it will be meaningless overall. Sell 15% more to a group that is responsible for 2% of your revenue? Why would you do that? You want 5% more revenue from a group that does 75% of the revenue. Now that changes something.
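The arithmetic behind that argument, with a made-up total revenue figure:

```python
total_revenue = 1_000_000  # hypothetical yearly revenue

# 15% uplift on a segment responsible for 2% of revenue:
small_win = total_revenue * 0.02 * 0.15
# 5% uplift on a segment responsible for 75% of revenue:
large_win = total_revenue * 0.75 * 0.05

print(round(small_win))  # 3000
print(round(large_win))  # 37500
```

The "impressive" 15% uplift is worth an order of magnitude less than the modest 5% on the big segment.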


Right. I promised you a place by the fire with a nice brew or a beer.

So let me tell you what you can track.

First of all, those bigger-picture features can be tracked. Say you wrote a messaging app: you could absolutely use Success Events for the following features or activities: user opens editor, user writes more than 5 letters, user sends message, user opens received message to read it, user flags message, user hits reply on message.

These give you a good impression of which features are actually being used, which will probably answer a big part of your questions.

If you need more, you can focus on one of those activities and tag it more specifically. Say you wanted to analyse how people reply to a received message. You would probably want to segment on how they initiated the reply. Did they use the gesture? Did they hit a button? Did they swipe down far enough? Those are good questions, and you should use an eVar to make that distinction.
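Here is how that temporary, finer-grained cut might look, sketched in plain Python rather than the actual Adobe API, with made-up method names: each reply event carries its initiation method (the kind of value the eVar would hold), and we count per method.

```python
from collections import Counter

# Hypothetical reply events, each tagged with how the reply was
# initiated -- the kind of value you would put into the eVar.
replies = [
    {"event": "reply", "method": "gesture"},
    {"event": "reply", "method": "button"},
    {"event": "reply", "method": "button"},
    {"event": "reply", "method": "swipe"},
]

by_method = Counter(reply["method"] for reply in replies)

# Share of replies per initiation method.
total = sum(by_method.values())
shares = {method: count / total for method, count in by_method.items()}

print(shares["button"])  # 0.5
```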

But once you know, and have changed your app accordingly, you would remove that level of detail until you deal with it next time.

Gathering data for digital analytics is a fairly selective process, completely different from debugging.

I guess a real-life analogy would be this: your car is making some strange noise when you drive it. You can't see anything from the inside, so you ask a friend to watch you drive past her slowly while she looks and listens for anything wrong.

That's what we do. That's what our tools are built for. We are that friend. We're happy to help!


German expat living in Switzerland (formerly UK and France). Consultant and member of the Multi-Solutions Group at Adobe, working with the Digital Marketing Suite. Father of 4 girls.

Posted in Principles
2 comments on “Analytics is not a Debugger”
  1. Bijan says:

    Hi Jan,
    in this case I recommend to expand the meaning of debugging to usability-testing also 😉

