Let me throw out an idea here. Given that you are a developer, I hope you can a) appreciate the idea and b) help me develop it into something that works.
The idea is to use test-driven development for Analytics or maybe for digital marketing tagging in general.
I recently attended a conference in Berlin where we discussed “agile” in the context of analytics. The participants had a lot of ideas and were trying hard to grasp the meaning of “agile” in our part of the world.
Unlike the other discussion rounds I attended, this one was more like poking around in the dark, and I think we all felt it.
The truth is: we are so far away from being “agile” in analytics that no one really has a clue even where to start! Most people do not know what “agile” means in this context. And it must be difficult for a marketer to see why they should work “agile”, whereas for you, it is almost obvious by now.
So, could we pick just one aspect of “agile” and start with that?
Funnily enough, we sort of have already. I am not the only one who says that the implementation phase is NOT the key part. Consultants everywhere use the “crawl, walk, run” analogy to describe how an organisation should approach new stuff.
Starting with a small item (plan it, deploy it, learn, understand, use it) makes things easier for everyone and limits the risk.
And starting with something small is almost like working on a story, isn’t it?! Or at least it could be, which is why I think this might be the easiest way into the whole “agile” world.
So the idea would be to take it a step further and formalise the way new functionality is introduced into the analytics setup (or the overall digital marketing deployment setup).
Let’s use “I want to know the impact of my paid campaigns on newsletter signups” as a requirement and acceptance test case (if we did ATDD).
We would add that test to our analytics framework.
If we ran it, the test would obviously (hopefully!) fail. Good!
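To make this concrete, here is a minimal sketch of what that acceptance test could look like in code. Everything in it is invented for illustration: the helper `fetchSignupsByCampaignChannel` and the shape of the report are stand-ins for whatever a real framework or Reporting API wrapper would provide.

```javascript
// Before anything is implemented, the report knows nothing about
// campaign channels, so this stub returns an empty breakdown.
function fetchSignupsByCampaignChannel() {
  return {}; // no data yet -- the implementation doesn't exist
}

// Acceptance test for "I want to know the impact of my paid
// campaigns on newsletter signups": the report must break signups
// down into paid vs. other traffic.
function acceptanceTestPasses() {
  var report = fetchSignupsByCampaignChannel();
  return typeof report.paid === "number" && typeof report.other === "number";
}

console.log(acceptanceTestPasses()); // false -- red first, as expected
```

Seeing it fail first is the whole point: the red test is an executable statement of a requirement that is not met yet.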
We would then break that acceptance test down into two smaller tests: say, test 1 checks that the signup confirmation page actually sends the signup event, and test 2 checks that paid campaigns are tracked and classified as such. Both of these smaller tests will also fail at first, of course, but at this level, a developer (you!) can do something about it and provide the code to fix test 1. Your friendly marketer or an analytics administrator can fix test 2.
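As a sketch, test 1 and test 2 might look like the following. The beacon shape and the configuration object are hypothetical stand-ins, not a real vendor API; the point is only that one check lives in code territory and the other in configuration territory.

```javascript
// Test 1 (developer territory): the signup confirmation page must
// send a beacon with the signup success event set.
function test1_signupEventFires(beacon) {
  return beacon.events.split(",").indexOf("event1") !== -1;
}

// Test 2 (marketer/admin territory): the campaign setup must
// classify paid traffic, so the breakdown becomes possible.
function test2_paidChannelConfigured(config) {
  return Array.isArray(config.channels) &&
         config.channels.indexOf("paid") !== -1;
}

// Both fail against the current, untouched setup:
console.log(test1_signupEventFires({ events: "" }));        // false
console.log(test2_paidChannelConfigured({ channels: [] })); // false
```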
This way, the big, overall acceptance test will eventually pass.
Up to this point, nothing has helped you.
For this idea to truly make a difference, there must be automated unit tests, built into a framework. That’s a job for the vendors, of course.
But the moment this unit test framework exists, you will have two important advantages:
- no more releases that accidentally break something
- explicit documentation of requirements in the system
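A vendor-provided framework could be as small as a registry of named checks that runs on every release. All names below (`analyticsTest`, `runAll`) are invented to illustrate the idea, not a real API.

```javascript
// Hypothetical micro-framework: register analytics tests, run them
// all before (or right after) a release goes out.
var tests = [];

function analyticsTest(name, fn) {
  tests.push({ name: name, fn: fn });
}

function runAll() {
  var failures = [];
  tests.forEach(function (t) {
    try {
      t.fn();
    } catch (e) {
      failures.push(t.name + ": " + e.message);
    }
  });
  return failures;
}

// Each registered test doubles as executable documentation of a
// requirement, and as a regression guard for the next release.
analyticsTest("signup confirmation sends event1", function () {
  var beacon = { events: "event1" }; // stand-in for a captured beacon
  if (beacon.events.indexOf("event1") === -1) {
    throw new Error("event1 missing from signup beacon");
  }
});

console.log(runAll()); // [] -- an empty list means nothing broke
```

A non-empty result from `runAll` would block the release; the list of test names is, at the same time, the list of requirements the setup is supposed to fulfil.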
There are a lot of challenges and some great opportunities here. Off the top of my head, and to open a discussion:
- Instead of writing “Tech Specs” or “Solution Design References”, consultants could essentially sell acceptance tests or unit tests. They’d be documentation and goal specification in one.
- Tests can be extended as technology evolves. Think about the new Visitor ID Service. If you created a test before migrating over, you’d be a lot less stressed out about it, right? Plus all your existing tests would still be there, alerting you if the change broke anything.
- If you think about the processing of hits, you can see that there are multiple points that tests could hook into. The simpler tests could look at (or live in) the `doPlugins` method in the `s_code.js` file, whereas more involved scenarios would have tests that looked at the processed data, maybe using the Reporting API.
- Of course the backend would have to be tweaked to enable testing in real time. Maybe the new “Live Stream” feature could help with that.
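As an example of the simpler kind, a test at the `doPlugins` level could look like this sketch. The `s` object here is a minimal stand-in for the real tracking object in `s_code.js`; an actual framework would have to capture the genuine one, and the `cid` query parameter is just an assumed campaign-tracking convention.

```javascript
// Minimal stand-in for the Adobe Analytics tracking object.
var s = { campaign: "", pageURL: "" };

// A doPlugins as it might appear in s_code.js: pick up the campaign
// id from a "cid" query parameter on the landing page URL.
function s_doPlugins(s) {
  var match = /[?&]cid=([^&]+)/.exec(s.pageURL);
  s.campaign = match ? decodeURIComponent(match[1]) : "";
}

// The test: given a landing URL with a cid parameter, doPlugins
// must populate s.campaign.
s.pageURL = "https://example.com/?cid=email%3Aspring";
s_doPlugins(s);
console.log(s.campaign); // "email:spring"
```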
All of this is useful and makes sense from the point of view of a developer.
The one big thing that the marketer gets out of it is the knowledge that everything will be fine because the developers know what to do and they won’t break anything.
Eventually the marketer should also realise that by working with small, contained stories, she gets things more quickly and she will know up front what she’ll get.
I think there are a lot of unknowns. Apart from the changes to the backend to enable all of this, there are surely some conceptual issues I have overlooked, or maybe the whole thing is utter nonsense. But I think there is something to it, and I’d love to hear your opinion and suggestions!
9 thoughts on “TDD for Analytics, please”
Hi Jan! Thank you for the article. Very good and important points. Makes a lot of sense to me as a developer. I wonder, have you seen anything like this implemented in the wild?
I haven’t seen anything like this in the wild, although there are some tools (e.g. ObservePoint) which help with testing.
None of them do anything close to TDD, and no tool out there comes with the necessary framework to implement TDD, as far as I know.
Hope that’ll change one day…