Some time ago I wrote an article in which I dreamt of having some sort of test-driven development capability in online marketing tools. At the time, I said the vendors had to chime in, building hooks, frameworks or APIs into their systems so users and consultants would be able to be test-driven.
I had discussions with a colleague in product management about it, but before we could get anywhere, that colleague left Adobe, so I forgot about the whole thing, reluctantly.
Fast-forward a year and we now have a tool in the Marketing Cloud that might help revive a scaled-down version: Dynamic Tag Management (DTM).
DTM adds flexibility to the deployment, allowing marketing teams to at least partly relieve IT and development.
Somehow, the combination of flexibility on the input side paired with real-time access to data on the receiving end makes me think there should be something we can do.
The flexibility also means that marketing is less restricted by IT / dev and can do things independently. That in turn means testing is now more necessary than ever before!
I think this is one area where our profession is way behind everyone else. And I think it hurts us. So I decided to think out loud; maybe someone will chime in and we can develop something together.
With that out of the way, and given that you’re still reading, I can presume you’re brave. Or bored.
Let’s see what we need for TDD and what we might have. The following list is from the Test Driven Development FAQ on pantras. I have only taken the simple things, let’s start small.
- Unit — this is the “thing” we want to test. It is probably not a piece of code but rather a part of our setup. “Am I actually tracking my campaigns?” might be a “unit” in our case.
- Test case — the definition of a test. An example might be “the page URL contains a URL parameter cid=’test’, assert that ‘test’ shows up in s.campaign”.
- Fixture — the fixture is a defined environment and state in which the test can be run. See below.
- Assertion — the assertion is where you tell the test framework what you expect the outcome to be. For my test case above, it would be “assert that s.campaign contains ‘test’”, or even “assert that we can see one instance of ‘test’ in our Tracking Codes report” (see the sketch after this list).
- setUp & tearDown — two methods that create the fixture before the test and dispose of it when the test is done. The setUp() method creates everything that is needed for the test, a defined state. In TDD, the fixture should be as simple as possible. There is a risk that a complex fixture itself might not be correct, which would render the test useless.
- Refactoring — the TDD mantra is “red – green – refactor”. If you want to add functionality to your system, you first write a test, which will likely fail. Then you do whatever is the simplest way of making your system pass the test (yes, often that means hard-coded values! Putting them straight into the s_code.js file totally counts!). Once it passes, you refactor. Refactoring means removing programming sins, like hard-coding. You know that your refactoring is good as long as the test stays green.
- Mock object — mock objects simulate parts of the system in the tests, usually parts that are “expensive”, such as DB transactions or disk I/O. In our case, mock objects could replace visitors or 3rd-party processes such as emails being sent.
- Red bar & Green bar — “the bar” tells you whether all tests were successful (green) or not (red). It is a visual representation of your system’s behaviour in the test. It would be great to have something like this!
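To make the test case and assertion above a little more tangible, here is a minimal CasperJS sketch. Everything in it is an assumption on my part: the page URL is a placeholder, the page is presumed to expose the tracking object as window.s, and something (a plugin or a DTM rule) is presumed to copy the cid query parameter into s.campaign on load.

```javascript
// Sketch: does cid=test end up in s.campaign?
// Assumptions: AppMeasurement is on the page as window.s, and the cid
// parameter is mapped into s.campaign client-side before page bottom.
var EXAMPLE_URL = 'http://www.example.com/?cid=test'; // hypothetical page

casper.test.begin('cid query parameter ends up in s.campaign', 1, function (test) {
    casper.start(EXAMPLE_URL);

    casper.then(function () {
        // Read s.campaign inside the page context
        var campaign = this.evaluate(function () {
            return window.s && window.s.campaign;
        });
        test.assertEquals(campaign, 'test', 's.campaign is set to "test"');
    });

    casper.run(function () {
        test.done();
    });
});
```

You would run something like this with `casperjs test campaign-test.js` and watch for the green line in the output; that is about as close to a “green bar” as we get today.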
I won’t claim that we can do actual TDD. Our “Units” are far too abstract. There are too many black-box components involved, and you have no control over most of the technology involved.
Of the 5 rules of thumb in the above FAQ for when a test is not a unit test, we will violate at least 4.
But I maintain that some testing is better than no testing at all.
While I have been writing, editing, re-writing, and thinking about this article, Craig Scribner published an ebook on auditing Adobe Analytics which covers some of the pain points I see. I am very happy that I am not the only one thinking about the issue of data quality and specifically “quality rot” in Analytics.
But what Craig’s ebook shows is that there is a need here. And there are people around the world who want this.
My current plan: I think the vision is pretty big in terms of moving parts. It might be too big for any one of us to tackle. The best answer to something like that: de-scope, radically.
So that’s what I’m currently doing. Thinking about where TDD can easily be done, and where it would have the biggest impact.
I absolutely love Craig’s idea of logging an audit score in Analytics, because — as he points out — it reinforces the point that the data is valid. It’s steps like this that will help us get there, I think.
Goals & Stakeholders
I discussed testing with two colleagues tonight, both working as consultants or architects on AEM, the Adobe Experience Manager. Their primary concern is: how do you serve content fast and efficiently? They are very familiar with testing, obviously. They use Selenium on the specific project we discussed.
But what I realised tonight has nothing to do with tools and everything to do with stakeholders, goals, and needs.
Which is great, because it helps me deconstruct and de-scope my problem further!
The epiphany Craig Scribner described in his post on Direct Call Rules is a really good description of this problem.
I still think the actual issue is not the tool, but the fact that we are so decoupled, and I think we can fix that. Because both CMS people and measurement people can easily break the code, both need a means of knowing when they do.
CMS people and measurement people have very different needs, in fact, but both want to know immediately if any change they made had a negative impact on what others are doing.
I feel that decoupled is good, as long as there is a “contract” that both sides adhere to. Tests can enforce that contract.
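What could such a “contract” look like in practice? Here is one possible sketch, nothing more: a shared spec that lists, per page template, which data layer properties and events must be present, plus a small check both sides can run. All the names in it (pageContract, the templates, the data layer paths) are hypothetical.

```javascript
// Sketch of a shared "contract" between CMS and measurement teams.
// Per page template: which data layer properties and events must exist.
var pageContract = {
    'product-detail': {
        dataLayer: ['page.name', 'product.id', 'product.price'],
        events: ['add-to-cart']
    },
    'search-results': {
        dataLayer: ['page.name', 'search.term', 'search.resultCount'],
        events: []
    }
};

// Return the list of required data layer properties that are missing.
function validateDataLayer(template, dataLayer) {
    var missing = [];
    pageContract[template].dataLayer.forEach(function (path) {
        var value = path.split('.').reduce(function (obj, key) {
            return obj && obj[key];
        }, dataLayer);
        if (value === undefined) {
            missing.push(path);
        }
    });
    return missing; // empty array means the contract is fulfilled
}
```

The point is less the code and more the idea that the contract lives in one shared place, so a CMS change that breaks it and a tracking change that breaks it both turn the same test red.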
Depth of Tests
The thing is: IT and CMS people do not care what they have broken on my end. All they want is to know if they have.
That level of test is a lot easier than a full end-to-end test (which would help me, of course)!
So I think if we want to keep our IT / development happy, we can probably relatively easily provide test cases to them.
What would those cases have to test for?
I can think of two things off the top of my head:
- availability of data — do I have everything I need on a given page available when the page loads?
- events — do events that I listen to on a given page actually fire?
The thing is: I would love to test a lot more than those two things, but for IT / dev, they cover pretty much anything they can break and are therefore all we need to test for them!
At this point, and I don’t want to reveal too much too early, I use CasperJS to run tests that check all Data Elements on a page. I am essentially halfway there!
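To give you a rough idea of the shape of those tests (this is a sketch, not my actual setup): it assumes DTM is deployed on the page, so window._satellite and _satellite.getVar() exist, and it keeps the list of Data Element names by hand in the test file. The page URL and the names are placeholders.

```javascript
// Sketch: do all Data Elements resolve to a value on page load?
var PAGE_URL = 'http://www.example.com/';                       // hypothetical page
var DATA_ELEMENTS = ['page name', 'page type', 'login status']; // hypothetical names

casper.test.begin('All Data Elements resolve on page load', DATA_ELEMENTS.length, function (test) {
    casper.start(PAGE_URL);

    casper.then(function () {
        var self = this;
        DATA_ELEMENTS.forEach(function (name) {
            // Ask DTM for the Data Element inside the page context
            var value = self.evaluate(function (n) {
                return window._satellite ? window._satellite.getVar(n) : undefined;
            }, name);
            test.assert(value !== undefined && value !== null && value !== '',
                'Data Element "' + name + '" has a value');
        });
    });

    casper.run(function () {
        test.done();
    });
});
```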
For IT / dev, those tests should be part of the build process or the continuous integration, which, with CasperJS, they currently are not. I therefore have to get into Selenium. Yet another tool. *sigh*
For myself and your friendly marketer, I prefer a standalone tool. I want to be able to push a button at any moment and see a green bar, independent of anything else, just the way Marketing likes it.
I guess the Data Element tests will have to be built both into such a standalone tool and into Selenium. And then the tool has to do more in-depth tests that check what is actually sent into Analytics, or even what comes back out of it.
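For the “what is actually sent” part, one possible approach is to intercept outgoing requests and inspect the Analytics beacon. Again a sketch with assumptions of mine: the collection URL contains the usual “/b/ss/” path, the campaign goes out as the v0 parameter, and the page URL is a placeholder.

```javascript
// Sketch: did the page view beacon actually carry the campaign code?
var PAGE_URL = 'http://www.example.com/?cid=test'; // hypothetical page
var beacons = [];

// Collect every request that looks like an Adobe Analytics image request
casper.on('resource.requested', function (requestData) {
    if (requestData.url.indexOf('/b/ss/') !== -1) {
        beacons.push(requestData.url);
    }
});

casper.test.begin('Page view beacon carries the campaign code', 2, function (test) {
    casper.start(PAGE_URL);

    casper.then(function () {
        test.assert(beacons.length > 0, 'at least one Analytics beacon was sent');
        test.assert(/[?&]v0=test(&|$)/.test(beacons[0] || ''),
            'the first beacon contains v0=test (campaign)');
    });

    casper.run(function () {
        test.done();
    });
});
```

Checking what comes back out of Analytics (via the reporting API) would be the next step up in depth, and clearly a job for the standalone tool rather than the build pipeline.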
I just wanted to throw this out here, partly to give you an update, partly to solicit some response.
What do you think? Am I doing the right thing here? Is this a waste of time?
I know that I am not alone, at least. My friend Lukáš Čech is working on something similar.
My take: in web analytics, we have never done anything remotely akin to TDD, and the agile manifesto is for us what the Internet is for German politicians: “Neuland” (“uncharted territory”).
In essence, while you (being a developer) might know what agile means, your friendly marketer and all the agencies and consultants she works with decidedly don’t. I’m asking you to help us. Small steps. We need it.