You are a developer – you develop! Also, sometimes, you debug.
Our article today focuses on the latter: how to debug when you are implementing Adobe Analytics.
Let’s start off by finding out what exactly we can expect to debug.
Adobe Analytics has a lot of moving parts, some of them client-side, some “in the cloud”. Debugging the latter is very much an end-to-end test situation, while the former can be debugged “traditionally”.
In the article about variables, we explained how you send data into Adobe Analytics and that it eventually ends up being parameters of the URL in an HTTP GET request.
That makes it fairly easy to debug. You can use all sorts of tools to see what your browser actually sends:
- The Adobe Debugger,
- The developer tools your browser provides,
- Any browser plugin like Firebug,
- A proxy like Charles, Fiddler or Bloodhound, or
- Low-level network sniffers like Wireshark.
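Whichever tool you pick, the first step is the same: find the tracking request among everything else the page loads. Adobe Analytics image requests go to a path containing "/b/ss/" followed by your report suite ID, so that string makes a handy filter. The following sketch shows the idea as a small, testable function; the sample URLs (including the report suite ID "myrsid") are made up for illustration.

```javascript
// A tracking beacon's URL contains "/b/ss/<report suite ID>/...".
function isAnalyticsBeacon(url) {
  return url.indexOf("/b/ss/") !== -1;
}

// In the browser console you could apply the same filter to the
// resource timing buffer, e.g.:
//   performance.getEntriesByType("resource")
//     .map(function (e) { return e.name; })
//     .filter(isAnalyticsBeacon)
var requests = [
  "https://example.112.2o7.net/b/ss/myrsid/1/JS-1.4.3/s123?pageName=Home",
  "https://www.example.com/images/logo.png"
];
console.log(requests.filter(isAnalyticsBeacon).length); // 1
```

In practice you'd simply type "b/ss" into the filter box of your browser's network panel or your proxy, which does the same thing.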
The big plus of the Adobe Debugger is that it “decodes” the information it finds in the URL: you can clearly see, by name, which variables the page you're looking at has set.
If your browser has built-in tools or you are using a plugin like Firebug, you can look for the HTTP request and check the parameters. You won't see the variable names, though: s.prop1 will instead show up as c1. The documentation has a list of all symbols.
Apart from that, the actual data you see is exactly the same!
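To make the mapping concrete, here is a small sketch that decodes the query string of a tracking request. The beacon URL and its values are invented for illustration; the short parameter names follow the documented scheme (cN for props, vN for eVars).

```javascript
// A made-up Adobe Analytics image request:
var beacon = "https://example.112.2o7.net/b/ss/myrsid/1/JS-1.4.3/s123?" +
  "pageName=Home%20Page&c1=debugging&v1=fb_campaign";

var params = new URL(beacon).searchParams;
var decoded = {};
for (var pair of params) {
  var key = pair[0];
  var value = pair[1];
  // Translate the short query parameter names back into the
  // friendly variable names you used in your implementation.
  var match = key.match(/^([cv])(\d+)$/);
  var name = match
    ? (match[1] === "c" ? "prop" : "eVar") + match[2]
    : key;
  decoded[name] = value;
}

console.log(decoded);
// { pageName: "Home Page", prop1: "debugging", eVar1: "fb_campaign" }
```

This is essentially what the Adobe Debugger does for you, minus a lot of special-case handling for parameters like events and products.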
It is also exactly the same if you debug using a proxy like Charles or Fiddler. Proxies have advantages: they can show you a timeline of requests, not just what the current page sent out, which is especially useful when we get to link tracking.
They can usually save a file containing all HTTP traffic they grabbed. That means you can load it back and look at it later, in case you’re not sure about parts of it.
Also, proxies let you tweak the requests, the pages, or even the s_code.js file on the fly. (Look for a feature called “Map Local” in Charles!) Extremely useful for testing changes to the s_code.js file before putting them live. Don’t forget to switch it off after testing!
As we said above: there’s not a lot of debugging you can do server-side, mostly because the processing really is a black box. Also, there are a lot of settings that govern how the system processes the data once it has arrived, and some of those settings you will probably never touch when you implement. It’s a bit of a grey area. Or gray area, if you’re in the US.
Our suggestion: run controlled tests!
Here’s how it works:
Say you just made a change to your s_code.js file. From now on, your site is supposed to pick up the “fb_ref” parameter for links coming from Facebook and put the value into s.campaign.
- Think of a use case and define a step-by-step test.
- Determine what data you expect to see after the test (this is crucial!)
- Run through the test.
- (Bonus step: check the value in s.campaign using one of the tools mentioned above.)
- Log into Adobe Analytics and compare the data you see with what you expected.
- Make sure you give the system some time before you check the results (see Latency in SiteCatalyst 15 for details).
- It is imperative that you figure out the expected results before you run the test! I guarantee you that if you run the test first, you’ll be able to explain the results to yourself, no matter how wrong they are. It’s called “confirmation bias”. I even sometimes look at the data and my expected outcome and try to argue that my expectations were wrong. Don’t fall into that trap!
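For the fb_ref scenario above, the change to s_code.js might look something like the following sketch. This is an assumption about how your implementation is structured: the getQueryParam helper is written out here for illustration (your s_code.js may already ship a similar plugin), and the "fb:" prefix is an arbitrary choice.

```javascript
// Hypothetical helper -- many s_code.js files already include a
// getQueryParam-style plugin; use whatever your implementation provides.
function getQueryParam(name, url) {
  var query = url.split("?")[1] || "";
  var pairs = query.split("&");
  for (var i = 0; i < pairs.length; i++) {
    var pair = pairs[i].split("=");
    if (decodeURIComponent(pair[0]) === name) {
      return decodeURIComponent(pair[1] || "");
    }
  }
  return "";
}

// In the real s_code.js this logic would live in s_doPlugins(s) and
// read window.location.href; here we simulate a landing URL instead.
var s = {};
var landingUrl = "https://www.example.com/?fb_ref=spring_promo";

var fbRef = getQueryParam("fb_ref", landingUrl);
if (fbRef && !s.campaign) {
  // Only set s.campaign if nothing else has set it on this page view.
  s.campaign = "fb:" + fbRef;
}

console.log(s.campaign); // "fb:spring_promo"
```

With a change like this in place, step 4 of the test is simply: load the site with ?fb_ref=... in the URL and verify in your debugging tool that the campaign parameter (v0 in the request) carries the expected value.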
So there you have it, now go and develop something so you can use your new-found skills to debug it!