Load Times & DTM

[Screenshot]

If I say “BAADFX”, do you flinch?

You’re a developer, so I guess you should.

I’m talking about the results for your pages on webpagetest.org. Each letter is the result of a specific category of the test. A is better than F.

In the example above, images should be compressed (the D in position 4), and the caching is not good at all (the F in position 5). The site also doesn’t effectively use a CDN (the X in position 6), meaning all content is loaded from a single server or cluster rather than from a content delivery network, which potentially makes round-trip times (RTT) higher.

[Screenshot]
Results of a WebPagetest
There are other, similar test tools out there, notably Google PageSpeed Insights.

Tests like these matter because they are an indicator of the user experience on your site. Ultimately, if your site loads awfully slowly, visitors are likely to try their luck elsewhere. On top of that, Google doesn’t show slow sites at the top of the search results, even if they are the most relevant results.

So there is a lot of pressure to get your site to “AAAAA✓”.

Why is that relevant for this particular blog?

Because when your friendly marketer does her job right, she’ll need instruments on your site, and those instruments can potentially lower your test results. And because DTM (or whatever tag manager you’re using) often finds itself at the centre of the discussion.

What?

I’m going to concentrate on DTM, simply because I’m familiar with it. I’m also sure there are similar posts out there for other tag management systems.

So, what’s up with DTM and page load times?

Actually, what is the whole page load time thing about in the first place?

Ideally, a visitor should see your web page as soon as she hits the link to it or types in the URL. The faster she actually sees something, and the faster she can interact with it, the lower the risk that she’ll give up on you. Easy to understand. But why would it take long?

The biggest factors would currently be:

  • Number of resources to load
  • Size of resources to load
  • Amount and complexity of scripting on the page
  • Load order for resources

I put the number of resources first, because a) I believe HTTP round trips make more of a difference than file size, at least for scripts, HTML, and CSS, and b) DTM plays a role here.

Whenever you, your analyst, or your friendly marketer adds a script to DTM, the system stores it in an extra .js file, which it will only load if it is needed.

The naïve idea would be that when you use DTM, there is one JS file to load for DTM itself, plus maybe one per tool you deployed (one for Analytics, one for Target, and so on). In reality, on top of those scripts, you’ll see DTM load the smaller, rule-specific ones, too.
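
If you want to see those extra requests for yourself, the browser’s Resource Timing API can list them. A rough sketch to paste into the console after the page has loaded; the assets.adobedtm.com host name is an assumption here, so adjust it if you host the library somewhere else:

    // list all script resources loaded from the DTM host
    var dtmScripts = performance.getEntriesByType('resource').filter(function (entry) {
      return entry.initiatorType === 'script' &&
             entry.name.indexOf('assets.adobedtm.com') !== -1;
    });

    console.log(dtmScripts.length + ' script(s) loaded from the DTM host');
    dtmScripts.forEach(function (entry) {
      // transferSize can be 0 for cached or cross-origin responses without a
      // Timing-Allow-Origin header, so treat it as a best-effort number
      console.log(entry.name, Math.round(entry.duration) + ' ms', entry.transferSize + ' bytes');
    });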

[Screenshot]
PLR with scripts in DTM JS
Note that while “Non-Sequential Javascript” and “Sequential Javascript” scripts end up in their individual files, “Sequential HTML” does not! Neither do Custom condition code blocks.
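
For context, a custom condition in DTM is just a small block of Javascript that returns true when the rule should fire; it ends up inside the library rather than in a separate file. A made-up example (the path check is purely illustrative):

    // fire this rule on blog article pages only
    return document.location.pathname.indexOf('/blog/') === 0;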

The files that DTM generates are pretty small, and DTM only loads those that are actually needed on any given page. This is why those scripts end up in separate files in the first place.

As I wrote earlier: the size of those scripts is likely not a problem, but the number of GET requests to load them might be.

If you have a lot of those files, and if you want to optimise, you have two options:

  • Consolidate — change your rules so that you use fewer scripts overall. One way of doing that is to move scripts from different rules into one rule that is specifically built for that purpose. Within the script, you handle the decision whether or not to run a specific part of the consolidated code. Another aspect: if a rule has more than one script, merge them into a single script.
  • Caching — DTM lets you host the code on assets.adobedtm.com, but it also lets you download the libraries and host them yourself. If you do the latter, you have complete control over how you want to cache your scripts, i.e. how the expiry headers are set.

Consolidation is the harder approach. It takes planning, documentation, and a good deal of discipline.

You’ll likely end up with Page Load Rules that are more general than usual (say a rule that not only covers blog articles, but also the blog article category pages). In those PLRs, you’ll have one script, and inside that script, there will be some logic that decides whether specific parts of the code should actually execute on the current page.

You have effectively moved from using specific PLRs and conditions to generic PLRs with logic in Javascript.
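
As a sketch of what such a consolidated script could look like, assuming a pageType data element exists that you can read via _satellite.getVar() (the page type names are made up for illustration):

    // one consolidated script instead of several rule-specific ones
    var pageType = _satellite.getVar('pageType');

    switch (pageType) {
      case 'blog article':
      case 'blog category':
        // code that used to sit in the two separate blog rules
        break;
      case 'product detail':
        // code from the former product detail rule
        break;
      default:
        // nothing to do on other pages
        break;
    }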

Caching is easier to do from the perspective of someone who manages Rules, Data Elements, and code in DTM.

It has disadvantages, too:

  • You need a workflow to push or pull the DTM scripts onto your host
  • Developing with DTM becomes more complicated, since staging code is not immediately available on the live site, or not available at all (Charles can help here)

Caching is most effective if your visitors come to your site often. On the first visit, people still have to download everything, so if your site mainly attracts one-time visitors, don’t bother.

For companies that have stringent rules about Javascript on their site, library download is often the preferred option. If that’s the case for you, I’d say look at how you might be able to change the caching for the DTM scripts.
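
If you go down that route, a minimal sketch of what self-hosting with explicit cache headers could look like, assuming a Node/Express server and that the downloaded library files sit in a ./dtm folder (the one-week max-age is just an example):

    var express = require('express');
    var app = express();

    // serve the downloaded DTM library files with a long expiry
    app.use('/dtm', express.static('dtm', {
      maxAge: 7 * 24 * 60 * 60 * 1000, // one week, in milliseconds
      setHeaders: function (res) {
        // make the caching policy explicit so the test tools can see it
        res.setHeader('Cache-Control', 'public, max-age=604800');
      }
    }));

    app.listen(8080);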

Notes

It is important to remember that while we want a site that loads as fast as possible, your friendly marketer really needs a lot of the stuff she deploys to do her job. A fast site that cannot be optimised or personalised is somewhat of a gamble.

So while page load times are important, they’re not everything!

Most of the time, the different page load time measurement tools out there don’t even agree!

When I test my own private site with webpagetest.org, I get AAABBX, which is not too bad. But when I test it with Google PageSpeed Insights, I get 15/100 for mobile, and 18/100 for desktop, which is abysmal.

[Screenshot]
Google PageSpeed Insights
Seems like Google is a lot more stringent or, more likely, places its focus on other things.

Before you ask yourself “but which tool do I trust?”, may I suggest you take a step back?

Before you throw all your resources into this endeavour, keep in mind that you have your own test tool right in front of you — your browser.

If you load your page, how does it feel? Fast? Good, you’re done.

2 thoughts on “Load Times & DTM”

  1. How about the fact that Adobe says we need to keep the main satellite script in the head?

    This is a deal breaker for lots of devs where performance is very important. Also, the TTFB for adobedtm.com via Akamai is terrible. I usually delay my site by 1.5 s because of DTM.
