Self-hosting DTM Libraries

Are you an AEM architect or developer? Then you have probably heard of the DTM Cloud Service, right? You might also know that you can use that Cloud Service to automate what DTM calls “Library Download”.

If you are working with any other CMS, you can obviously also download the JavaScript that DTM generates, host it on your CMS, and tell your templates to load DTM from there.

There are good reasons to do so, some of which my colleague Pedro Monjo discussed a few weeks ago in his article on Enhancing the security of your DTM implementation.

I agree with Pedro on one point: whoever works in DTM to create or manage the setup cannot easily work in an environment where Library Download is being used — unless the staging libraries are downloaded automatically and very frequently, say every minute or so, which is so far unheard of.

Just like Pedro, I think a hybrid approach is better: use the standard DTM-hosted libraries on any dev or staging environment, then build a download workflow so that your production environment always serves whatever has last been published.
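
The hybrid approach can be sketched as a small helper that decides, per environment, where the DTM library is loaded from. This is a sketch under assumptions: the function name, hostnames, and paths below are placeholders I made up, not anything from DTM itself.

```javascript
// Hypothetical sketch of the hybrid approach.
// All hostnames and paths here are placeholder assumptions.
function dtmLibraryUrl(hostname) {
  if (hostname === "www.example.com") {
    // Production: serve the downloaded library from your own servers.
    return "/etc/clientlibs/dtm/satelliteLib.js";
  }
  // Dev/UAT/staging: load the staging library straight from DTM's hosting.
  return "//assets.adobedtm.com/YOURID/satelliteLib-HASH-staging.js";
}
```

Your templates would then emit a script tag pointing at whatever this returns for the current host, so only production depends on the download workflow.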

So far, so good. But let me add a couple of completely opinionated aspects.


(The rest of this post is a list. According to folklore, I should craft a nifty headline with the number of items in it, such as “xy opinions I have about how you should do your job”. But I digress…)

Ⅰ — Pedro’s method of changing the host names using scripts is utterly unsupported. If you do that, don’t tell your marketer. Let them call support in the (false) knowledge that their setup is completely ok. Also: I’ve been doing it, too.

Ⅱ — Pedro says all settings for report suite IDs (rsid) should point to dev or UAT, and the scripts should overwrite them. Let me add another option: create a Data Element called “Report Suite ID” of type Custom Code, with code that looks at the URL and the _satellite.settings.isStaging attribute, or whatever else is needed, then computes an rsid. Make a doPlugins method and use it to set the rsid. Much more flexible.
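
A minimal sketch of what that Data Element logic could look like. The function name, hostname patterns, and report suite IDs are assumptions of mine, not anything prescribed by DTM:

```javascript
// Hypothetical sketch: pick a report suite ID from the hostname and
// DTM's staging flag. Names and rsids are placeholder assumptions.
function computeReportSuiteId(hostname, isStaging) {
  // Staging libraries and non-production hosts report to the dev suite.
  if (isStaging || /\b(dev|uat|localhost)\b/.test(hostname)) {
    return "mycompanydev";
  }
  return "mycompanyprod";
}

// In the "Report Suite ID" Custom Code Data Element you would return
// something like:
//   return computeReportSuiteId(location.hostname, _satellite.settings.isStaging);
// and in doPlugins you would apply it via s.sa():
//   s.doPlugins = function (s) {
//     s.sa(_satellite.getVar("Report Suite ID"));
//   };
```

The point is that the decision lives in one place and can use any signal you like, rather than being hard-coded into tool settings that scripts must then overwrite.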

Ⅲ — Talking about report suites: if you have three environments (dev/UAT/prod), I suggest you have two report suites (dev & prod). But: if you use AEM, I suggest you track everything that happens on Author into additional, separate report suite(s)! What content authors do is not what your visitors do. Why track it into the same pot?

Ⅳ — Actually, if I had both an Author and a Publish environment, I’d love to know how they both work. Meaning: I’d treat Author just like I’d treat any site — find out how to measure my goals (what are my goals?), then track accordingly. That means the tracking on Author will potentially be completely different from that on Publish! I’d say we should make that distinction in DTM, too, by using a separate web property for Author!

Ⅴ — We’re now at a setup where you only need to host the Javascript for your public web site on your servers. All other sites (Author, Publish UAT, Publish dev) will only ever be used by your colleagues, and likely only ever from within your corporate network. So: only the public Publish instance needs Library Download, while all others can happily load DTM’s Javascript straight from DTM.

Ⅵ — Here’s a question for you: if you use Library Download, you also have a process that actually signs off on the JS libs before they go live, right? So who is doing that? And how? In the cases that I have seen so far, this was the weak link. Dev/Ops would simply refuse to take on the responsibility of signing off on code that has been written or generated by marketing. Keep in mind that without a (lean) process, Library Download can mean you don’t use DTM to its full potential.

2 thoughts on “Self-hosting DTM Libraries”

  1. I agree on using the Library Download for PROD and the hosted version for dev environments. That’s the process we are currently transitioning to. As the person using DTM on a continuous basis all day long, I want any change I make in DTM to be readily available on DEV as soon as I save a rule and refresh the page, instead of having to wait for the process of downloading the library again and then uploading to our servers. Our IT was choking on having that happen every time I saved a change, as they wanted at least 10-15 minutes between downloads/uploads. As a DTM developer that timeframe was unacceptable to my development process, so we were both fine with leaving the DEV environments using the Adobe-hosted version.

    Something IT did give me on all environments is the ability to use our CMS to set the DTM reference url on every environment individually, including the personal dev ports of our developers. This gives me the option to easily place a different DTM Property on a specific environment if I want to do a very different configuration or test, without affecting other dev environments being used by hundreds of other people. And I can make that change without IT involvement or build cycle. It also serves as an emergency fallback in case our internal download/upload process starts to fail for some reason, as we can quickly switch to the hosted version as we address the internal issue.


    1. You have very cooperative IT people, congratulations!

      I like that way a lot. Using different Web Properties makes your job possible and also makes it completely transparent for dev and IT, which can only be good.

