When the Space Transportation System (STS, vulgo “Space Shuttle”) was first planned and NASA campaigned for it, cost reduction was a — if not the — main driver.
The goal, as presented by NASA to Congress, was to provide a much less-expensive means of access to space that would be used by NASA, the Department of Defense, and other commercial and scientific users.
Sounds like a solid, goal-driven plan, doesn’t it?
When we implement Analytics (for the first time), we often go in with a similarly solid plan. The complexity of a Space Shuttle program is way beyond that of an Analytics project, but some of the things that turned STS into a much more expensive project happen in Analytics projects, too.
NASA should have been in a good position when they started the Shuttle program.
They were about to show that they could rise to a challenge (putting someone on the Moon), and at the time when they thought about the Shuttle (this was shortly before Apollo 11), I think they knew it would happen. When the STS program formally started in 1972, they had a bunch of Moon landings under their belt, and Skylab, a simple space station, was in the works.
They were, by far, the most experienced agency out there, way ahead of their Russian counterpart, who had spent all their energy on two competing heavy-lift launch systems, producing some spectacular failures.
If you think about what happens in some Analytics projects at the beginning, you can guess what happened: people wanted things.
Stakeholders vs. Focus
The Space Shuttle program, because it was planned for NASA, the DoD, and “other commercial and scientific users,” was buried under a long list of requirements, mission parameters, and other wishes from the different stakeholders.
I have seen that in so many Analytics projects: you make or use OKRs or a KPI framework to come up with a simple plan for Analytics, only to be bombarded from all sides with requirements that are dear to someone’s heart.
Don’t get me wrong: it is extremely hard to resist that! We can and must challenge every single one, but ultimately, we often fail and have to give in.
On a somewhat related note: this happens all the time, and that’s why I think pruning is so important.
For the Shuttle, requirements like being able to launch from Vandenberg AFB, or being able to make big orbit changes once in space, made the system more complex, and eventually much more expensive.
An analytics deployment will also get more expensive with all those “pet requests,” but the real issue with such an analytics project is that the more you pile on, the less useful it will be.
There is such a thing as not seeing the wood for the trees, especially with an application like analytics, and the simpler you make it, the easier people will be able to hop on board.
And the more people hop on board, the more use you get out of analytics.
None of this is news, I guess. Jim Gordon wrote about the Semantic Implementation, and I love how he put it:
In a semantic implementation, we should be asking: “Would this make sense to a junior analyst?”
I wrote about it, too, probably more than you would’ve liked me to. Simplicity and purposefulness are what make analytics great!
When I say that you should not use custom code in your tag manager, I am trying to nudge you to think about this not from a “sure, can do!” point of view, but more like “does it add anything to the usefulness or will it just be more complex?”
There is a similar technique that people use when they want to become better photographers: limit your options. Try to see what you can do with minimal equipment or tools.
Jason Thompson of 33 Sticks posted a small clip of the nature around his place, which reminded me of the times when we took our SLRs and only the 50mm lens, and we tried to find out how to take great pictures with just that combo.
In the end, focussing on one thing within hard limits helped you figure things out, and it made you a better photographer.
I am 100% sure that the same is true for analytics, and this, btw, is why people trust Google Analytics more than others out there: the standard GA implementation is so simple that no one can imagine it to be wrong.
Simplicity, people! It rocks, and it makes you beautiful and smell of roses.
I didn’t check, but I think it must be some time since I last wrote about simplicity, explicitly.
While it is a somewhat frustrating story, it’s good to know that even people like NASA can get it wrong.
Now go and do better. Thank you!
4 thoughts on “Analytics Project vs Space Shuttle”
I advocate for simplicity all the time, to often limited effect. I’ve found too many people equate complex with quality and depth. They sometimes make such complex measurement plans that it inevitably confuses the analysts and business people, and makes the code base mind-numbingly hard to understand and maintain.
Most business questions don’t need super advanced AI level complexity to get a reasonable answer. And besides, it’s unlikely your highly paid Data Scientists (if you even have them) are going to spend the time and effort to do deep dives into anything but the highest priority questions anyways. So you will be left with confused and overwhelmed Digital Analysts trying to make sense of overly complicated data, which often will be kinda unreliable anyways because the code producing it is overly complicated and subject to failure and degradation.
Often a recipe for disappointment and heartburn.
You’d think that as a profession, we’d learn, but I guess we’re still young.
Here’s hoping it’ll get better.