That Feeling
We wake up to that feeling. We’ve got unchecked web analytics reports. We’ve got some for our app as well, but those will have to wait until tomorrow. Plus, we’re not totally sure the app is tracking properly. We need those web reports for a meeting, but the segment definitions still don’t make sense and we’re not confident in any of the cross-device data. So, we decide to take a peek at our recently completed A/B tests and those currently in flight. Most are inconclusive. There’s one that seems promising, but it’s only a change in “call to action” text. And even that one is only winning on desktop, so we’re not sure what it fully means. There are probably some really useful insights in those experiments, but we’re not sure how to get at them. Besides, there’s no time. So, we move on to our recent usability studies, where we find that four people really like big, pretty photos of products and all six users like it when things are on sale. That information is not particularly surprising or helpful, but at least we can say we got user feedback. Slightly dejected by the lack of genuine insight, we go to the one place where we can always make a difference: our performance marketing dashboard. Good news: everything seems to be ticking ahead. Facebook is up .0001% and Google about the same. We’re slightly concerned that the attribution model is not reliable, so we just close our laptop and take a walk. We’ve got a massive headache.
What happened?
The “we” in that paragraph is nearly everyone who’s ever worked at a twenty-first-century company that sells things. The “we” are the marketers and designers and developers and product managers and researchers and analysts and accountants and executives and consultants. It’s everyone: fortunately, because we are all invested in and oriented around data in a way that we were not twenty-five years ago; and unfortunately, because the weight and stress caused by the deluge of data and tools and analysis and interpretation is beyond the capacity of human beings, much less one human being. Our obsession with “what happened” and “what might happen” has led to trillions of dollars of investment in software and skilled labor. The former may be a good thing, and the latter is probably a wonderful thing. The digital analytics revolution has bred innovation and new careers. And yet, it is probably the confounding bane of nearly every business I’ve encountered in the last decade: teams of people wondering “what tools should we buy,” “how do we use them,” and “what does it all mean?”
Many years ago, these questions were exciting. In the 1990s, when we were sorting our way through server log files and customer email lists, and SaaS was not yet a thing, Webtrends seemed like a godsend. Then came Google Analytics and Omniture, and we were off. A decade or so later, we learned that we did not need proxy servers or separate web pages to run A/B tests; we could use JavaScript and DOM manipulation. We graduated from Offermatica to the likes of Optimizely and Monetate and VWO. Soon after, we got heatmaps and app analytics and usability testing and dozens of other research-, analytics- and optimization-adjacent tools. Each of those moments felt genuinely exciting, and I would not be writing this article had it not been for those companies and their innovations.
What started two decades ago as nothing being done by no one was suddenly many jobs being performed by a half dozen solutions and teams of twenty or fifty or hundreds. Every tool and every person bumped into another tool or person. There was exponentially more information, but probably no greater clarity or insight. The more access we had to data, the harder everything began to feel. Data paralysis and dysfunction plagued organizations. The promise remained eternally out of reach of the reality, and the moment you got closer to it, a newer, more exciting tool pushed the goal line a hundred yards further away. The thing that felt so exciting in the late 1990s had become a massive pain by the late 2010s. That feeling we had when we woke up? There’s a name for it: “The Analytics Hangover.”
For the Love of Stats
Every week, my father sends me a text bemoaning the current state of advanced analytics in Major League Baseball. He reminds me of the time Warren Spahn threw a fourteen-inning complete-game shutout, or when Bob Gibson made three starts with only a day’s rest between them. He wonders why baseball can’t still be like that, and he attempts to convince me that analytics have ruined the game. To my credit, I never bite. It’s not because I don’t believe in the analytics, but because I secretly love how attached my father is to his version of baseball from the 1960s. As it happens, I am obsessed with analytics. I’ve attended the MIT Sloan Sports Analytics Conference as a wide-eyed fan. I spend hours every month playing with neutralized and adjusted baseball statistics in a feeble effort to compare the players of the 1980s to those of the 2000s and the 1920s. I love stats.
Further, and more apropos of digital analytics, I have always carried a natural bias towards the researchers and analysts on my team. I spend a disproportionate amount of time with them. I lean on them for decision support. I defend them from being overburdened with unnecessary work. My last start-up was bursting with researchers, analysts and optimization practitioners, and, to this day, I celebrate the work they do. Similarly, I don’t reflexively blame the proliferation of analytics software tools, which (a) are designed to be “services” rather than “solutions” and (b) are generally positive disruptions in our pursuit of knowledge and efficiency.
The Real Enemy
But SaaS never runs itself. And, in nearly every instance, the practitioners (researcher, analyst, designer, product manager, marketer, optimizer) are working within a business culture and system. All too often, business cultures value “The What” at the expense of “The Why.” That is where I am pointing my disapproving glare. I resist and resent the extent to which business systems and genetic predispositions reward confirmation bias and cognitive closure. This hard wiring, to which leaders and their teams are frequently beholden, is the enemy of research, analysis and the pursuit of learning.
To be clear, outcomes matter. Some would say that is all that matters. Employing facts and evidence to understand “what happened” with an experience, a product or a campaign is critical. You want to know whether something succeeded or failed, and you want to be able to celebrate accomplishments. No argument there. However, as I understand it, outcomes are wholly correlated with the size and type of obstacle being addressed and the effectiveness of the solution in overcoming it. Products, designs, services and experiments succeed to the extent that they remove obstacles and solve problems. Period. So, while the outcome (“The What”) is wonderful, it has less intrinsic value than an understanding of “The Why.” “The Why” helps explain motivation, desire and propensity. It enlivens our curiosity and tunes our hypotheses. When we understand why something succeeds or fails, there is an annuity that extends far beyond “The What.” Companies and teams that devote more time to “The Why” are simply more likely to achieve longer-term, more sustained, positive outcomes; their “Whats” look a lot better.
In business, as in life, there are many proclivities and traps that work against a deep and honest understanding of “The Why.” Previously, I mentioned the individual need for cognitive closure and the propensity towards confirmation bias. Those are real, and they are the most prevalent examples. But there are so many more. There is the Einstellung effect: the development of mechanized thinking around problem patterns that someone or something has encountered before. The Einstellung effect plagues most people and organizations I know and is a leading cause of failed designs and failed experiments. It is the condition that makes us want to use the same solution this time because it worked on something familiar or similar last time. It’s an efficiency hack that obscures change, innovation and motive (“The Why”).
The Traps
Beyond our human conditions, however, there are entrenched business practices that are practically ubiquitous but rarely effective. The most common failure I observe is the organization that values project and task completion over customer needs and insights. This is the company that is all about “The What” and rarely considers “The Why.” If those businesses would simply shift a meaningful percentage of resources and time back to problem understanding, taking the time to ask why something seems true or false, good or bad, their solutions would be faster and more effective. When the skills of identifying and telling the story of “The Why” are celebrated as much as banal reports and empty post-mortems are today, a company quickly shifts to a culture of curiosity and learning.
I could belabor this point for days (months). So, let me instead simply present ten all-too-common business behaviors that memorialize “The What” and devalue “The Why”:
- Reports with no questions, answers or actions attached
- Reports on short-term, retrospective performance, devoid of broader insight
- Reports “weaponized” to win debates rather than to understand the answer
- Small research samples used as CYA insurance or to confirm biases
- Experiments that test solutions without an articulation of the problem
- Experiments that test other people’s solutions based on other people’s outcomes
- Experiments that run far too long, purely in pursuit of statistical significance
- Experiments that run despite insufficient traffic to ever detect the effect being sought (see the sketch after this list)
- Organizational design that silos departments at the expense of shared problem solving
- Organizational design that employs researchers and analysts as “report help desks”
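A quick note on the last two experiment sins: whether a test has enough traffic is not a matter of opinion; it can be estimated before launch with a standard power calculation. Here is a minimal sketch in Python, using the textbook two-proportion approximation rather than any particular tool’s method (the function name and the baseline/lift figures are hypothetical):

```python
# A minimal sample-size sketch for an A/B test on conversion rate.
# Standard two-proportion power calculation; numbers below are hypothetical.
from statistics import NormalDist

def required_sample_per_variant(baseline_rate: float,
                                minimum_lift: float,
                                alpha: float = 0.05,
                                power: float = 0.80) -> int:
    """Visitors needed in EACH variant to detect an absolute lift of
    `minimum_lift` over `baseline_rate` with a two-sided test."""
    p1 = baseline_rate
    p2 = baseline_rate + minimum_lift
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # ~1.96 for alpha = 0.05
    z_power = NormalDist().inv_cdf(power)          # ~0.84 for 80% power
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * (2 * p_bar * (1 - p_bar)) ** 0.5
                 + z_power * (p1 * (1 - p1) + p2 * (1 - p2)) ** 0.5) ** 2
    return int(numerator / (p2 - p1) ** 2) + 1

# A 3% baseline conversion rate and a hoped-for 0.3% absolute lift:
print(required_sample_per_variant(0.03, 0.003))  # roughly 53,000 per variant
```

With a 3% baseline and a hoped-for 0.3% absolute lift, each variant needs roughly fifty thousand visitors before the test can honestly say anything. If your page sees a fraction of that, the answer is not to run the test longer; it is to test a bigger change, or not to test at all.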
An Antidote
Previously, I’ve made versions of this list that were dozens of rows longer. But I’ll stop here because, if you’re still reading this, you likely understand the problems even better than I do. I make lists like this not because I want to torture myself or taunt Fortune 500 companies, but because I am genuinely interested in the problems and their solutions. To that end, I recently joined the board of WEVO because it was the very first SaaS company I encountered working in that massive space between pure-play, lower-volume usability research and A/B testing.
Directly or implicitly, WEVO addresses all ten of the sins listed above by injecting more evidence and clarity into “The Why” of an experience, product, design or communication. Its solution reaches a confident “Why” faster than traditional A/B testing tools and methods. Moreover, WEVO provides greater fidelity and confidence than any usability testing tool I’m aware of, with a fraction of the labor pains. I’m not suggesting that WEVO is a replacement for your current tools (although it may well be, depending on what you use, how you are staffed, how much traffic your site or app gets, and so on). But I am suggesting that, twenty-plus years into the digital analytics revolution, we still haven’t addressed the obstacles listed above that we know all too well. I think of WEVO as a revolutionary and remarkably intuitive step forward in solving those problems. It’s an antidote to the Analytics Hangover you may be feeling. It’s not as easy as a brisk walk, but it alleviates the pain just as effectively and is far more likely to solve the root problems.