When I started building digital marketing reports in 2005, I was given what felt like the equivalent of a master’s degree in Microsoft Excel. The person who taught me was a spreadsheet wizard. Within six months, I was already working with large data sets from one of the biggest travel websites at the time.
I will never forget my frustration with Excel’s row limit of 65,536. It makes me laugh to think of the crazy workarounds we came up with to work with and analyze large data sets prior to Office 2007. In fact, I’d go out on a limb and say I was working with “big data” before it had a name (or a hashtag).
Over the next decade plus, I would dedicate over half of my time to answering data requests from clients. The concept of the data-driven marketer was taking over, and there was no going back. I always thought all marketers were data-driven by definition, but digital marketing took that concept to a whole new level.
‘Let Me Get Back To You On That’
“Digging into the data” became a mandatory part of my daily routine, primarily because I worked with enterprise-level websites. I can’t even tell you how many times I have uttered that phrase over the last umpteen years. There is something about large websites that makes it astronomically tougher to diagnose data fluctuations, which is why nearly every data platform attempts to make that process easier and more efficient.
I initially trusted the data. I’m talking about a Truth-with-a-capital-T level of trust. I was young and naive. I did not yet know about the limitations of tracking software, and it’s not like anyone at those companies was going to tell me about their software’s blind spots. Within eighteen months of working as an SEO, I noticed a rising level of distrust within myself about the data and reporting I was creating.
I was enamored with Omniture, Webtrends, Coremetrics, Google Analytics, and any other third-party tracking software that let me pull data. Everyone else was, too. However, something in the data just didn’t add up. As a former mathematics teacher (who loves when things add up), I started questioning the data in my clients’ reports, especially when I had evidence.
Losing My Religion
Data is my True North. Metrics are my compass. These things lay the foundation of everything that I do for my clients, so you can probably imagine what it might be like for a person like me when I start to see cracks in that foundation.
While it started slowly, every few months I would see something in my reporting that just didn’t sit right with me. Fast forward to 2014, and I was skeptical of every report I saw because I could not be sure the underlying data was 100% accurate. That is when a case study from Groupon confirmed all of my suspicions.
Groupon ran a test in which they purposely de-indexed their website for six hours. When you block search engines from your website, you expect your organic search sessions to drop to zero during that window, and that is exactly what Groupon saw. However, Groupon also saw a 60% decline in “Direct” sessions during the same window. When the six-hour test was over, the “Direct” sessions recovered to their typical levels.
“Our testing shows that, for a site getting in the ballpark of 50% mobile web traffic, the 60% of the traffic to long URLs reported as Direct is probably Organic traffic from Google.”
Source: Gene McKenna, SearchEngineLand.com (July 8, 2014)
When I read that sentence for the first time, I swear a part of me died. I had been creating SEO strategies for years based on data that was wrong. Furthermore, my clients and I could not see all of the organic search traffic we were driving, because much of it was being tracked as “Direct.” At the time, I was also in the midst of a months-long quest to figure out which browsers and operating systems were hiding referrers, or perhaps not sending them at all. Now I realized it was all of the browsers and operating systems, and they all did it differently. There was no way to know!
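To see why stripped referrers end up in the “Direct” bucket, here is a deliberately simplified sketch of the referrer-based channel classification that analytics tools broadly perform. The function name and the short list of search domains are hypothetical illustrations, not any vendor’s actual logic:

```python
# Minimal sketch of referrer-based channel bucketing (illustrative only).
SEARCH_ENGINE_DOMAINS = {"google.com", "bing.com", "yahoo.com"}

def classify_session(referrer):
    """Bucket a session by its HTTP referrer, roughly as analytics tools do."""
    if not referrer:
        # No referrer at all: the visit lands in "Direct" -- even if the user
        # actually clicked a search result but the browser, app, or OS
        # stripped the referrer header in transit.
        return "Direct"
    # Crude host extraction for the sketch (a real tool parses the URL properly).
    host = referrer.split("//")[-1].split("/")[0].removeprefix("www.")
    if host in SEARCH_ENGINE_DOMAINS:
        return "Organic Search"
    return "Referral"

# A search click whose referrer was stripped looks identical to a typed-in visit:
print(classify_session("https://www.google.com/search?q=deals"))  # Organic Search
print(classify_session(None))                                     # Direct
```

The point of the sketch: the tool never sees *why* the referrer is missing, so every stripped header silently inflates “Direct” at the expense of whichever channel actually drove the visit.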
Regardless, my innate optimism looked ahead to the future, thinking that surely all of these massive companies will figure out how to track things more accurately. I was wrong.
If You Think Fake News Is The Only Fake Thing On The Internet, Think Again.
In a recent article in New York Magazine, author Max Read provides a fascinating look at the world of website metrics. He essentially describes an internet where everything from web metrics to content to businesses to people are fake. For example:
“Take something as seemingly simple as how we measure web traffic. Metrics should be the most real thing on the internet: They are countable, trackable, and verifiable, and their existence undergirds the advertising business that drives our biggest social and search platforms. Yet not even Facebook, the world’s greatest data-gathering organization, seems able to produce genuine figures.”
Source: Max Read, NYMag.com (Dec. 26, 2018)
This article generated an eye-opening response from former Reddit CEO Ellen K. Pao:
It’s all true: Everything is fake. Also mobile user counts are fake. No one has figured out how to count logged-out mobile users, as I learned at reddit. Every time someone switches cell towers, it looks like another user and inflates company user metrics https://t.co/tk1PKuvLL6— Ellen K. Pao (@ekp) December 27, 2018
Data validation is something that greatly concerns me. I have worked with dozens of companies that have the most modern and advanced analytics software enabled on their websites. Some companies spend millions of dollars each year on analytics software. And let me tell you: modern analytics software is highly sophisticated. It’s wildly impressive. However, if I ask to speak to the client-side person who is responsible for managing, testing, and monitoring the tracking software on their websites, I often hear crickets.
Here’s a quick challenge: Find someone who knows the complex intricacies of exactly how visitors are tracked on the web, how KPIs are defined in each tracking software, which visitors are most likely not to be tracked or to be incorrectly dropped into the “Direct” bucket, and how to find and isolate those visitors in each tracking software. Go ahead. Find that person. [Hint: This person is not only a unicorn. This person is a mythical, cycloptic unicorn that breathes fire while doing the latest viral dance from Fortnite.]
Over the last few years, I have seen more and more companies hire in-house analytics managers as they begin to truly focus on the accuracy of their data. After all, because data is the foundation of nearly all digital strategies, every step should be taken to ensure that it is correct.
But why did it take so long to get to this point? I think I have some pertinent insight into that question.
Nobody Wants To Know That Their Data Is Wrong
I am not a quick learner. I’m even slower when it comes to interpersonal social skills. Case in point: It took me at least five years to figure out that people don’t want to know about the quality of their data. Wait. That can’t be true, can it? Maybe it is just my personal experience. But I have a hunch that many digital marketers have come to the exact same conclusion based on real-life situations.
I can’t tell you how many times I ruined an otherwise perfectly good meeting or phone call by bringing up evidence of problems in my clients’ data. In fact, early in my career after I brought up data discrepancies in a meeting, a chief officer sat me down and said: “Don’t bring problems to meetings. Bring solutions. Don’t do that again.”
To this day I have an internal struggle when I discover data discrepancies. Should I bring them up? If so, when? To whom? Is it more effective to do so in a meeting or on a phone call? Everything inside me says I should raise the issue in a setting where everyone can be informed about it, but I have learned that is hardly ever the best option.
‘Winning solves everything.’ – Tiger Woods
Here’s another nugget I noticed very early in my marketing career: The only time anyone will listen to a data validity issue is when their numbers are down. If the numbers are up, everything must be fine, right? I can’t remember ever being asked about potential data problems when the numbers were trending the right way.
I see this scenario play out in sports quite often. During post-game press conferences, reporters hardly ever throw hardball questions at the winning coach. Only the losing coach gets those types of questions. Because clearly the winning team has no issues.
Sports analogies are fun, but let’s bring this back to our world. You walk into an important meeting to review and recap a three-month project that involved several teams and a wild new strategy. The results are great. The numbers are amazing. Everyone is happy. The mood is congratulatory. And then you raise your hand and ruin everything.
If there were ever a gif that best described me, there it is.
I am Josh’s complete lack of situational social skills.
This classic scene is, of course, from the Tom Hanks movie, Big, where 12-year-old Josh Baskin wakes up in the body of a 30-year-old man and is forced to navigate the real world as a kid trapped in an adult’s body. I can relate to Josh Baskin in so many ways.
Let’s Wrap This Up Already
Today, I know that no one has 100% accurate data. Nothing is perfect. We just have to roll with the data we’ve got. And if we can get better data, let’s do our best to make that happen. Additionally, I have learned that it is (probably) best not to ruin the mood of an otherwise congenial meeting by raising an issue that generates more questions than answers or calls the project’s success into question. There is something powerful about a team accomplishing a goal and getting together to high-five about the amazing results.
Whenever I have questions about anything these days, I do my best to avoid bringing them up in project review meetings. I usually find another time to approach the person who might be able to help me verify the issue before I bring it up to anyone else. I still believe there are times when these things need to be addressed, but I think a lot more about when and where and how to do it.
tl;dr I don’t 100% trust any of the data I see on a daily basis, and I have learned to be very careful about how and when I bring up data issues with clients.
Tell Me About Your Experiences
I’d love to hear your stories about working with data and people, so please leave a comment below – even if it is just to reminisce about the 1988 blockbuster, Big, starring Tom Hanks.