Ross’s tweet above is great, and it illustrates something I’ve been thinking about for the last few years: that, up to a point, there’s an advantage to being under-informed and less swayed by day-to-day trends and news.
That’s the TLDR of this essay: consuming too much news, data, or information isn’t just a waste of time; it can be actively counterproductive.
Whether you’re analyzing each of Google’s official and unofficial communications, logging into LinkedIn every day to drink from the Thought Leader firehose, or driving your marketing investment decisions based on podcast hot takes, you have to imagine that a large portion of this information is either useless, or worse, harmful.
Pundits, Predictions & Last Week’s Newspapers
Remember Balaji?
He predicted many things, including that Bitcoin would hit $1 million within 90 days of his bet, and that we, as a population, would stop going to baseball games and concerts and stop hugging each other forever after the pandemic:
The pandemic has certainly changed some things, but I live in New York City, and crowds seem pretty unperturbed: still hugging, handshaking, and generally being human in meatspace.
So we have someone many people look to as an oracle who has nonetheless been very wrong in very big ways. He got many predictions right, too, but that’s not the point.
The point is there are many speculators, pundits, and talking heads – the ensemble of which will produce many correct forecasts, much like a thousand investments in pre-seed startups will surely produce a few winners – and the value of listening to them too much is dubious.
Further, many of the loudest voices with the biggest audiences are trained primarily in communication, not the technical knowledge of the subject they’re talking about. Balaji is an exception and does have technical credentials. Most of the people opining about the future of SEO and generative AI don’t.
In short, no one can predict the future with full accuracy, even though predictions make for effective and often viral content.
Social media algorithms favor the most salacious takes, and nuance is rarely rewarded.
Newsworthy things do happen, but not as frequently as our media environment and attention economy demand.
Or as Nassim Taleb said in Fooled by Randomness:
“It takes a huge investment in introspection to learn that the thirty or more hours spent “studying” the news last month neither had any predictive ability during your activities of that month nor did it impact your current knowledge of the world.”
Mo’ Data Mo’ Problems
News is information, and information is data.
I’m a data guy.
Used to run the experimentation team at Workato. Learned SQL, R, built some classifier models back in the day. I’m still more comfortable in Google Analytics and spreadsheets than anywhere else.
I’ll tell you what, though: more data can often introduce more problems than it solves.
Much of that has to do with data quality, but some of the problems simply come from our drowning in an abundance of information, which causes us to miss the important stuff.
This Nassim Taleb quote (this one from Antifragile) speaks to that directly:
“More data – such as paying attention to the eye colors of the people around when crossing the street – can make you miss the big truck. When you cross the street, you remove data, anything but the essential threat.”
You can measure bounce rate, exit rate, time on page, impressions, pages per session, sessions per device, ad infinitum, but how is your work contributing to revenue?
Don’t Peek Too Often & The Danger in Dashboards
A/B testing tools let you see how each variant is performing in real time, so technically, you can peek at any time during the experiment.
While there’s some utility in this (detecting bugs early in the experiment), the more common result is one of the biggest mistakes made in experimentation.
At multiple points in a fixed-time-horizon experiment, your results may appear to be statistically significant. If you peek at one of these moments, you may be tempted to stop the test early and declare a winner.
This, without getting into technical detail, increases your risk of a false positive, which negates the value of running an experiment in the first place.
There are ways to peek safely (sequential testing, bandit algorithms), but we won’t get technical here. The main advice is this:
“Decide in advance how long your experiment will run and don’t act on any interim results.”
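To make the peeking problem concrete, here’s a rough simulation sketch in Python (the function name and the numbers are illustrative choices of mine, not from any testing tool). It runs A/A tests, where both arms are identical, so any "significant" result is by definition a false positive, and compares stopping at the first significant interim peek against reading the result once at the fixed horizon.

```python
import random

def peeking_false_positive_rates(n_sims=500, n_per_arm=1000, peeks=20,
                                 base_rate=0.10, z_crit=1.96, seed=42):
    """Simulate A/A tests (both arms identical, so any 'winner' is a false
    positive) and compare two rules: declaring a winner at the first
    significant interim peek vs. reading the result once at the horizon."""
    rng = random.Random(seed)
    # Evenly spaced interim checkpoints; the last one is the fixed horizon.
    checkpoints = [n_per_arm * (i + 1) // peeks for i in range(peeks)]
    fp_peeking, fp_fixed = 0, 0
    for _ in range(n_sims):
        # Two identical arms with the same true conversion rate.
        a = [rng.random() < base_rate for _ in range(n_per_arm)]
        b = [rng.random() < base_rate for _ in range(n_per_arm)]

        def significant(n):
            # Two-proportion z-test with a pooled standard error.
            pa, pb = sum(a[:n]) / n, sum(b[:n]) / n
            pooled = (pa + pb) / 2
            se = (2 * pooled * (1 - pooled) / n) ** 0.5
            return se > 0 and abs(pa - pb) / se > z_crit

        fp_peeking += any(significant(n) for n in checkpoints)
        fp_fixed += significant(n_per_arm)
    return fp_peeking / n_sims, fp_fixed / n_sims
```

With defaults like these, the fixed-horizon rule produces a false positive rate near the nominal ~5%, while the peek-and-stop rule fires several times more often, even though there is no real difference to find.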
Similarly, the more segments or metrics you compare in an experiment, the higher the likelihood of a false positive (unless you correct for multiple comparisons).
Tracking more metrics, then, may not be as valuable as you think it is. You’ll likely find some illusory pattern among the large set of comparisons.
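The multiple-comparisons effect is easy to demonstrate with a sketch of my own (illustrative numbers, not from any analytics tool): under the null hypothesis each metric’s p-value is uniform on [0, 1], so checking 20 independent metrics at alpha = 0.05 turns up at least one "significant" result about 64% of the time (1 - 0.95^20), while a Bonferroni correction (dividing alpha by the number of comparisons) restores roughly the intended rate.

```python
import random

def family_false_positive_rate(n_metrics=20, n_sims=4000, alpha=0.05,
                               bonferroni=False, seed=7):
    """Simulate experiments where the variant truly changes nothing: each
    metric's p-value is uniform on [0, 1] under the null. Count how often
    at least one metric looks 'significant' anyway."""
    rng = random.Random(seed)
    # Bonferroni: divide the significance threshold by the number of tests.
    threshold = alpha / n_metrics if bonferroni else alpha
    hits = sum(
        any(rng.random() < threshold for _ in range(n_metrics))
        for _ in range(n_sims)
    )
    return hits / n_sims
```

Uncorrected, the family-wise false positive rate lands well above 50%; with the correction it falls back near the 5% you thought you were accepting.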
(Thank you Lizzie Eardley for these images and links).
Alternatively, imagine tracking your stock investments in real time, on a minute-to-minute basis. What you’re seeing isn’t performance (signal); it’s variance (noise).
How many times have you had an executive or a client point to a chart and say, “Why did the numbers go down?” Or, in the positive, “Wow, looks like we got a spike. What caused it?”
Sometimes, there are real learnings to uncover. In many, many cases, though, these just reflect statistical noise and variance, and an analyst’s time ends up wasted chasing down a “cause” to appease a stakeholder.
SEO, specifically, takes a long time to really cook. There’s a lot of variance when you look at day-to-day changes, especially in the early stages of building out a program. It’s only over the long arc of great strategy and execution that you can see patterns emerge (and usually only once you have enough data to warrant such analysis).
The danger of dashboards isn’t the tracking of KPIs over time; it’s the constant vigilance toward a trend line and the anxious compulsion to explain minor deviations, as if you could root-cause a 5% day-over-day delta.
Why did Delta’s stock decrease yesterday? A journalist or a pundit could tell you, but only because they’re incentivized to have a “take” and get clicks. Someone with epistemic humility must, in most cases, say, “I don’t know why Delta’s stock decreased yesterday.”
What we need is less data, more precisely tuned to answer the questions we actually care about. That’s much better than the standard “let’s track everything and see what the data tells us” approach.
Mixed Messages & Who to Trust
So we return to the state of marketing and the constantly evolving and often conflicting discourse.
The old playbook is dead, didn’t you hear?
SEO is dead; or it’s more important than ever. Links (or content, or technical SEO, or TOFU content, or [insert SEO variable here]) matter less now; or more than ever. We must build a media company; or not. Video is the future; or it’s a return to email newsletters.
AI is going to kill search; no it’s not. But actually, it will this time.
This algorithm update is good, it’s bad, it means X, Y, and Z. Maybe.
No wonder marketers are exhausted.
Part of this is timeless; marketers are great communicators, and vendors, consultants, and agencies talk their book.
Part of this is because the barrier to reaching an audience is lower. There are many “AI experts,” or rather “The AI Guy” LinkedIn people, who quickly pivoted from Web3 and get more attention than actual AI engineers.
And much of what you read IS true, at least in the proper context and with nuance and caveats.
But all of it is confusing, especially to those looking for good information with which to grow their companies.
How to Know Things
When you’re early in your career, it’s very difficult to parse out good information from bad. So unfortunately, I think you just have to consume a lot of it and try a lot of things.
Still, I believe that anyone, at any stage of their career, can benefit from going straight to the source of knowledge.
Instead of following pundits reacting to algorithm updates, learn about information retrieval and how search engines work. Learn about human behavior, how to track and make sense of data, how to write, and how to think.
There are now YouTube videos of someone reacting to reactions of Something. Instead of watching these, maybe just watch the Something and try to figure out how you feel about it.
And above all, learn how to do things. Real things that you can put into action, not just ideas. We all need inspiration, but hot takes and contrarian ideas are often just grist for the audience-building mill. Learning how to build and read a growth model will give you 10X the information of your LinkedIn feed.
For leaders, and marketing veterans with more experience, think in first principles and trust yourself more. This isn’t to say “consume nothing, trust no one” — it’s a low-information diet, not a no-information diet — but pick your sources consciously based on track record and skin in the game, and always contextualize their advice in relation to your own situation.
Generally speaking, the more “expensive” it was to produce a piece of content, the more likely it is to be trustworthy. This could mean a peer reviewed paper or a book, but it could also just mean that a person has a massive amount of experience in a given subject and has proven their merit.
In thinking about the future, as Nassim Taleb says, “invest in preparedness, not prediction.”
We like the “Barbell Strategy” for this, which applies not just to content portfolio planning, but GTM planning more broadly.
In this way, you keep one foot in the present, where you can tangibly see what’s working and will likely continue to work, and one foot in the future, exposing yourself to potential upside without risking it all by concentrating your bets on something speculative.
The Long Game
I said on the DesignRush podcast that one of my mistakes (and regrets) was getting tilted during peak generative AI hype and questioning my own judgment. Granted, there was a lot of market instability among the companies we usually work with, and many people transparently told me they were waiting for the dust to settle to invest their marketing dollars.
Luckily, the dust did settle a bit, the market is now a bit more stable, and we stayed the course.
After all, we’ve structured our services and guiding philosophy around robustness, if not antifragility. Our stance has always been simply to drive attributable business results through content and organic channels for our clients, and because we never indexed on a speculative take (“links don’t matter,” “AI content is the future,” whatever), we’ve been able to maintain tactical flexibility within that mission.
I’ll leave you with one more Nassim Taleb quote if you’re still here and if you’re not tired of me leaning the intellectual weight of this essay on the Incerto:
“You may study for a year and learn nothing, then, unless you are disheartened by the empty results and give up, something will come to you in a flash. My partner Mark Spitznagel summarizes it as follows: imagine yourself practicing the piano every day for a long time, barely being able to perform ‘Chopsticks,’ then suddenly finding yourself capable of playing Rachmaninov. Owing to this nonlinearity, people cannot comprehend the nature of the rare event. This summarizes why there are routes to success that are nonrandom, but few, very few, people have the mental stamina to follow them. Those who go the extra mile are rewarded. In my profession, one may own a security that benefits from lower market prices, but may not react at all until some critical point. Most people give up before the rewards.”
Want more insights like this? Subscribe to our Field Notes.