This essay, I’ll admit, is almost too broad to tackle.
To be pedantic, what does “analytical” even mean, and how can one determine how “analytical” a mind is?
The way we speak about analytics and data lacks precision. By “analytics,” some mean sending a weekly report with time series charts to stakeholders; some mean exploratory data analysis. And plenty of people are leveraging product data to build machine learning models.
For the sake of this essay, I’ll simplify it. What I mean is: collecting, analyzing, and using available information to improve decision making and business outcomes.
In that way, “analytics” doesn’t need to be isolated to traffic source reports in GA4, but could include qualitative insights, revenue and financial data, market data and more.
There are two skills I’ve deeply invested in throughout my career – writing and data – and to me, they have a lot in common.
There are the surface-level best practices – grammar, form, legibility. But to me, the teleology of both skills is the same: to exhibit clear thinking.
There’s levels to this, or why I hate the word “reporting”
I never liked the term “reporting.”
In practice, it usually means copying and pasting an automated report or dashboard to show your boss whether the numbers are going up or down.
If they’re going up, you celebrate and claim credit. If they’re going down, you go on a hunt for causality and usually end up blaming something external, like “seasonality.”
In this world, analytics is used for proving your value, which is not entirely useless, but is merely the tip of the iceberg.
“If you are unique, why should you crack open a standard analytics tool with its standard reports and metrics and get going?
Or why simply respond to a “report request” and start data puking? The person at the other end is probably uninformed about Analytics and Segmentation and what is possible (even as they are supremely qualified to do their job in Marketing / Sales / HR).
You need business questions because:
- Rather than being told what metrics or dimensions to deliver you want business context: What’s driving the request for that data? What is the answer the requestor is looking for? Then you apply smarts because you have context.
- Best practices are highly overrated. If this is your first day on the job, sure go ahead and puke out what “industry experts” recommend. But know that it won’t impress anyone because you don’t actually know what the business is doing / cares about / is prioritizing.”
If you’re operating with little to no understanding of analytics and performance, reporting is a fine place to start. And it’s very simple. Determine which metric you own and ask an analyst to help you build an automated Google Sheets report or Looker Studio chart that tracks that number over time.
We can spend an hour there, automate it, and then move on to the more interesting stuff.
Leaving aside technical skills like Python, R, or SQL, the layers I work through when training analytics skills are as follows:
- Reporting and communicating results
- Understanding the underlying data model and context
- Being able to unearth insights that result in better decisions and actions
- Asking intelligent business questions to guide your use of analytics
(“Does he ever tire of the Vince McMahon meme?” No.)
Essential data literacy and understanding context
Data is just a bunch of information that you collect, organized in a certain format. It’s passive. There’s no inherent value in “data.”
We as humans make sense of data. How we organize the data enables us to make sense of the information in different ways.
For example, in Universal Analytics, data was collected according to “scopes.” It was hit-scoped, session-scoped, user-scoped, or ecommerce-scoped.
A pageview was a “hit.” So was an “event” or a “goal.”
So according to this data model, a pageview couldn’t have a conversion rate, because conversions occurred within a session. To understand how a page “converted,” you had to view session-scoped reports on something like landing page conversion rate.
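To make that concrete, here’s a minimal sketch of session-scoped reporting in Python. The table and column names are hypothetical, not an actual Universal Analytics export schema:

```python
# Minimal sketch: conversion is a session-level fact, so conversion rate
# is computed per session and then grouped by landing page.
# Hypothetical data, not a real UA export.
import pandas as pd

sessions = pd.DataFrame({
    "session_id":   [1, 2, 3, 4, 5],
    "landing_page": ["/pricing", "/blog/post", "/pricing", "/blog/post", "/"],
    "converted":    [1, 0, 0, 1, 0],  # did the session complete a goal?
})

# Session-scoped "landing page conversion rate"
print(sessions.groupby("landing_page")["converted"].mean())
```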
As Matt Gershoff from Conductrics once told me, quoting his college math professor, “How can you make cheese if you don’t know where milk comes from?!”
As SEOs, you have a lot of domain expertise that business stakeholders broadly don’t have. You understand Google Search Console metrics and how they’re tracked: clicks, impressions, position, and click-through rate.
I’d argue that we, as marketers, could all benefit from a deeper understanding of our specific telemetry. What constitutes a “bounce,” the basis of the engagement metric “bounce rate”? How is an event logged, and when does it count as a “conversion”? What is the variance in our data?
So after teaching basic reporting templates and building out organic growth dashboards, the next place I start when training analysts is in our particular platforms.
At Workato, where I ran experimentation, we had to learn about website analytics generally, but also A/B testing statistics and how our tool randomized users and calculated metrics. We also had to plug into our B2B attribution modeling tools, namely Marketo / Bizible, because we were logging qualified leads as well as general conversions.
Beyond the platform specifics, it’s important to understand some basic data literacy topics, such as:
- Categorical data vs Numerical data
- Nominal vs ordinal data
- Discrete vs continuous variables
- Correlation vs causation (see the sketch below)
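On that last point, here’s a toy Python sketch with invented numbers: two metrics can correlate almost perfectly because both follow a third factor (here, seasonality), with no causal link between them.

```python
# Toy example: ice cream sales don't drive organic traffic; both simply
# peak in summer. A strong correlation, zero causation. Invented numbers.
from scipy.stats import pearsonr

ice_cream_sales = [120, 150, 210, 280, 260, 190]        # peaks in summer
organic_traffic = [1000, 1180, 1500, 1900, 1820, 1400]  # also peaks in summer

r, p = pearsonr(ice_cream_sales, organic_traffic)
print(f"r = {r:.2f}, p = {p:.3f}")
```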
Or like, if you’re running A/B tests, you should probably know what a p-value actually means and how we infer causality from randomized controlled trials. Note: you don’t have to explain this to stakeholders, and probably shouldn’t bring out the nerdy stuff at the executive meeting, but it’s important to understand what the data you’re analyzing actually means.
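As an illustration, here’s a minimal sketch of the mechanics, assuming a simple two-proportion z-test; the conversion counts are invented, and your testing tool may compute this differently.

```python
# What a p-value is: the probability of seeing a gap at least this large
# if control and variant truly performed the same. Invented counts.
from statsmodels.stats.proportion import proportions_ztest

conversions = [130, 158]    # control, variant
visitors    = [5000, 5000]

z_stat, p_value = proportions_ztest(conversions, visitors)
print(f"z = {z_stat:.2f}, p = {p_value:.3f}")
# A small p-value does NOT mean "95% chance the variant is better";
# it means the observed gap would be unlikely if there were no difference.
```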
A great way to learn this stuff, through trial by fire, is by learning SQL (or R or Python).
This is obviously a heavy lift from a learning perspective, but these tools force you to understand what data type and format you’re dealing with, because that will essentially limit what you can do with it and how you can merge or compare it with other data types.
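A trivial Python example of the friction I mean: a revenue column exported as formatted strings can’t be summed until you deal with its type.

```python
# Numbers often arrive as strings (say, from a CSV export); made-up values.
revenue = ["1,200", "950", "2,300"]

# sum(revenue) would fail: you can't add strings like numbers.
total = sum(float(r.replace(",", "")) for r in revenue)
print(total)  # 4450.0
```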
Do, Watch, Talk
Analytical thinking requires both conceptual knowledge and practical understanding.
That’s why I like this model for coaching: I do, you watch, we talk. You do, I watch, we talk.
Let’s say, for example, that a consultant at Omniscient is used to working with PLG SaaS companies that measure last-click conversions based on freemium user signups.
Then they take on a client with a more complex, sales-led model. The best way to credit value to content and SEO is no longer last-click user signups, but something like first-click or linear attribution to lead conversions; from there, we need to tie together lead and CRM data to attribute revenue to our program.
This is a novel paradigm, so the first time we’d implement it, I would walk through how to set this attribution model up in our web analytics tool, how we’d work with their team to connect GA4 to Salesforce, and how to build a dashboard that reports on assisted pipeline.
But the next time, our consultant would take a crack at it, I’d watch, and we’d talk through any discrepancies or skill gaps.
In this way, you’re not just throwing someone into the deep end and saying “learn to swim,” but you’re also not depriving them of the privilege of learning by doing.
Practice plus theory
The way I like to learn is to have a project with some stakes, a bit of skin in the game.
That’s how I learned R. My manager, Peep, gave me six weeks to complete a project where I built data-driven personas using qualitative and quantitative behavioral data and clustered product user attributes in R.
In other words, you could read the AP Style guide, or you could try to write an article.
Like, I knew in the abstract what COGS (cost of goods sold) and OpEx (operating expenses) were. But until I was actually running a business, they were merely concepts in my mind. Now I can see strategy and opportunity in the numbers when I read a P&L.
Statistics didn’t mean much to me, despite taking classes in college and running many t-tests. Then I built a career in A/B testing, and everything made so much more sense to me.
So yes, listen to the podcasts and read the articles I’ll recommend below. But also, pick a project that helps you develop analytical skills.
Try to implement a custom event in GA4. Turn it into a Key Event. Build a Looker Studio dashboard that automates the reporting of this metric.
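If you want to go one step deeper, here’s a rough sketch of logging a custom event server-side via GA4’s Measurement Protocol. The measurement ID, API secret, client ID, and event name below are placeholders; most implementations send events client-side via gtag.js or Google Tag Manager instead.

```python
# Rough sketch: send a hypothetical custom event to GA4 via the
# Measurement Protocol. All credentials and names are placeholders.
import requests

MEASUREMENT_ID = "G-XXXXXXX"    # placeholder
API_SECRET = "your_api_secret"  # placeholder

payload = {
    "client_id": "555.1234567890",  # placeholder client ID
    "events": [{
        "name": "newsletter_signup",           # hypothetical event name
        "params": {"form_location": "footer"}  # hypothetical parameter
    }],
}

resp = requests.post(
    "https://www.google-analytics.com/mp/collect",
    params={"measurement_id": MEASUREMENT_ID, "api_secret": API_SECRET},
    json=payload,
)
print(resp.status_code)  # the API returns 204 on success
```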
Instead of getting into LinkedIn debates about attribution, compare a few different models and see the differences in the credit they assign your marketing channels and campaigns.
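Here’s a toy sketch of that comparison, with invented user journeys and deliberately simplified models, just to show how much the credit shifts:

```python
# Compare first-click, last-click, and linear attribution over made-up
# converting journeys (each path is the ordered list of channels touched).
from collections import defaultdict

paths = [
    ["organic", "email", "paid"],
    ["paid", "organic"],
    ["organic"],
]

def attribute(paths, model):
    credit = defaultdict(float)
    for path in paths:
        if model == "first_click":
            credit[path[0]] += 1          # all credit to the first touch
        elif model == "last_click":
            credit[path[-1]] += 1         # all credit to the final touch
        elif model == "linear":
            for channel in path:
                credit[channel] += 1 / len(path)  # split credit evenly
    return dict(credit)

for model in ("first_click", "last_click", "linear"):
    print(model, attribute(paths, model))
```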
Additionally, have a burgeoning analyst attempt to scope out and estimate their own project timelines. Having a sense of ownership, for some psychological reason, helps you learn things much faster and makes the lessons stick.
The ultimate aim: curiosity and asking better questions
Ultimately, a good analyst is like a good journalist. It’s all about asking better questions.
The question you ask in analytics dictates the landscape of possible answers.
Asking “what’s a good bounce rate?”, for example, has several assumptions embedded in it: that bounce rate is important, that there is a “good bounce rate,” and that you should benchmark your bounce rate against it.
If you want to get more utility from data, then, as Rustin Cohle said on True Detective, “start asking the right f*cking questions.”
Sometimes the right question is as simple as “how is our organic traffic trending?” But more often, I find that the surface layer question is sitting on top of something more important – especially when a surface level question comes from a business stakeholder or client.
I’ll give you a specific one from our own GTM.
On a very simple level, we can ask, “how much revenue are we bringing in every month?” We can back that out one step and ask, “how many clients did we close this week / month?”
But if you think about it for longer than a few seconds, you realize that whatever the answer is, the action you take is “close more deals.” It doesn’t really matter if you’re ahead of or behind your goal.
But if we reframe the question and start to think more about the system that results in revenue and new client deals, we can ask, “what leading indicators predict the revenue we attain or clients we close?”
Okay, so we can start to run regressions, or we can start with a hypothesis: pipeline predicts clients and revenue.
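A minimal sketch of that first pass, with invented monthly figures: regress closed revenue on earlier pipeline and see whether the relationship is strong enough to treat pipeline as a leading indicator.

```python
# Test the hypothesis "pipeline predicts revenue" on historical data.
# All numbers are invented for illustration.
from scipy.stats import linregress

pipeline = [200, 260, 310, 280, 400, 450]  # qualified pipeline ($k) per month
revenue  = [50, 70, 85, 75, 110, 120]      # revenue closed ($k) a quarter later

fit = linregress(pipeline, revenue)
print(f"slope = {fit.slope:.2f}, r^2 = {fit.rvalue**2:.2f}, p = {fit.pvalue:.4f}")
# A strong, significant slope supports managing pipeline as the leading
# indicator, rather than just exhorting "close more deals."
```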
So how do we increase pipeline? Again, we can run regressions on historical data, or we can start to map out an acquisition channel matrix and calculate our current saturation versus each channel’s potential. We can plot ourselves on a custom-built S-curve.
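And a hedged sketch of the S-curve idea, assuming you fit a logistic curve to a channel’s historical traffic (invented here) to estimate its ceiling and your current saturation:

```python
# Fit a logistic (S-shaped) curve to monthly channel traffic to estimate
# the saturation ceiling L. Data points are invented.
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, L, k, t0):
    return L / (1 + np.exp(-k * (t - t0)))

months  = np.arange(12)
traffic = np.array([2, 3, 5, 9, 15, 24, 35, 46, 54, 60, 63, 65])  # thousands

(L, k, t0), _ = curve_fit(logistic, months, traffic, p0=[70, 1, 6])
print(f"estimated ceiling ~{L:.0f}k/month; current saturation {traffic[-1] / L:.0%}")
```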
Then, instead of berating ourselves and saying “close more deals,” we can say, “SEO represents the best possible investment due to reasons A, B, and C. Because of this, we will take actions X, Y, and Z.”
Better questions bring better answers.
Data and epistemological humility
One paradoxical result of getting better at analytics is an increase in expressed ignorance.
Someone with little to no understanding of data may have undue confidence. They may read a headline in the news talking about bad science and take it at face value. They may see a correlation and run with it.
The more you learn about data, the more the word “uncertainty” comes up. You can reduce uncertainty, but you can never eliminate it entirely.
That’s why, in the project stage of some candidate interviews, we include a portion where they analyze a live data dashboard. The point is not to have answers. It’s to have good questions and self-awareness. Or really, it’s to identify the following traits:
- Curiosity
- Attention to detail
- Data literacy
What you’re looking for is an inclination to say, “huh, that’s interesting,” when you see something in a chart that sticks out. “We should research this further. I wonder if….”
Learning more about data also helps you understand when you shouldn’t rely on data to make decisions. In some, perhaps many, cases, you’re operating on low information and simply need to move forward.
It’s the mark of a great analyst and operator to know when that is the case.