Differences between web and product analytics, why you need both, and useful tools to explore
In this week’s episode, join special guest Patrick Thompson, Co-Founder and CEO of Iteratively, a platform that makes it easy for companies to capture customer data, and hosts Claudiu Murariu, CEO and Co-Founder of InnerTrends, and Arpit Choudhury, Founder of Data-led Academy, as they dive into the differences between web analytics and product analytics, why we need them, how they can help your business, how to plan your data collection for analytics, and a number of web and product analytics tools for you to explore.
Welcome to The Data-led Professional Podcast: A podcast dedicated to helping folks become data-led to build better products and services.
Want to listen to the entire episode? Check it out here:
Subscribe on your favorite streaming platform for more episodes:
Select excerpts from the episode:
What are web analytics and product analytics, and how do they help?
[P]: Web analytics primarily comes down to: what are the use cases that teams are looking to use web analytics for, and what are the use cases that product teams are looking to use product analytics for? I think a lot of it comes down to who the personas or users are within those organizations, and what their jobs to be done are when it comes to how they use this data.
On the web analytics side, you have marketers who are looking at things like attribution, and how they can improve their channel optimization when it comes to advertising.
On the product analytics side, you have folks like product managers (PMs) who are trying to create better experiences for their customers, build a better understanding of what customers are doing within the products, and tie this information to qualitative feedback to help with roadmap prioritization.
And on the data science and analysts side, you have folks who are building real-time machine learning models or personalization engines with this data, specifically in product or on the web.
There's this dichotomy that we have in the industry based on the toolset that exists, where you have web analytics tools on one side, and you have product analytics tools on the other. But in reality, a lot of teams think of these tools as interchangeable. It really comes down to: What data are they capturing? And what are the use cases for that data?
Obviously, you have session recording, you have things like being able to understand customer behavior and build better experiences. And you also have an understanding of who your customers are from firmographic or demographic information, which helps you create better personas and profile your customers. But it really comes down to use cases, and who the consumers of the data are internally.
What is the difference between the data that is consumed in a web analytics tool versus the data that's consumed in a product analytics tool?
[P]: When we think about the data that's consumed by web analytics tools, it's typically things like sessions, as a lot of it is anonymized data. You're looking at web analytics for de-identified users or anonymous users.
On the product analytics side, it's primarily identified data, so you understand who the user is. There's some identifier that you're using to track that particular user's behavior across sessions.
You're also looking at different sources of data. For product analytics, you might be tracking events on the back-end, your iOS app, or your Android app. It could even be on an IoT device, such as a smartwatch or a sensor. For web analytics, it's the website, as the name suggests.
With product analytics, there's a lot of information depending on the type of business you are, whether you're a B2C company or a B2B company. If you're a B2B company, you might want to track things not only at the user level, but at the account level, because you want to understand how a particular account is performing.
That's super helpful when you think about SaaS companies that want to better understand things like customer retention or churn, and then be able to build scorecards based on the product analytics, or the customer behavior that's happening within the account. When you think about the sources of data that feed product analytics tools, they're broader than what you typically get from a web analytics tool.
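To make the user-level vs. account-level distinction concrete, each tracked event can carry both a user identifier and an account identifier, so the same event stream can be rolled up per account. This is a minimal sketch with made-up event names and field names, not any tool's actual event format:

```python
from collections import defaultdict

# Hypothetical B2B product events: each carries both a user_id and an
# account_id, so user-level data can be rolled up to the account level.
events = [
    {"event": "report_created", "user_id": "u1", "account_id": "acme"},
    {"event": "report_created", "user_id": "u2", "account_id": "acme"},
    {"event": "login",          "user_id": "u3", "account_id": "globex"},
]

# Roll up to the account level: count distinct active users per account,
# one building block for a simple account health scorecard.
active_users = defaultdict(set)
for e in events:
    active_users[e["account_id"]].add(e["user_id"])

scorecard = {account: len(users) for account, users in active_users.items()}
print(scorecard)  # {'acme': 2, 'globex': 1}
```

The same rollup idea extends to retention or churn scorecards by grouping any user-level metric on the account identifier.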
How should companies use web and product analytics? How should one combine the two to get a clear picture of how users move down the funnel, from the time they land on the website, to becoming a user of the product, and then becoming an active and paying user?
[P]: If you think about something like Pirate Metrics, it's very clear that customer acquisition is typically where marketing spends a lot of their time, focusing their efforts on trying to better understand their particular channels and how to optimize those channels. And then you get into activation, where a growth team or product team takes over and tries to shepherd the customer through the entire onboarding experience for the product.
Those teams operate as pretty much disparate systems in most organizations. And I think that's something that should probably be fixed within the industry, because a lot of the time you want to better understand: “How did this customer get here? Which channels are the most active customers coming from?” That's a question that's really hard to answer today.
Why are those teams disparate systems for organizations? What are the technical challenges to fix that?
[P]: More often than not, every challenge stems from human behavior. So it’s often a people and organizational challenge in how things are structured.
You have the marketing team, and then you have the product team. So there's two silos there. The toolset that we choose tends to become siloed, unless you ultimately end up building a data team as a separate silo to own and head everything. And then you get additional problems coming from that. But at the end of the day, it comes down to how we define the roles and responsibilities for the people who own these particular areas within our funnel.
A lot of the problems that we see teams struggle with aren't necessarily tooling problems. The tools today are pretty good when it comes to being able to answer the most basic level questions that teams are trying to answer.
The problem becomes: are we asking the right questions of the data we are actually capturing? There are a lot of basic questions that teams don't know how to answer from their data or from the tools they have today. And then there are more advanced questions that require somebody with more expertise, like a data analyst or data scientist, to actually write code to define and normalize across all of our different data sets.
A tool like Amplitude or Mixpanel isn't going to tell you, for example: who are your top paying customers that are engaged? To answer that question, you need to combine your Stripe data with the data from your data warehouse.
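Here is a minimal sketch of the kind of join an analyst would write for that question, with made-up in-memory stand-ins for Stripe revenue and warehouse engagement data. In practice this would be a SQL query run in the warehouse; all identifiers and the "engaged" threshold below are hypothetical:

```python
# Hypothetical stand-ins: revenue per customer exported from Stripe, and
# product-event counts per customer from a data warehouse table.
stripe_revenue = {"cus_1": 5000, "cus_2": 12000, "cus_3": 300}   # cents paid
warehouse_events = {"cus_1": 42, "cus_2": 3, "cus_3": 880}       # events, last 30 days

ENGAGED_THRESHOLD = 10  # assumed definition of "engaged"

# Join the two sources on customer id, keep only engaged customers,
# then rank them by revenue.
top_engaged = sorted(
    (
        (customer, revenue)
        for customer, revenue in stripe_revenue.items()
        if warehouse_events.get(customer, 0) >= ENGAGED_THRESHOLD
    ),
    key=lambda pair: pair[1],
    reverse=True,
)
print(top_engaged)  # [('cus_1', 5000), ('cus_3', 300)]
```

Note that the highest-paying customer (`cus_2`) drops out because it isn't engaged, which is exactly the insight neither data source can surface on its own.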
How do you plan your data collection for analytics?
[P]: It starts with the questions that you want to get answers for. That should be how you define your analytics schema and your tracking plan.
It's a skill in itself to ask good questions. When we think about really strong PMs, you have PMs who range in technical ability: the ability to write SQL, or to use self-service tools like Amplitude or Mixpanel. But being able to ask a good question, and then find the data that's actually needed, or define the data that you need in order to answer that question, is a skill in itself.
And that's where data modeling comes into play. The more experience you have with that, the better you're going to get at ensuring that the data that you might need is actually implemented properly for you to answer those questions.
So it starts with asking good questions, and then being able to build a tracking plan or a data dictionary. A tracking plan is a schema of the things you want to track for your product analytics or your web analytics. Then getting those instrumented by developers and having a codified way of doing that is very helpful in ensuring the quality of the data you're capturing, which is what we’re specifically working on at Iteratively.
And then creating that contract across your PM, your developer, and your analysts so that everyone's on the same page about what they're capturing, why they're capturing this data, how this data is being used, and ensuring that there's a system in place. That way, as you iterate on your analytics and it evolves over time, you're not hurting the overall quality of your data.
Ultimately, data is one of the most important assets that a company has. So ensuring the integrity of it, and how that data is actually going to be used is super critical. It all starts with asking good questions.
Do you have any advice on how to ask better questions?
[P]: Last week we had a workshop specifically around good questioning. This is a common struggle: “How do I ask good questions beyond the low-level questions that folks typically ask of their data?” It's about understanding: “What does success look like for us as a business? If we're thinking about the future, where are we six months from now? And how do we get there? What are some of the hypotheses and assumptions that we're making for that future to become a reality?”
It’s very important to be able to dig deep in order to do that storytelling and be able to use your imagination to uncover: “These are the questions or the assumptions that we have in order to make that a reality; how do we validate that these assumptions are correct? And how do we make sure that, in order for us to measure the expected outcome of the work that we're doing, we had the right data in place in order to do that?”
We talk a lot about the way that growth teams operate. Growth teams are inquisitive by nature, and they are also very much data-led. They do things like experimentation, they measure, they have pre-formed hypotheses. They're not looking to validate that they're right; they're trying to disprove their hypothesis. They have a null hypothesis in place. And that's the way that all high performing teams should work, but it's not the case today.
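As a minimal illustration of that null-hypothesis framing, a growth team comparing conversion rates between a control and a variant might run a two-proportion z-test, where the null hypothesis is that the two rates are equal. The experiment numbers below are made up:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """z-statistic for H0: the two conversion rates are equal."""
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (conv_b / n_b - conv_a / n_a) / se

# Made-up experiment: 200/1000 control conversions vs. 260/1000 variant.
z = two_proportion_z(200, 1000, 260, 1000)
# |z| > 1.96 lets us reject H0 at the 5% significance level.
print(round(z, 2), "reject H0" if abs(z) > 1.96 else "cannot reject H0")
```

The point of the test is the one made above: the team isn't trying to prove the variant works, it's trying (and here failing) to keep the "no difference" hypothesis alive.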
When they think through the way that they work, it starts with defining success, and then: “What are the questions I need to ask in order for that reality to become true?” This is a much higher-level way of thinking that we're talking about now, beyond what tools they're using, or what schema they're defining.
That's a skill. Not everybody has that skill, but it’s definitely a skill that you can learn.
That's where I would challenge people: spend a lot of time honing that skill. It's going to be the most valuable one in your career: better understanding how to ask good questions, and then being better able to correlate the work that you’re doing to the success of the business. If you can do that, you're going to be able to demonstrate that “these things that we've done over the last few months have actually driven business results,” or, “these are the learnings that we've had regardless.” And then that will inform the rapid iteration and improvement loop that most companies are looking for, which just doesn't exist today in a lot of teams.
Our key take-aways:
As in many cases, it’s not just about tools, it’s about people: who are the users of these tools, and what are their needs.
Web and product analytics data need to be linked and brought together to make sense.
To be a data-led professional and build a data-led company, you need to learn to ask the right questions.