It takes a village to build a product. Not literally, of course, but the sentiment holds. You need many different people in many different roles to get a successful product off the ground. Without everyone’s individual skills and perspectives — their special piece of the puzzle — product decisions can easily fall prey to bias, and opportunities get missed.
HubSpot relies heavily on this village model for virtually all of our product teams. For each team, there’s a core group of people who do most of the heavy lifting when it comes to building their product. Engineers bring their deep understanding of constraints and technical tradeoffs. Designers use their deep knowledge of the user, her experience, and design standards across the product. And product managers contribute their broad and varied insights about the business. This core team is supported by a number of other people who are integral to a product’s success: UX researchers, UX writers, expert support reps, and, of course, product analysts.
Most product analysts exist in a world of rows and columns; functions and models. And while that’s certainly true at HubSpot, product analysts here don’t exist just to run the numbers. Here, we tell stories with data and influence outcomes. We are partners in the success or failure of a product. And we use our insights to drive product strategy.
Why? It’s simple: having highly collaborative product analysts who have some skin in the game makes for better products.
The Devil’s in the Data
Often, core product teams (the Product Manager, Designer, and Engineers) have a strong intuition about how customers are using their products and what those customers want us to build. And they should; they spend a huge amount of time listening to customers in research sessions, reading customer feedback, and synthesizing information from our support, sales, and services teams.
And very often, that intuition is based on solid data they’ve gleaned from the qualitative research they’ve done. But without a clear picture of the user base as a whole, it’s difficult to be totally confident in a decision based on a handful of user research sessions or NPS responses alone. That’s where having an expert delve into the data and back that intuition up with numbers can make a world of difference. At HubSpot, we’re steered by our instinct but driven by the data.
Telling a story with data is one of the most powerful tools at our disposal — users might say one thing about an app or feature during a UX research session or a survey, but the data shows us what they actually do. And while it can sometimes be tempting to write off the results of a user test due to certain variables or factors, behavioral data often presents a much clearer picture of how users actually act.
This data helps us evaluate our products in an unbiased way. User sentiment can be a tricky thing, but the numbers don’t lie.
This data is even more important when considering a decision that would change the course of our company, like a new product. Having data to back your idea up, and the ability to communicate that data effectively, can be the difference between making a decision confidently and not making it at all.
Asking all the right questions
When working with a team to figure out how data can better guide their decision-making, I start by teasing out a hypothesis based on the information the team already has about how users interact with the tool or what the team is trying to accomplish with their product. This hypothesis gives us a great starting place for figuring out which metrics to measure. It almost goes without saying, but picking good metrics is important; we want the data to either prove or disprove our hypothesis, and metrics that don’t speak to the hypothesis tell us nothing.
We recently had a great example of well-chosen metrics that decidedly proved out a hypothesis. One of the teams I work with was looking to sunset an outdated tool in our product. They had a hunch that the tool just wasn’t providing as much value to our users as it once had, and ran a few rounds of qualitative research to validate their instincts. But in testing, there emerged a small but very vocal group of users who desperately wanted the tool to stick around — vocal enough to give the team cold feet about sunsetting it. So we went back to basics. We took the hypothesis that our tool just wasn’t cutting it, and decided to do a deep dive into the data on how and when our customers used the tool.
The data was spot on. We found that the user base had been rapidly declining for a while, and dug up some solid behavioral evidence that the tool was no longer providing the value it once had. This understanding gave the team the confidence to remove a tool that otherwise would have stuck around because of a small (but influential) cohort of vocal users.
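A usage-trend check like this one might look roughly like the following. This is a minimal sketch with made-up data and hypothetical column names, just to illustrate the shape of the analysis, not HubSpot’s actual pipeline:

```python
import pandas as pd

# Hypothetical daily usage events for the tool in question:
# one row per time a user opened the tool.
events = pd.DataFrame({
    "user_id": [1, 2, 3, 1, 2, 1, 1],
    "used_at": pd.to_datetime([
        "2020-01-02", "2020-01-03", "2020-01-06",
        "2020-02-04", "2020-02-10",
        "2020-03-03",
        "2020-04-07",
    ]),
})

# Monthly active users: count distinct users per calendar month.
mau = (
    events.set_index("used_at")
          .resample("MS")["user_id"]
          .nunique()
)
print(mau)

# A steadily shrinking MAU series is the kind of behavioral evidence
# that can back a decision to sunset a tool.
declining = mau.is_monotonic_decreasing
print(declining)  # True for this toy series
```

In practice you’d pull events from a warehouse and look at the trend over a much longer window, but the core question is the same: is the population that gets value from this tool growing or shrinking?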
Culture of trust
All these things — making hypotheses, presenting information, supporting teams — are table stakes for product analysts. And at many companies, that’s where it stops. Product analysts are service providers. They respond to requests and provide data. They might make a recommendation or two. And that’s it — their work is done. On to the next request. Back to the spreadsheets and SQL queries.
But product analysts at HubSpot operate under a different model. Each product analyst is assigned to a product family, meaning that they get to build a deep understanding of their area of the product, and develop very strong ties with the people who build it. Our culture of quick iteration and learning means that there is a baseline trust in the insights we provide, because teams understand that knowledge is power, and more data leads to better decisions.
This means that our teams don’t just trust their product analysts to come up with the answers — they trust us to figure out the questions we should be asking. We can only do this because we spend a lot of time working with teams, understanding users, and learning where the team is headed and why. By grasping the value proposition for each tool, we can help teams set achievable (yet ambitious) goals and measure their success. We’ve seen this time and again — at the end of the day, the best way to build a successful product is to create something, measure the heck out of it using both qualitative feedback and quantitative data, and then build something better.
In fact, sometimes asking the right questions allows us to build not just a better tool or feature, but a better experience entirely. This happened recently at HubSpot. From a 10,000-foot view, things seemed great: our qualitative feedback and NPS scores showed a fairly happy, growing user base.
But we had some additional questions about how users in different circumstances felt about HubSpot. As we sliced the data into groups of users that entered the product at different times and used it in different ways, we uncovered a pocket of really unhappy users. If a user’s first actions in the product took place more than 30 days after the account was created, they typically used the product less, and had a much lower NPS. This made sense — these were users who hadn’t experienced the in-depth onboarding process that the initial users at their company had gotten. This discovery spurred a huge, company-wide initiative to improve the experience for all new users — an insight that substantially shifted the focus of our whole organization.
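A cohort cut like the one described above can be sketched in a few lines. The data and column names here are hypothetical, purely to show the mechanics of the segmentation:

```python
from datetime import timedelta
import pandas as pd

# Hypothetical per-user data: account creation date, date of first
# meaningful action in the product, 90-day session count, and NPS score.
users = pd.DataFrame({
    "account_created": pd.to_datetime(
        ["2020-01-01", "2020-01-01", "2020-02-01", "2020-02-01"]),
    "first_action": pd.to_datetime(
        ["2020-01-05", "2020-03-01", "2020-02-10", "2020-04-15"]),
    "sessions_90d": [42, 6, 35, 4],
    "nps": [60, -20, 40, -40],
})

# Segment users by whether their first action came more than 30 days
# after account creation -- the cut described in the post.
users["late_start"] = (
    users["first_action"] - users["account_created"]
) > timedelta(days=30)

# Compare engagement and sentiment across the two cohorts.
summary = users.groupby("late_start")[["sessions_90d", "nps"]].mean()
print(summary)
```

With real data the pattern described above would show up as markedly lower average sessions and NPS in the late-start cohort; the value of the slice is that an aggregate NPS number can look healthy while hiding exactly this kind of unhappy pocket.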
Without data, teams miss an essential part of the picture. By providing a unique perspective on how people are using the platform, we product analysts have real opportunities to drive product strategy where the data guides us, and real influence over product decisions based on facts. And that influence means we have not just the ability, but the responsibility, to guide product decisions with data.
If this sounds like a team you’d like to work on, please get in touch. We’re looking for Product Analysts and a Head of Product Analytics who are passionate about the numbers and about working with teams to build the best products they can.