
We often think of product analytics in the context of making existing products better. Want to find out whether moving a button a few pixels to the left will lead to an increase or drop off in conversion rate? Move the button, set up a tracking event on it, and soon you’ll have the data to find out.

But product analytics can also influence and validate product decisions when building something completely new. By combining customer interviews and qualitative feedback with insights from product analytics, you don’t just learn what your users think of your new feature—you learn how they’re actually using it. And as Diego Sanchez from social media SaaS company Buffer points out, this is not always the same thing.

As a product manager who’s been leading the development of Buffer’s new Engage suite of features, Diego has experienced the benefits of using product analytics during product development firsthand. We spoke to Diego for some takeaways on how product analytics help his team build successful new features to expand Buffer’s toolset for social media marketers.

1. Hypotheses are everything—so take your time

As with any kind of usability testing or data science work, writing an airtight hypothesis is vital to getting valuable insights from product analytics. So before you start thinking about what user behavior you want to track, Diego recommends spending a decent amount of time thinking about the problem you want to solve—and how to phrase it:

A common pitfall with product analytics is getting your hypothesis wrong. I would spend the majority of my time thinking about the question I want to ask. Once you have a solid hypothesis, defining a tracking plan should be simple.

Diego Sanchez, Product Manager at Buffer
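To make that concrete, here’s a minimal sketch of what a tracking plan derived from a single hypothesis could look like. The hypothesis wording, event names, and metric below are illustrative assumptions, not Buffer’s actual plan.

```python
# A hypothetical tracking plan: one hypothesis, the events needed to test it,
# and the single metric that should confirm or refute it.
tracking_plan = {
    "hypothesis": (
        "Including an engagement solution increases the value we provide, "
        "measured as improved retention"
    ),
    "events": [
        {"name": "engage_opened", "properties": ["user_id", "timestamp"]},
        {"name": "reply_sent", "properties": ["user_id", "timestamp", "channel"]},
    ],
    # Assumed definition: share of new users who send at least one reply in their first week
    "primary_metric": "activation_rate",
}

# Once the plan is written down, instrumentation follows from it
# rather than the other way around.
for event in tracking_plan["events"]:
    print(f"Track '{event['name']}' with properties {event['properties']}")
```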

How you approach thinking about your hypothesis will be different for a new digital product or feature than for an existing one. You might have to pivot quite hard after some initial feedback. For Buffer’s new Engage features, Diego’s team started by defining broad hypotheses based on their fundamental assumptions. “Our primary hypothesis was: 'Do customers need an engagement solution when using a suite of social media products?' Then after that: 'Will including an engagement solution increase the value that we're providing for our customers?'," says Diego.

The latter hypothesis could only be proven after launch with solid quantitative data. But it gave Diego and the team a simple metric to aim for with their Engage features: increasing user retention. And after coming up with broad initial hypotheses, they were able to break each one down into smaller, more detailed assumptions that could be answered with users’ help.

2. Talk to users to guide your product analytics

Using product analytics to effectively track user behavior starts with getting to know your users. This is important for any product analytics, as you need to understand your users’ goals to know which events to track and optimize. But it’s especially vital for product teams developing new features, which always means taking more risks and considering new use cases.

Even if your new feature ideas have come from looking at product analytics data—a good source of inspiration—you should validate these ideas by speaking to users directly. “Early in the development process, we focused on building a very close relationship with a group of users. By forming an understanding of our customers’ needs, we defined three further core assumptions for building Engage,” adds Diego.

These assumptions were:

  • To boost their engagement on social media, our customers want to respond to everything.
  • Our customers want to get to what’s important first (e.g., angry or negative comments).
  • Being able to engage at speed is important to our customers.

With these assumptions defined, Diego and the team continued to test their prototypes and iterate with the same group of users. This helped them figure out whether their initial product ideas were meeting their customers’ needs, which guided their decision-making.

“At this point, we relied on interviews. We weren’t collecting a significant amount of product data. So instead, we kept talking to users and gradually rolling out our designs to more customers,” explains Diego.

The team asked questions like: ‘How would you describe the problem this solves for your company in your own words?’ Then they waited to see if customers mentioned any of their core assumptions. The responses were encouraging: they were on the right path.

With this initial qualitative data, the team was able to move fast early on. But next, they had to validate Engage with some numbers.


3. Track early and with precision for maximum insights

You might have to wait a while until you have enough data from product analytics to draw solid conclusions. However, Diego thinks it’s never too early to set up tracking with whatever product analytics platform you use. “Even if the data isn’t useful yet because less than 60 people are using your new feature, it’s still a good idea to start collecting as early as possible,” he explains.

The team started tracking user actions as soon as they had their hypotheses. This meant they could always get a rough, real-time idea of whether the feature was working or not, even without a lot of data. Diego points out another benefit of tracking early:

Setting up product analytics from the start allows you to continuously improve the tracking process. You might forget to track a certain thing, then realize there’s still a secondary hypothesis you want to measure.

Diego Sanchez, Product Manager at Buffer

He experienced this firsthand with a small feature in the Engage suite: the emoji picker. Based on customer interviews, the team built a feature that automatically suggests emoji responses based on what your brand often uses in its messaging. The team initially viewed it as a nice but minor feature.

But when they checked product analytics, they saw it was hugely popular. The emoji picker wasn’t related to any of their core hypotheses, but having good tracking data led them to discover that it’s a key feature for their customers. It saves them a lot of time, and it’s really intuitive.
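As a rough illustration of how cheap it can be to add an event for a secondary hypothesis once basic tracking is in place, here’s a minimal sketch. The track() helper and event names are hypothetical stand-ins for whatever analytics SDK your team uses; this isn’t Buffer’s actual instrumentation.

```python
import json
from datetime import datetime, timezone

def track(event_name: str, user_id: str, **properties):
    """Record an analytics event; swap the body for your analytics platform's SDK call."""
    payload = {
        "event": event_name,
        "user_id": user_id,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        **properties,
    }
    print(json.dumps(payload))  # placeholder transport instead of a real SDK

# Events defined at launch for the primary hypothesis...
track("reply_sent", user_id="u_123", channel="instagram")

# ...and one added later, once emoji picker usage became a secondary hypothesis worth measuring.
track("emoji_suggestion_used", user_id="u_123")
```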

However, while it’s smart to track events related to secondary hypotheses, you should avoid tracking everything. According to Diego, precision pays off. “Collecting too much product analytics data is confusing. If you have a giant dashboard with many data sources, that might show that your question isn’t quite clear. You don’t need to track every click—just a clear primary and secondary metric is often enough,” says Diego.

For Engage, the team tracked the Activation Rate to measure their primary hypothesis about increasing retention (or decreasing churn). Then for secondary hypotheses related to specific features, they picked one metric for each to measure how people were using them. Here are a few examples; a sketch of how the first could be computed follows the list:

  • Number of customers replying to more than 10 conversations in a single session with Engage
  • Number of customers using hotkeys to navigate the app
  • Number of customers using Engage’s new automatically added labels to decide which post to view next
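Here’s a rough sketch of how that first metric could be computed from raw events. The event shape (a customer_id and session_id per reply) is an assumption for illustration, not Buffer’s actual data model.

```python
from collections import defaultdict

# Hypothetical reply events: each records which customer replied and in which session.
reply_events = [
    {"customer_id": "c1", "session_id": "s1"},
    {"customer_id": "c1", "session_id": "s1"},
    {"customer_id": "c2", "session_id": "s2"},
    # ...more events from your analytics export...
]

def customers_over_threshold(events, threshold=10):
    """Count customers who replied to more than `threshold` conversations in at least one session."""
    replies = defaultdict(int)
    for event in events:
        replies[(event["customer_id"], event["session_id"])] += 1
    return len({customer for (customer, _), count in replies.items() if count > threshold})

print(customers_over_threshold(reply_events))  # 0 with this tiny sample
```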

4. Close the customer-product feedback loop

If you set up product analytics to collect data from the beginning of the product development process, you’ll naturally reach a point where your data becomes more significant. The longer you’ve been tracking and the more users you have trying your new feature, the more useful your data becomes. Once Diego’s team had rolled out a beta version of Engage to a larger segment of their user base, they found themselves checking product analytics more often.

“When you have all the tracking in place, at some point there’s a switch. We started relying more on the usage data, and a little less on interviews,” mentions Diego.

Making the transition to validating your initial hypotheses with product analytics data is a vital step in closing the loop between customer feedback and actual behavior, which don’t always match.

Your customers might be saying: “Yeah, we love it!” But then you check product analytics and they’re not actually using the feature, or they used it once and didn’t come back.

Diego Sanchez, Product Manager at Buffer

With the Engage feature, the Buffer team was pretty confident in their hypotheses after conducting extensive customer interviews within their target demographics. But the data also showed that they’d executed the user experience correctly—something they couldn’t be sure of without product analytics. “Maybe you have the right hypothesis and understanding of your customers, but what you’ve built is hard to use or not intuitive. Product analytics lets you close the loop by showing whether your users can actually use the solution you built,” adds Diego.

With Engage being rolled out to more and more users, Diego’s team started to track other KPIs like ‘Weekly Active Users’ and ‘Week on Week Growth’ to see if people were really getting value from the new features.
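As an illustration, both KPIs can be derived from the same usage events: weekly active users counts distinct users with at least one event in an ISO week, and week-on-week growth compares consecutive weeks. The events below are made up for the sketch.

```python
from datetime import date

# Hypothetical usage events: (user_id, date of activity).
events = [
    ("u1", date(2020, 9, 7)), ("u2", date(2020, 9, 8)),
    ("u1", date(2020, 9, 14)), ("u2", date(2020, 9, 15)), ("u3", date(2020, 9, 16)),
]

def weekly_active_users(events):
    """Map each ISO week to the number of distinct users active in it."""
    weeks = {}
    for user_id, day in events:
        week = day.isocalendar()[:2]  # (year, ISO week number)
        weeks.setdefault(week, set()).add(user_id)
    return {week: len(users) for week, users in sorted(weeks.items())}

wau = weekly_active_users(events)
ordered_weeks = list(wau)
for prev, curr in zip(ordered_weeks, ordered_weeks[1:]):
    growth = (wau[curr] - wau[prev]) / wau[prev] * 100
    print(f"{curr}: {wau[curr]} WAU, {growth:+.0f}% week on week")
```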

But even if the numbers look good, Diego still likes to interview customers occasionally at the end of the process to validate their insights from product analytics. This helps the team make sure they haven’t missed any opportunities in their analysis.

5. Prepare for the data to tell a story you don’t expect

When Diego found out he’d be leading the development of a brand-new feature, he was excited. But he was also ready to experience some failures along the way. “I was prepared for us to be way off with our assumptions. And with new products, that’s OK. But we got lucky with Engage—we haven’t had to pivot much,” says Diego.

Diego emphasizes that this isn’t always the case. Often product analytics can give you a very different picture from what your users are telling you. Having good data lets you ask good questions about your product, which can lead to new discoveries later in the product development process. It can highlight aspects of your product you didn’t think were important—like the emoji picker.

How did the Engage team avoid major surprises from product analytics data later down the line? Diego says it was the customer interviews that set them on course for success: “We had really good interviews with some customers. They really helped us shape our assumptions that we built Engage on. The fact that our analytics data validated our decisions was the result of that previous work.”

Product analytics are most effective when combined with qualitative research, usability testing, and a set of robust hypotheses. With every action from onboarding to unsubscribing easily trackable these days, it can be tempting to build a picture of your users purely through observing all user interactions across the customer journey.

But if you build product analytics into a development loop of user feedback and iterative testing, you’re going to unlock more value for your customers—which, as Diego says, “is ultimately what product management is all about.”
