Chapter 3

UX survey analysis: 5 steps to uncover insights and extract value

It’s the moment you’ve been waiting for: the results are in.

You’ve effectively written your UX survey questions, distributed the survey, and now you’ve got your feedback. All that’s left is to make sense of a vast amount of data and turn it into actionable insights. Easy, right?

In this chapter, we’re covering how you can analyze UX survey data by identifying trends, sentiments, and insights—and what key steps you need to consider before and after the process.

Points to consider in your UX survey analysis

Before diving into your data, you’ll want to consider a couple of things to help you optimize your approach for efficient and effective analysis.

Revisiting your objective and project scope

At the beginning of designing your UX survey, you’ll have identified an overarching research plan, goal, and project scope. When analyzing your data, revisit the objectives you outlined at the start of the process. Doing so will help you spot relevant insights in your data and connect the dots, bringing you closer to your goal.

Revisiting objectives will also help you prioritize the most relevant user issues. For example, if your goal is to increase clicks on your CTA button, then you’ll prioritize insights indicating users have trouble finding it, over insights on search bar functionality.

Qualitative vs. quantitative data analysis

The type of data you’ve collected is another key consideration, and it greatly impacts your approach, your analysis technique, and your wider UX reporting.

Qualitative data is verbose. It’s crucial for a deeper, more nuanced understanding of your users. Analyzing qualitative feedback from open-ended questions involves identifying key themes, concepts, narratives, and observations.

On the other hand, quantitative data is numerical. It’s ideal for precise, measurable results and hypothesis testing on your chosen UX survey subject. Analyzing this data includes comparing variables like usability metrics and calculating averages and percentages.

Use relevant tools for more efficient data analysis

As a UX researcher, you can opt to manually analyze both qualitative and quantitative data, but it’ll cost you time and effort that’s better spent actioning your insights. Using a UX data analysis solution can help automate much of the analysis process, speeding it up and freeing you up for other crucial tasks.

Ideally, you want to look for a flexible tool that offers all the data analysis solutions you need. If that same tool also offers complementary UX research methods that help with other aspects of the UX research process, like usability testing and card sorting, even better.

We’ll discuss UX survey tools in-depth later on in this guide. Check out chapter four if you’re unsure where to start, weighing up your UX solutions, or wondering if you made the right choice.

Uncover trends and sentiments in minutes

Gain insights, make user-driven design decisions, and launch your UX design forward with Maze’s intuitive UX survey solution.

How to analyze UX survey data in 5 steps

With the considerations above in mind, you’re ready to begin analyzing your UX survey responses. These steps will help you turn your raw survey data into decision-driving insights on your product.

1. Collect your UX survey data

Interpreting vast amounts of data—whether qualitative or quantitative—can be messy and complicated when incorrectly organized. If you’re using a UX survey tool like Maze, your data will already be consolidated on one interface, making survey feedback collection more efficient. A good UX survey tool will also enable you to analyze survey feedback within the platform, which saves a lot of time and effort (more on that in chapter four).

If you need to manually analyze the data you’ve collected, the first step is pulling your results from the survey platform into an external repository, such as Excel or Google Sheets. Once you know what you’re working with, your data is ready for exploration, review, and analysis—putting you closer to those valuable insights.
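
If anyone on your team is comfortable with a little scripting, you can also pull that exported file into a notebook alongside (or instead of) a spreadsheet. Here’s a minimal sketch in Python using pandas, assuming a hypothetical export file called survey_export.csv with one row per respondent—the file and column names are illustrative, not a fixed format:

```python
import pandas as pd

# Load the raw survey export (file name and columns are illustrative)
responses = pd.read_csv("survey_export.csv")

# Get a quick feel for what you're working with
print(responses.shape)         # respondents x questions
print(responses.head())        # preview the first few rows
print(responses.isna().sum())  # how many blanks each question has
```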

2. Clean corrupted and unusable data

Not all data is good data.

Your response rate might be high, but that doesn’t guarantee all your data will contribute to your pursuit of actionable user insights. You’ll have to identify and weed out any corrupted or unusable data contaminating your repository (we’ve included a quick scripted cleanup sketch after this list):

  • Remove incomplete responses: Participants may provide partial responses, unintelligible text, or leave certain questions unanswered due to fatigue or technical issues. Remove these responses from your data set for better quality.
  • Get rid of straightliners: Straightlining is a UX survey phenomenon most common with multiple-choice questions. If the same participant answered “B” on each of your ten multiple-choice questions, they most likely weren’t answering honestly but just trying to get through. Remove this data as it’s not genuine.
  • Lose the speed repliers: Take a look at the average time participants take to complete your survey. If, on average, the survey takes about 4.5 minutes to complete and you’ve got responses that took dramatically less time, remove them from your database.
  • Eliminate the outliers: Identifying outliers can be tricky. These are data points that deviate from the rest on a single subject. Sometimes, outliers can just be unique responses and honest feedback, but they can also be answers provided at random. Look for contradictory answers from a single respondent—e.g. if a respondent notes in an open-ended question that they had no issues with usability, but then rates usability as ‘very difficult to use’, you’re dealing with an outlier that should be removed.
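
If you’re working from a spreadsheet or CSV export, much of this cleanup can be scripted. Below is a minimal sketch in Python with pandas; the file name, column names, and time threshold are all illustrative assumptions rather than fixed rules:

```python
import pandas as pd

responses = pd.read_csv("survey_export.csv")  # hypothetical export file

# 1. Remove incomplete responses: drop rows missing answers to required questions
required = ["q1_usability_rating", "q2_open_feedback", "q3_nps"]  # illustrative columns
cleaned = responses.dropna(subset=required)

# 2. Remove straightliners: respondents who picked the same option on every multiple-choice question
mc_cols = ["q4_choice", "q5_choice", "q6_choice", "q7_choice"]
straightliners = cleaned[mc_cols].nunique(axis=1) == 1
cleaned = cleaned[~straightliners]

# 3. Remove speed repliers: anyone far below the typical completion time
median_time = cleaned["completion_time_seconds"].median()
cleaned = cleaned[cleaned["completion_time_seconds"] >= 0.3 * median_time]

print(f"Kept {len(cleaned)} of {len(responses)} responses after cleaning")
```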

Once you’ve made your data squeaky clean, it’s time to get stuck in.

You likely used different survey question types for your UX survey. This is the ideal technique to get insightful feedback, but it also means you’ll have qualitative and quantitative data to analyze.

Both are valuable, but both have a unique analysis approach and require their own time and focus to extract value. Let’s start with qualitative data.

3. Explore, code, and sort your qualitative data into themes

Explore your qualitative data by conducting thematic analysis to code it, and sort it into themes. This is when you’ll first begin to see the fruits of your UX research labor. Begin by reading through your responses to get a general overview of the feedback. While combing through, note any points of interest relevant to your project goals.

Assign indexes or ‘codes’ to these answers to identify points you’ll return to. A code shouldn’t interpret the responses but simply summarize relevant answer segments in as few words as possible. Here’s an example:

“I don’t like your smart suggestion feature all that much. Sometimes, it gives me rubbish responses, and the pop-up bubble follows my cursor, making it difficult to complete tasks.”

The first code could be: ‘doesn’t like smart suggestion feature’, while the other two complaints could be coded as ‘smart suggestion gives bad responses’ and ‘smart suggestion makes task completion difficult’.

Once you’ve sifted through all your data and coded any relevant points, move on to sorting them into overarching categories or themes. All the codes above could be sorted into the ‘smart suggestion feature usability’ theme category, for example.

If you’re not using a tool that can code and thematize your data, we recommend going the pen-and-paper route with sticky notes, so you can easily group them and make changes when necessary.
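
If your codes live in a spreadsheet instead, a few lines of scripting can tally how often each code and theme appears, which comes in handy when you move to synthesis in the next step. A minimal sketch, assuming a hypothetical codes.csv with one coded response segment per row and illustrative column names:

```python
import pandas as pd

# Hypothetical file: one row per coded response segment
# columns: response_id, code, theme
codes = pd.read_csv("codes.csv")

# Which codes make up each theme, and how often does each appear?
print(codes.groupby("theme")["code"].value_counts())

# Rank themes by how many distinct respondents mention them
print(codes.groupby("theme")["response_id"].nunique().sort_values(ascending=False))
```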

4. Synthesizing data: from codes to themes to insights

So you’ve read through your data, coded it, and identified themes. Next, it’s time to convert these themes into meaningful insights for your UX team. Start with themes you consider a priority for your overarching goal.

A good characteristic of important themes is how frequently they’re mentioned and the sentiment behind them.

Gabriella Lopes
Product Designer at Maze

If your main objective is to identify friction points in new feature usability, then the theme ‘smart suggestion feature usability’ outlined in the codes above is a great place to start.

Go through all your codes to identify every time users complained about the smart suggestion usability, then repeat the process for other code groups until you’ve identified several themes in your data. Once you’ve got your themes, you can further interpret people’s words to develop insights. Your themes can also be a great starting point for further user research.

Using our example, here’s a clear view of how we moved from codes to insights:

  • Codes: ‘Don’t like your smart suggestion solution’, ‘It provides bad answers’, ‘It makes it hard to complete tasks’
  • Theme: Smart suggestion feature usability
  • Insight: The smart suggestion feature needs improvement, as users find it doesn’t function as required

💡 Want an in-depth look at thematic analysis? Check out our dedicated article on analyzing qualitative data with thematic analysis.

5. Analyzing quantitative data for measurable trends

Now we’ve covered the ins and outs of qualitative data, let’s move on to analyzing the numerical, quantitative data derived from closed-ended questions like multiple choice and rating scales.

While quantitative data doesn't give you the same rich and nuanced insights as qualitative data, it’s equally valuable. Quantitative data analysis helps identify measurable trends and uncover key satisfaction scores like Net Promoter Score (NPS) or Customer Satisfaction Score (CSAT).
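
NPS is a good example of how simple these calculations can be: it’s the percentage of promoters (scores of 9–10) minus the percentage of detractors (scores of 0–6). Here’s a minimal sketch in Python, assuming a hypothetical cleaned export with a 0–10 ‘how likely are you to recommend us?’ column:

```python
import pandas as pd

cleaned = pd.read_csv("survey_cleaned.csv")  # hypothetical cleaned export
scores = cleaned["nps_score"]                # 0-10 recommendation ratings

promoters = (scores >= 9).mean() * 100   # % of respondents scoring 9 or 10
detractors = (scores <= 6).mean() * 100  # % of respondents scoring 0-6
nps = promoters - detractors             # passives (7-8) are left out of the score

print(f"NPS: {nps:.0f}")
```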

Quantitative data is great for identifying the what, and qualitative is great for identifying the why.

When dealing with a small data sample, use a spreadsheet to calculate important stats like averages and percentages. You can then identify user sentiment, usability, and customer satisfaction trends and correlations in your survey data.

For example, you may find that 70% of your respondents have found your website hard to navigate since you released a new feature. If many of those same respondents also rated the feature itself as unhelpful, you can test whether there’s a statistical correlation between the two ratings.
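
If you want to check a relationship like that numerically, a short script can do it. A minimal sketch, assuming hypothetical 1–5 rating columns for navigation ease and feature helpfulness in your cleaned export:

```python
import pandas as pd

cleaned = pd.read_csv("survey_cleaned.csv")  # hypothetical cleaned export

# Share of respondents who rated navigation 2 or lower on a 1-5 scale
hard_to_navigate = (cleaned["nav_ease_rating"] <= 2).mean() * 100
print(f"{hard_to_navigate:.0f}% of respondents found the site hard to navigate")

# Correlation between navigation ease and how helpful users found the new feature
correlation = cleaned["nav_ease_rating"].corr(cleaned["feature_helpfulness_rating"])
print(f"Correlation between the two ratings: {correlation:.2f}")
```

Keep in mind that a correlation only tells you the two ratings move together; it doesn’t tell you which one, if either, causes the other.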

Manual calculation can take a lot of time before meaningful trends emerge, so when it comes to identifying measurable trends based on large amounts of quantitative research, we recommend using a UX survey analysis tool like Maze to get the job done.

Transform your data into trackable trends and actionable insights

Maze makes automated reports based on your UX survey data, giving you key metrics that matter for actionable insights.

Types of UX survey analysis insights

UX surveys generate a vast number of insights, which tend to fall into many different categories and sub-categories. However, broadly speaking, the UX insights you uncover through survey analysis will be one of three kinds:

  • Survey analysis that uncovers sentiment: These insights help gauge a user’s overall feelings towards your product. The analysis uses themes and patterns to identify any emotions your users express through open-ended responses.
  • Survey analysis that uncovers usability insights: These insights help you understand how users interact with your product. Do they find it easy, difficult, or frustrating—and why?
  • Survey analysis that uncovers satisfaction sentiments: These insights give you an idea of how users feel about the overall user experience. This could be related to your product, brand, or team.

While these aren’t the only analysis and insight types, they’re the ones you’re most likely to collect from a UX survey. Regardless of insight types, watch out for some common issues during the UX analysis process.

Common mistakes to avoid in your UX survey analysis

Analyzing your data correctly is critical to achieving accurate insights. Otherwise, you could draw skewed conclusions and implement the wrong changes to your product. Here are three common mistakes to avoid when analyzing your UX surveys.

1. Using poor data collection and organization practices

The larger your data set is, the more you’ll need to keep it neat for efficient analysis. Things like inconsistent data formatting, duplicated entries, and corrupted or unclean responses can all contaminate your data set, leaving you with inaccurate insights and findings that don’t reflect your users' real experiences.

Consolidating and keeping your data neat and tidy is much easier with a UX survey tool, as the right UX research tool allows your team to easily access, read, and navigate entries. For larger data sets of 1000+ respondents, consider using a specialized UX survey platform rather than exporting to an external database.

2. Asking questions to confirm your preconceptions

When conducting a UX survey, you want to be as objective as possible and avoid any type of cognitive bias. Confirmation bias—the tendency to ask questions that seek answers confirming your preexisting beliefs—is your particular nemesis in UX surveys. If let in, it stealthily skews your data and insights and can jeopardize your whole project.

Confirmation bias doesn't just occur when formulating questions. It can also slip in when interpreting open-ended responses. Some UX researchers may fall into the trap of (subconsciously) selecting and focusing on feedback that agrees with preconceived beliefs, while ignoring responses that don’t.

To avoid confirmation bias when analyzing data, ensure you’re looking at all responses, and not disproportionately favoring certain findings over others. It helps to have a wide variety of UX researchers analyzing data, ensuring data integrity and objectivity standards are met. Needless to say, using a UX survey analysis solution is also a quick-fix way to avoid subconscious bias creeping into your data analysis process.

Pro tip 💡
Even the best UX researchers can fall prey to cognitive biases. Utilize the technology available to avoid and identify biases. Maze’s AI-powered Perfect Question helps you spot and eliminate bias from your research questions, while providing bias-free, re-phrased alternatives.

3. Jumping to conclusions based on quantitative data

With new data on your user’s experience, you may be eager to implement changes in UX design. However, it’s important not to jump to conclusions. While quantitative data can help identify issues and trends, it doesn’t uncover nearly as much context as open-ended questions.

For example, analyzed data might show that 75% of users aren’t satisfied with your e-commerce platform’s product search usability. It might be tempting to address the issue outright—but don’t touch that algorithm just yet. The problem could stem from poor product categorization, a confusing interface, or many other reasons. Only further research will help identify the root of the problem.

It’s always best practice to hold off on acting until you understand the why behind a problem, to avoid wasting resources on the wrong fix, or simply putting a band-aid on a bigger issue.

What should you do once UX survey analysis is done?

With your analysis complete, the only thing left to do is create a UX research report to share your insights with stakeholders and get any necessary buy-in for proposed solutions.

If you’ve concluded that a particular feature is lacking or unusable, you and your team can figure out how to solve the issue the user is facing. Of course, any changes should be informed and validated by further research. The best research happens continuously, so you can inform future decisions, monitor user experience, and build a truly user-centered product.

Continuous product discovery and comprehensive UX analysis are essential for staying user-centric and understanding your users’ unique experiences. However, analysis can also be time-consuming for even the most savvy UX team. Choosing the right tool to support you can make the process more efficient, allowing you to reach insights quickly, with little chance for error.

In the next chapter, we’ll explore how to select an adequate tool for designing, conducting, and analyzing UX surveys, so read on to find your perfect match.

Frequently asked questions about UX survey analysis

How do you analyze a UX survey?

You analyze a UX survey by collecting your data in one place, reviewing it, and assigning codes and themes before extracting findings and insights. UX survey tools also help research teams calculate measurable trends, user sentiment, usability scores, and customer satisfaction.

What are the most common mistakes in UX survey analysis?

The most common mistakes in UX survey analysis include not organizing and preparing data properly, selectively focusing on information that confirms your hypothesis, and jumping to conclusions or implementing design changes before fully reviewing qualitative data.

How do you use survey analysis to improve UX?

Improving UX through survey analysis involves using the insights extracted from your data to create and implement potential UX solutions. For example, if survey results indicate users cannot find a purchase button, researchers can investigate why, and UX designers can make the purchase button more visible or accessible.