Chapter 3

How to write an effective usability testing script (+ example)

To get the most out of a usability study, it’s vital to create a usability testing script: a plan of what you’ll say, ask, and have participants do during each session. Here, we’ll show you how to write an effective usability test script.

Why you need a usability testing script

Usability testing is far more effective when you use a test script. There are numerous benefits to scripting out the test, including:

  • Being able to review the tasks and questions with colleagues beforehand
  • Having a roadmap for the things you’ll say, so you don’t have to think on the fly
  • Keeping the methodology consistent, with the same questions and tasks
  • Determining how long a session takes, so you can properly time the meetings

All in all, having a well-thought-out script for your usability test is a must.

Before you create the test script

First things first. You’ll need to determine the scope of your usability testing research.

Get a clear vision in your mind of what you want to test. In short, you want to map out the specific tasks you want the participants to complete.

You shouldn’t be going into this with a broad objective of “getting feedback on the product.” That’s too vague a goal: the results you get will be scattered, possibly inconsistent, and difficult to draw insights from.

Instead, the test should cover realistic actions that users will take with your product to help you detect usability problems and see if your product is easily understandable. A better goal, for a language-learning app, for example, would be “see if the user can start a new conversation in their target language.”

You also want to outline who the testing users will be, as well as how many test participants to include.

Finally, remember to get consent from these users beforehand. This is usually done with a consent form. And if you’re running a moderated usability test, make sure the participants are aware that you’ll be seeing their screen and hearing their voice while they’re testing your product.

Creating a usability test script

To make your usability test go well, your script should:

  • Have clear objectives, and create user tasks which test those objectives
  • Be reviewed by your peers, and modified based on their feedback
  • Be well-timed, ideally around 30 minutes. If this is the first usability test you’re running, err on the shorter side.
  • Be trialed with a colleague before it goes in front of real usability testers

Let’s break the actual test script down into four sections: Introduction, background questions, tasks, and the wrap-up.

Step 1 - Introduction

The first thing to do is to outline how this usability testing session is going to go. Especially if this is the test user’s first time participating in a usability study, it’s good to get on the same page. You can set expectations, let them know how long the process will take, and clarify any issues they might have.

A usability test isn’t performed by robots: it’s people completing tasks while being observed by other people. Back-and-forth communication is possibly the most important part of a usability test. So, you want your test users to feel comfortable with you before they start the test itself, and getting off on the right foot as early as the introduction is the best way to build a positive rapport.

This applies both before and during the test: the user should enter the test well-informed, and being able to communicate while they’re testing is also helpful.

So, introduce yourself and your team. Break the ice as much as possible. Try to find a place you’ve both visited, an activity you both enjoy, or shared career experiences.

You don’t have to invite them for dinner, but if you can build a nice rapport before the test starts, then their feedback will be more honest, and the results will be better for it.

Step 2 - Background questions

Once you’ve introduced yourself properly to your participants, you’ll want to get some background information on them. This information will help you understand why their experience went the way it did. Try to find out:

  • Their job, and the tasks they do in it
  • Relevant demographics (careful not to pry into irrelevant or overly personal details, though!)
  • Their experience with products similar to yours
  • Their experience with this usability testing platform
  • Any other data that’s pertinent to your testing

Try to keep questions about their experience as open-ended as you can. The objective here is to gauge their general level of understanding. For example: “We have a tool called ‘troubleshooting’. What do you think it would do?”

Tip ✨

Remind the users that you’re interested in testing the software, not them. They’re not being judged on their abilities. Clearly reaffirming this can put anxious testers at ease and help combat cognitive biases like social desirability bias.

Finally, you want to double-check that they’ve given their consent. You have a legal and moral obligation to ensure they’ve consented to this study, and that they understand fully what the study entails.

Step 3 - Scripting the usability tasks

This is the stage where the testing actually begins. There are countless variables that can make the test go better or worse, and a lot of this depends on your usability method. If you're using heuristic evaluation, for instance, your script may look a little different. This section is a general outline for how to write well-scripted usability tasks.

  • Stick to eight tasks at maximum. This might not seem like much, but even an eight-task test is going to give you valuable insights about your product. If you have more hypotheses to test—you definitely should, by the way—then test them in the next round of usability tests.
  • Your tasks should reflect realistic user goals. You’re not aiming to test the most niche use of your product, nor to find ways to break the program. That’s a job for QA.
  • Don’t tell them the path to take. Instead, tell them where you want them to arrive, and see if they can achieve it without your guidance.

Let’s use the example of a language app that matches native speakers with people learning their language:

  • Objective: User should start a conversation with a German speaker
  • Poor task description: "Go to the new conversation page, add German as the preferred language, and find an appropriate user to start a conversation with."
  • Better task description: "You want to learn German, so you signed up for a language app that matches you with native speakers. Find someone to speak German with who lives in your timezone."

The language you use when giving instructions matters, and the details are nuanced. Subtle hints in your wording can influence how users engage with the program. If participants feel you’re steering them to perform a task in a particular way, their experience of the task, and their perspective on it, will change.

Your tasks should also follow a logical and realistic sequence. You don’t want to send the users between pages at random to perform tasks. Instead, consider a sequence of tasks more like:

  • Send the user from the main page to a sub-page
  • Then, review some part of that sub-page
  • Finally, complete a new action within that page

If your tasks can’t be arranged into a neat sequence, then at least keep them clustered around a single element of the product.

Make these tasks as direct and plain as possible. Some other things to avoid:

  • A “salesy” description. Example: "Choose one of our powerful new AI-powered tools to check for grammar mistakes in the text." You’re not selling the product, so you don’t need to convince the user of anything. Be direct and clear with your language.
  • Language that could offend the user. Example: "You’re not smart enough to understand this page. Can you find the FAQ for answers?" Don’t make things personal, stay away from insulting language, and avoid potentially sensitive topics as much as you can.
  • Unnecessary backstory. Example: "Your wedding is coming up, and you’ve made a promise to get into shape for it. You tried jogging with your friends Jane and Mike, but you didn’t like it. You then heard about a good workout routine from your other friend Milos. You don’t have time to waste in the morning because you have to read the newspaper for your job, as well as making sure your children get to the school bus on time. So, finding a gym close to your office is important to you. Can you find 3 gyms close to this address?"

There is a difference between setting the scene and giving totally superfluous exposition. Tasks that resemble a real-life scenario are preferable, and they even help you gather more realistic data. But don’t give too many details, especially ones that aren’t actually relevant to the test.

Finally, make sure you’re able to communicate both ways during the test. Being able to record the voice of the user while they’re testing the product can be a huge boost. Hearing their thought process gives a whole new layer of context, and can better inform you of why things go wrong.

Step 4 - Wrap-up questions and feedback

Once the test has finished, the final step is to gather your test users’ more general thoughts on the process. This is the last set of usability testing questions you’ll put to your participants, and it’s your chance to capture any remaining details about their experience, so make the most of it!

Aim to ask follow-up questions about:

  • Their overall impressions of the product, and of the session
  • What works well/poorly in the product
  • Overall difficulties they had with the tasks
  • Any comments they wanted to add during the test but didn’t

Example of a usability test script

To finish off, we’ll give a brief example of a usability test script.

Let’s say you’ve created a program that helps users to find the perfect movie to watch. The program has a huge range of custom filters, like original language, scenes of violence, the emotional tone of the film, and so on. You can add profiles of your family to your account, and choose which filters to add for any particular search.

Your objective for the usability test in this example could be: the user can find a movie that’s suitable to watch with their young children. Your target test group would therefore be parents of young children.

Once you’ve set up the screen sharing, you’re ready to go.

Let’s look at an example usability test script for this movie-searching program.

Introduction

Hi Maria, how are you? I really appreciate you taking time out of your day to participate in this test. I am Susan and I’m a researcher at Maze.

(If you’re with colleagues, introduce them to the test user, too.)

So, let me outline how this will go. I’d like to start by asking you some questions about who you are, your background, and your relevant experience. I will then ask you to perform some tasks in our movie-finding program. Once the tasks have been completed, I’d like to get some feedback from you about your experience with the program.

We’re doing this usability test to see how users interact with our software, and to hear their thoughts on it. We’re trying to make this the best it can be, so your honest thoughts are really important to us.

Is there anything you’d like to ask before we get going?

Finally, I’d like to make sure you’re comfortable with us recording today’s session. Is this okay with you?

Fantastic, so I’ll now start recording the audio and dive into some background questions.

Background questions

So, Maria, could you tell us what your current job title is, and a brief overview of what your job entails?

(Here, you’ll want to get through the list of background questions you have prepared.)

Tasks

Thank you for your answers. We’re now ready to start the test. Before we begin, I’d like to remind you of a few things.

First off, remember that we aren’t testing you today, we’re testing our program. So if something isn’t working, don’t worry, it’s a problem with our software and not something you’ve done wrong. In fact, there are no wrong answers here.

When using the program, try to act as naturally as possible. I get that it’s hard to do that with us watching your screen. But, please try to act as if you were using the app on your own, without anyone watching.

Please think aloud as you’re using our program. We really want to hear your thoughts: where you’re navigating on the page, why you’re clicking somewhere, what you expect to happen when you do click, that sort of thing. If you have questions, feel free to ask me, and I’ll answer as many of them as I can.

Finally, we’d like you to be as honest as possible. If something doesn’t make sense on the page, or it’s not working right, then feel free to tell us. You’re not going to hurt our feelings, so don’t worry about that.

Great, so let’s begin. I’ll now start recording your screen.

  • Please take a look at the main page, and tell me what you’re seeing.
  • Okay, now I’d like you to create a profile for your child. You don’t have to use real information, you can create fictional details.
  • Next, I’d like you to set some filters. You only want to see results that are suitable for kids.
  • Now, please find a movie that your child would enjoy watching.

Wrap-up

And that’s the final task finished! I’ve stopped recording your screen.

Before we finish, I’d like to ask you a few quick questions.

Firstly, what did you think of the homepage?

(Continue to go through the list of wrap-up questions, as well as any questions specific to this user about actions they took).

Thank you for that. Is there anything you’d like to add before we finish up?

Fantastic. Well, thank you again so much for taking the time out of your day to take part in this study with us. Your input today will be extremely useful for us. Take care, I hope to speak to you soon. Goodbye!

Frequently asked questions about usability testing scripts

What is a usability testing script?

A usability testing script is a plan of all the actions that you’ll need to perform to run a successful usability test. A script allows you to plan the usability tasks and questions and review them with your colleagues beforehand. It also acts as a guide during the test, allowing you to keep the methodology consistent and time the sessions properly.

How do you write a usability testing script?

When writing a usability testing script, we recommend following these four steps:

  1. Introduce yourself and your team to your test users and tell them how the usability testing session is going to go
  2. Prepare some background questions to get to know your participants and their level of knowledge about the product
  3. Script the usability tasks. Try to stick to eight tasks at maximum and include tasks that reflect realistic user goals and follow a logical sequence.
  4. Prepare a list of follow-up questions to gather insights and details about the test users' experience

How do you document usability tests?

At the end of the usability testing process, make sure you create a final report to share your results with the rest of your team. The report should include a brief overview of the usability test you ran, your research goals, the methods you used to perform the test, information about each test participant, and, most importantly, your findings. You can find detailed information on how to analyze and report usability test results in the final chapter of this guide.