Usability testing is a crucial step in making sure your app is straightforward, intuitive, and easy to use.
As a designer or developer, it can be easy to think that your design is simple because you’ve worked on it for so long. Usability testing helps validate your ideas, spot and fix cognitive biases, and understand user behaviors and pain points.
There’s just one catch.
For your user testing to work effectively, you have to ask effective usability testing questions.
Good usability testing questions bring out the most valuable insights from your test participants without leading them in a certain direction, creating biases, or overwhelming them.
Here at Bubble, we’re big fans of usability testing — both as we build our own no-code tool, and to help Bubblers test their own apps built on Bubble. With more than 3 million apps built on Bubble’s platform, and after decades of research experience from our own UX team, we’re here to walk you through how to create the best usability testing questions for your own product.
In this article, we’ll show you the best practices for writing user testing questions and give you 57 example usability test questions you can use to get started.
How to write good usability testing questions: 6 best practices
Learning how to write usability testing questions is as much an art as a science.
Although it can help to get started with some sample questions, understanding some of the best practices for usability testing questions will help you make your usability test more effective.
Ask open-ended questions
Open-ended questions are broad. They don’t prompt for a specific answer, and they leave space for the user to share their thoughts and feedback.
They’re generally much better than yes/no questions: they let participants share openly rather than hitting a dead end.
For example, compare how the same question looks in a yes/no format versus an open-ended format: “Was the checkout process easy?” invites a one-word answer, while “How did you find the checkout process?” invites the participant to elaborate.
Ask test participants to think out loud during the test
To get the most value out of your usability test, ask questions that get participants to “think out loud.” Not only will this give you more insight into user behavior, but it can also provide helpful insight that you can dig into further with a great follow-up question.
“Thinking out loud” can look like:
- Explaining what they’re thinking as they move through a task
- Asking questions out loud as they come across anything unclear (you don’t have to answer them!)
- Commenting on each screen or page as they navigate to them
Avoid leading questions
Leading questions are a big no-no during user testing of any kind.
Good usability questions are open-ended and help the participant share their honest opinions, thoughts, questions, and feedback. Leading questions subtly push the participant to respond in a certain way, which can bias test results.
A few examples of leading questions:
- Did you like the first screen better than the second? (Prompts the participant to agree with you)
- What did you think of the simple, clear welcome screens? (Biases participants that the welcome screens were simple and clear)
- How much did you enjoy the process of _____? (Assumes that they did, in fact, enjoy the process)
- How much easier was the second version? (Assumes the second version was easier than the first)
- Do you understand what the header is saying? (Participants may be embarrassed to say no if they don’t understand)
All of these questions subtly nudge the participant to respond in a certain way that may not accurately reflect the user's thought process.
Give all participants the same information
A great — but difficult — practice is to make sure all your test participants have the same information.
Create specific, detailed, and clear instructions for your test that all participants receive so that everyone is set up equally. Then, avoid providing more instructions or information during the test — even if participants ask for it.
For example, if a participant asks a simple question about the test — like, “Do I click this next?” or, “How do I choose which option I want?” — don’t answer! This can bias the results of your testing.
One of our user research managers at Bubble, Peter Leykam, was quick to advise against answering questions, even if it feels rude:
“When conducting usability tests, you really want to give people as little context as possible. You shouldn’t explain anything during the test itself, or even answer their questions as they arise. When people ask for an explanation, ask them instead what isn’t clear, or why they feel confused. When they ask clarifying questions, ask them what led them to ask, or what knowing the answer would help them do. At the end of the test you can explain anything you need to, but during the test itself you should tell them as little as possible.”
It can also help to remind test participants that you’re testing the design, not them! This can help people feel more comfortable trying different solutions, making mistakes, or giving you more honest feedback.
Ask different types of questions
Of course, you want to ask lots of questions to gather as much data as possible. One key to good usability questions: Switch it up! By asking different types of usability questions, you can dig deeper into your participants' answers, get more valuable data, and find the most important common threads and results.
For example, you can structure questions in all kinds of ways:
- Multiple-choice questions: Participants choose from a list of options
- Sliding scale: Participants choose their answer on a scale of 1–5 or 1–10
- Ranking questions: Participants rank options from best to worst, easiest to hardest, etc.
- Yes/no questions: Participants answer with a simple yes or no (best for clarification)
- Open-ended questions: Questions that don’t have a specific answer, and invite participants to share thoughts and feedback
- Follow-up questions: Questions that help the participant expand on a previous answer or task
- Probing questions: Questions that help you gain more information on why a participant responded or completed a task in a certain way
- This or that questions: Participants choose from one of two options (i.e., “this” or “that”)
Sometimes, certain questions will resonate more with one participant than another, so asking different types of questions helps you get the most data.
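If your team scripts its tests, the question types above can be captured in a simple data structure. Here’s a minimal Python sketch; the class and field names are illustrative, not from any particular survey tool:

```python
from dataclasses import dataclass, field

# Illustrative sketch: representing common usability question types
# in a simple test script. Names and structure are hypothetical.

@dataclass
class Question:
    prompt: str
    kind: str                      # "open", "scale", "multiple_choice", ...
    options: list = field(default_factory=list)

test_script = [
    Question("How was your experience completing checkout?", "open"),
    Question("Rate the clarity of the pricing page.", "scale",
             options=list(range(1, 6))),           # 1-5 sliding scale
    Question("Which layout did you prefer?", "multiple_choice",
             options=["Layout A", "Layout B"]),    # this-or-that question
]

# Mixing types, as recommended above, yields both qualitative and
# quantitative data from the same session.
open_ended = [q for q in test_script if q.kind == "open"]
```

A structure like this makes it easy to check that a session mixes question types rather than leaning on one format.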
Break down your questions to make them easier to answer
Sometimes when doing usability testing, there’s a general question or theme you want to ask about. However, asking broad questions doesn’t always yield the best results.
Instead, break them down into smaller, more specific questions to make it easier to get detailed answers from participants.
For example: “Why do you shop online?” is a super-broad question that might leave participants unsure of how to answer. Think about what you’re hoping to learn from participants' answers to this question, and then break it down accordingly.
This might result in questions like:
- When do you choose to shop online vs. in-person?
- What types of products do you buy online?
- What do you like most about shopping online?
- Which online retailers do you shop with the most? Why?
The more specific the question, the more specific the answers — which can help you get more valuable data.
57 sample usability testing questions to try
When you’re getting started with your test, it can help to keep some common usability testing questions on hand to give yourself plenty of options for your research.
We’ve broken down our sample questions by the four stages of usability testing:
- Screening (finding participants)
- Pre-test
- During the test
- Post-test
Think about when you’ll ask each question so you can gather valuable data at every stage.
Screening questions
Finding participants isn’t always as easy as it sounds. Some test participants will also be more helpful to you than others, so you want to screen your potential participants carefully.
For example, you want participants who fall within your target audience but have a variety of backgrounds, experiences, and knowledge about your product. You may also want to survey a specific demographic of your audience, or you might want a wide range of user demographics.
Screening questions allow you to create the testing pool that'll give you the right data. When screening, you might want to start with demographic questions, such as:
- What age group are you in?
- What’s your relationship status?
- What’s your household income?
- How would you describe your ethnicity?
- How would you describe your gender?
- What is the highest level of education you’ve completed?
- What's your occupation?
Providing ranges or multiple choices for these questions can make them a bit less personal and help participants feel comfortable.
You may also want to ask some questions about participants' background knowledge with your industry or product type. If you’re testing a prototype or early product, some good background screening questions are:
- When was the last time you did [target action]?
- What device do you use to do [target action]?
- How often do you typically do [target action]?
- What types of products do you use to do [target action]?
- How comfortable are you using an app to do [target action]?
- What experience do you have using [your type of product]?
The goal isn’t to ask all of these questions. Instead, use them as a starting point to understand how familiar users are with your product and solution, or similar solutions.
If your product already exists and you’re doing a usability testing session with current users, you might ask questions like:
- Have you ever used our product?
- If yes, how often do you use our product?
- If not, are there any products you use similar to ours?
- What other products similar to ours do you use? (A list of multiple choices can be helpful here)
- What features do you use most in our product?
- What’s the main task you do with our product?
- How often do you typically use our product?
Pre-test questions
Once you’ve screened and chosen participants, asking questions during each stage of the testing process can help you get more insights from your test.
A few pre-test questions can help you gauge the participants’ familiarity with your type of product and with the test they’re going to do. This can help you accurately chart the usability of your product for different types of users (i.e., “power users” vs. beginners).
Some pre-test example questions would be:
- How often do you use [your type of product] to do [target action]?
- The last time you did [target action], how did you complete it? This can give you clearer insight into how users typically solve this problem, and what other solutions you’re competing with.
- Have you ever used this [app or website] before?
- Which parts of the website do you use most often?
- What would make you decide to use [your type of product]?
- Can you describe your experience with [your tool / this type of tool] in the past? This can help you understand the background and experiences (positive and negative) of the participants.
These questions can help you understand your participants' experiences in the past. They can also help you understand users’ thought processes going into a task, what makes them use or choose different tools, and how effective other tools are in solving their problems.
Testing questions
Most importantly, you’ll want to ask good questions during a usability test. The “Goldilocks rule” definitely applies here: If you ask too many questions, you might overwhelm your participants or bias the results. If you ask too few questions, you won’t get as much insight.
One helpful theme is to ask participants about their opinions or feelings related to the design or layout of your product.
Although the user might be able to complete the task successfully, it’s always a good idea to ask questions about their opinions or expectations. There may be more helpful ways to present information or design the product, even if the current design works as-is. For example:
- What did you think of the explanations on that page?
- How do you feel about the information you received throughout that task?
- What’s your opinion on the way those features are laid out?
- What do you think about that pricing chart?
- Can you tell me what you think of _____ [any aspect you’re testing]?
You can also ask questions about specific tasks or the way users interact with your app.
- I see that you did _____. Can you say more about why you did it that way?
- Did you notice any alternative ways to do _____?
- If yes, why did you choose the option you did?
- If not, would there have been another way to complete the task that would have worked better for you?
- You seemed to hesitate on the final step of that process. What were you considering or thinking about then?
- How did you decide where to get started with this task? This can give you helpful insights into user expectations and thought processes to make your app more intuitive.
- How was your experience using this product to complete this task?
- Are there any steps you expected this process to have that weren’t included? This can help you spot-check for any oversights or steps that'd make the process smoother.
You can also use multiple-choice, sliding scale, or ranking questions to get more quantitative data during this phase. For example:
- On a scale of 1–10, how was your experience with _____ [any aspect of the task]?
- How was the process of [completing the task]? (Very confusing, somewhat confusing, somewhat clear, very clear)
- Can you rank the four screens that you saw in order from “most clear” to “most confusing”?
- Which of the following descriptors would you use to describe [the app/the task]? Give a large list of various adjectives to choose from. This can help create quantitative data from more open-ended responses.
- Which of these methods of completing this task do you prefer? Why? Mixing multiple-choice with open-ended questions can combine the best of both worlds!
For moderated usability testing, you can simply ask these questions as you progress through the test.
In remote testing, asking questions during the test is more difficult. Multiple-choice, sliding scale, and ranked choice testing questions are often easier to respond to in this case.
However, you can use open-ended testing questions during unmoderated testing. Just be sure to keep the questions broad (“What did you think about…?” or “How was your experience with…?”) and don’t ask too many — a few open prompts invite users to give you more detailed feedback.
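One payoff of the quantitative question types above is that responses are easy to summarize across participants. A quick Python sketch (the ratings here are made up for illustration):

```python
from statistics import mean, median

# Hypothetical 1-10 ratings from five participants for one task.
ratings = [7, 9, 6, 8, 7]

summary = {
    "mean": mean(ratings),
    "median": median(ratings),
    # Counting low scores can flag possible pain points to probe
    # with open-ended follow-up questions.
    "low_scores": sum(1 for r in ratings if r <= 5),
}
```

Pairing summaries like this with open-ended answers helps you see not just *that* a task scored poorly, but *why*.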
Post-test questions
Once the user testing is complete, you may want to end by asking a few questions about the overall experience or by gathering broader user feedback. With the usability test behind them, users may have more high-level suggestions to share.
You can ask specifically about any overall usability feedback they might have. For example:
- What do you think of the [app/website/product] overall?
- What did you like most or least about this product? Why?
- What’s one thing you'd change about the process of [completing the task they just did]? This invites constructive feedback, even if the participant has been positive about the process so far.
- What questions came to mind as you were completing the task? This helps you see where your workflow is confusing, unclear, or too complicated.
- How would you compare [this product] to [a competitor’s product they’ve used]? This helps you see how your product compares to other solutions your audience is using.
- Is there any other feedback or suggestions you’d like to share before we finish today? An open-ended question like this to finish gives participants a chance to share further feedback that you may not have thought to ask about.
You can also ask them to reflect on how they might or might not use this product in real life:
- How would you describe your experience with this product to a friend? This can sometimes help participants reflect on their experience more honestly, and give you better feedback on what they’re looking for.
- If this were a real app on your phone, how likely would you be to recommend it to a friend? Why or why not? A sliding scale can make this question easier to quantify.
- What do you think you'd use this product to do in real life? This can give you new feature ideas, as well as help gauge customer expectations for your product.
- Are there any features that'd make you more or less likely to use this product?
If they’re already users of your product, you can ask them how the feature or process they tested might change their usage of the product:
- If this feature were available in your app, do you think you'd use the app more, less, or the same as you do now?
- Is there anything you wish our app could do that it doesn’t currently?
- How was your experience of completing this task today compared to other tasks you use the app for? This can help you gauge whether the new task or feature is about the same, easier, or more difficult than other tasks users complete with your app.
Test and iterate faster with Bubble
At Bubble, we’re on a mission to make every step of the app development process smoother, faster, and easier.
From designing and developing your app, to doing prototype testing and usability testing, to launching and growing your app and audience, Bubble makes it better. Our full-stack, no-code development software allows anyone to build functional apps, no programming experience needed.
When you’re ready to test and launch your app, Bubble’s built-in testing tools, integrations, and custom APIs make it easy to integrate usability testing software right into your app and collect the data you need.
Ready to start building and testing your app? You can do it on Bubble for free until you launch.
Or just get more resources on building, testing, and scaling your apps. Our email newsletter and welcoming community give you the support you need to make your apps possible.