There's a saying often attributed to William Blake: "hindsight is a wonderful thing, but foresight is even better."
This phrase feels particularly apt for the world of user research.
After a research project is over, it's normal to look back over your study and analyze what went well and what could've been done better. This growth mindset helps us to optimize the research process.
But what if you didn't have to go through the process of learning from your mistakes? What if you could hit the ground running, confident that your usability tests will be impactful and valuable?
Well, that's exactly what this section is about. We've distilled our key learnings from conducting hundreds of usability tests into a handy list of tips and tricks. Forget the gift of hindsight! Here's the gift of foresight for you.
We can't emphasize enough how vital preparation is to successful usability testing.
The template and checklist we shared in the last chapter should form the foundations of your approach.
Alongside these, we recommend considering a few other factors.

Recruit the right participants

Inviting the *right* people to participate in your study is crucial to uncovering beneficial insights.
If the people you recruit aren't the sort of people who will use your product in the first place, the whole test will be a waste of time or, even worse, could steer your product decisions in the wrong direction.
So use screener questions to whittle down potential participants to the most valuable candidates. Beyond persona-based criteria, add screener questions covering your technology requirements. For example, do participants need to use a specific browser?
We're also strong advocates of over-recruiting by at least a few participants—just in case people drop out before the test!
Prepare your participants in advance

Your goal is to make sure your participants understand the technical side of what you need from them. Ask them to install whatever software you're using well ahead of time, and keep your instructions clear and concise.
Reminders are essential as well, and thankfully there are some tools out there that you can use to make this easy. Calendly and YouCanBook.me are great, and they have built-in automated reminders.
One more thing! Be considerate of timezones if you've recruited participants from different countries. No one wants to do a usability test at 3 am!
Practice, practice, practice

How you practice for your test will depend on whether you've chosen the moderated or unmoderated approach.
If you're going down the moderated path, you should plan what you're going to say in the sessions by writing your discussion guide and practicing how you will administer tasks with your testing tool.
For unmoderated research, it's worth putting yourself in your participant's shoes and reviewing your tasks and follow-up questions from this perspective. This ensures that you've provided solid context around each task, so your users approach it with the right perspective.
Build rapport with your participants

To establish rapport from the start, we recommend having your camera turned on. Yes, you! It helps because the participant sees an actual person when they join.
Remember to speak to your participant like you would any other person!
Avoid rote, impersonal phrasing that signals you're running on autopilot, such as a flat 'tell us what you think'. These phrases don't build rapport because they can make the moderator appear distant and disengaged.
Ask users to think out loud
Encourage users to tell you what they're thinking throughout the session. Thinking aloud doesn't come naturally to everyone, so some users may need extra encouragement.
Mentioning at the beginning of the session, "Feel free to tell me any positive and negative thoughts as we go along", will help clarify the type of feedback you want without leading users.
Ensure there is enough time between sessions to reset
You might want to talk to your stakeholders in between sessions, or your session might run over. It's therefore vital to have a buffer.
We recommend 30 minutes to help you reset and get ready for the next session. Depending on how intense the sessions are, you may want to make that buffer closer to an hour.
Expect the unexpected
No matter which techniques you choose or how much you prepare, sessions won't always go according to plan.
Some users will tell you every thought they have, while others will hardly say anything. Some will be excited to tell you their life stories, while others will be anxious about your questions.
Know that moderating a wide variety of people and conversations takes a lot of cognitive and emotional effort. So, don't panic if things don't go exactly to plan. Stay flexible, adapt to the situation and, most importantly, don't forget to breathe!
Focus on quality, not quantity, of insights
Usability tests can surface hundreds of insights. Between behavioral and attitudinal metrics, it's easy to become overwhelmed by the amount of data you have to sift through.
Once you've separated genuine insights from raw observations, it can be helpful to code each insight across participants to draw out overall themes.
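The coding step above can be sketched in a few lines. This is a minimal illustration, not a prescribed workflow: the participant IDs, theme tags, and the threshold of three participants are all assumptions made for the example.

```python
from collections import Counter

# Hypothetical example: each participant's insights have been coded
# with short theme tags during analysis.
coded_insights = {
    "P1": ["nav-confusion", "checkout-friction", "liked-search"],
    "P2": ["nav-confusion", "liked-search"],
    "P3": ["checkout-friction", "nav-confusion", "pricing-unclear"],
    "P4": ["liked-search", "pricing-unclear"],
}

def theme_frequency(coded):
    """Count how many participants mentioned each theme (once per person)."""
    counts = Counter()
    for tags in coded.values():
        counts.update(set(tags))  # de-duplicate within a participant
    return counts

def overall_themes(coded, min_participants=3):
    """Themes observed by at least `min_participants` participants,
    ordered by frequency, then alphabetically for stable output."""
    counts = theme_frequency(coded)
    return sorted((t for t, n in counts.items() if n >= min_participants),
                  key=lambda t: (-counts[t], t))

print(overall_themes(coded_insights))  # → ['liked-search', 'nav-confusion']
```

Counting each theme once per participant (rather than once per mention) keeps one talkative participant from dominating the theme ranking.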
If you use our platform, you can tap into the power of our single UX metric score, called the QXscore.
This "quality of experience" score combines behavioral data (such as task success, task time and page views) with attitudinal data (such as ease of use, trust and appearance) to create a single benchmarking score for your product.
This single UX score is a simple, clear and persuasive tool for communicating user research results to stakeholders and should help with getting future buy-in.
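To make the idea of a single composite score concrete, here is a generic sketch of blending behavioral and attitudinal metrics into one 0–100 number. To be clear, this is not the QXscore formula itself: the metric names, rescaling ranges, and 50/50 weighting below are illustrative assumptions, not the platform's actual methodology.

```python
# Illustrative sketch only: the real QXscore methodology is defined by the
# platform; every metric name and weight here is an assumption.

def to_percent(value, worst, best):
    """Linearly rescale a raw metric onto 0-100, clamped to that range.
    Works for lower-is-better metrics too (pass worst > best)."""
    pct = 100 * (value - worst) / (best - worst)
    return max(0.0, min(100.0, pct))

def composite_ux_score(behavioral, attitudinal, behavioral_weight=0.5):
    """Average each group of 0-100 metrics, then blend the two averages."""
    b = sum(behavioral.values()) / len(behavioral)
    a = sum(attitudinal.values()) / len(attitudinal)
    return behavioral_weight * b + (1 - behavioral_weight) * a

behavioral = {
    "task_success": to_percent(8, worst=0, best=10),   # 8 of 10 tasks passed
    "task_time": to_percent(45, worst=120, best=20),   # 45 s; faster is better
}
attitudinal = {
    "ease_of_use": to_percent(5.5, worst=1, best=7),   # 7-point scale rating
    "trust": to_percent(6, worst=1, best=7),
}

print(round(composite_ux_score(behavioral, attitudinal), 1))  # → 78.3
```

The value of any such composite is less in the exact weights than in applying the same formula consistently, so that scores can be compared across releases and studies.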
Drive positive change
Congratulations! You carefully planned your remote study, recruited all your participants, conducted the sessions and analyzed the results.
Now comes the most crucial part: presenting your findings to stakeholders and proving UX's value to the company.
If you're looking to make a long-term impact with UX, we recommend you jump on the UX benchmarking bandwagon ASAP.
Benchmarking creates a baseline for understanding the current user experience on your website, app or any digital product.
The sooner you get started, the sooner you'll be able to measure and improve the user experience in a language that your stakeholders understand.
Each year, we receive hundreds of questions regarding best practices and tips for conducting usability tests. Click next for our answers to your FAQs.