Getting the most specific, actionable, and relevant user feedback often depends on how you ask the question.
When it comes to strong product development, user feedback may well be your best asset.
The challenge, though, is getting relevant, specific, and actionable feedback. Generalizations and abstractions simply can’t provide the kind of insight you need to remediate design challenges or capitalize on improvement opportunities.
In this article, we will describe some best practices for developing user feedback questions that get results.
You can’t design a good customer survey or an effective user feedback question if you don’t first understand how the answers will be used. Get clear on your research purposes before you start asking questions.
If your customer needs a Ph.D. in English Literature to understand your questions, you’re not going to get results. The same is true if the survey is so long that they’re going to need a lunch break to get through it.
You’re better off designing customer surveys that are brief, to the point, and written at roughly a middle school reading level.
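If you want a rough, objective check on that target, the Flesch–Kincaid grade level formula puts a number on it. The sketch below is a minimal, dependency-free approximation (the syllable counter is a simple heuristic and the function names are our own), not a substitute for a dedicated readability tool.

```python
import re

def count_syllables(word: str) -> int:
    """Rough heuristic: count vowel groups, ignoring a trailing silent 'e'."""
    word = word.lower()
    groups = re.findall(r"[aeiouy]+", word)
    if word.endswith("e") and len(groups) > 1:
        return len(groups) - 1
    return max(1, len(groups))

def fk_grade(text: str) -> float:
    """Flesch-Kincaid grade: 0.39*(words/sentences) + 11.8*(syllables/words) - 15.59."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    if not sentences or not words:
        return 0.0
    syllables = sum(count_syllables(w) for w in words)
    return 0.39 * (len(words) / len(sentences)) + 11.8 * (syllables / len(words)) - 15.59

question = "How easy was it to find the pricing page on our site?"
print(f"Grade level: {fk_grade(question):.1f}")  # aim for roughly grade 6-8
```

Anything scoring well above grade 8 is usually a good candidate for a rewrite.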
If you want your research questions to be truly illuminating, then you need to understand not only what to ask but also whom to ask. This means digging deeply into user behavior analytics to develop a clear profile of your target research subject.
If you find, for instance, that your end users rarely watch an online promotional video all the way to completion, then that’s an issue you would do well to build a question or two around. In other words, once you understand what your target audience does, you can start formulating questions that get to the why.
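As a purely illustrative example, a quick pass over an analytics export can surface that kind of pattern before you write a single question. The field names and threshold below are hypothetical; your analytics tool will expose this data in its own format.

```python
# Hypothetical event log from an analytics export; real field names will vary by tool.
video_views = [
    {"user_id": "u1", "watched_seconds": 12, "video_length": 90},
    {"user_id": "u2", "watched_seconds": 88, "video_length": 90},
    {"user_id": "u3", "watched_seconds": 25, "video_length": 90},
    {"user_id": "u4", "watched_seconds": 9,  "video_length": 90},
]

# Treat anything under 90% of the runtime as an incomplete view.
completed = [v for v in video_views if v["watched_seconds"] >= 0.9 * v["video_length"]]
completion_rate = len(completed) / len(video_views)

print(f"Completion rate: {completion_rate:.0%}")
if completion_rate < 0.5:
    print("Fewer than half of viewers finish the video -- worth building a question around why.")
```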
At the end of the day, user feedback questions are meant to provide a snapshot of a product or company’s weaknesses as well as its strengths. The insights you glean from the survey should enable you, for instance, to conduct a meaningful and accurate analysis of the strengths, weaknesses, opportunities, and threats (SWOT) evident in your organization, product, and/or processes.
And what that means, fundamentally, is that you’re going to need to inquire about frustrations, problems, and disappointments the user may have experienced with your product or service.
One of the biggest challenges in collecting quality data is controlling for confounding variables. If you’re conducting your survey online, you’re going to need to be proactive in excluding bots from your testing processes. Otherwise, you’re going to end up with a whole mess of unusable data.
But that’s not the only issue that can make a survey unusable. You will also almost inevitably come across a test taker or two who’s just going through the motions to get the reward at the end.
This is where user feedback questions can really come in handy. By throwing in a red herring or two, you’re going to be able to better identify those participants who are in a hurry, not paying attention, or just clicking responses willy-nilly. Just make sure the questions are outlandish enough to reliably flag an inattentive survey respondent or test taker.
If, for instance, a tester repeatedly provides a nonsensical answer to a simple question, the odds are they didn’t just accidentally click the wrong response. More likely than not, they've read few, if any, of the questions and therefore none of their responses can be trusted.
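One simple way to put this into practice, assuming you can export responses as rows keyed by question, is to record the expected answer to each red-herring question and drop anyone who misses more than one. The column names and cutoff below are made up for illustration, not a feature of any particular survey platform.

```python
# Expected answers to the attention-check ("red herring") questions.
ATTENTION_CHECKS = {
    "q7_select_strongly_disagree": "Strongly disagree",
    "q15_what_color_is_the_sky": "Blue",
}

responses = [
    {"respondent": "r1", "q7_select_strongly_disagree": "Strongly disagree",
     "q15_what_color_is_the_sky": "Blue"},
    {"respondent": "r2", "q7_select_strongly_disagree": "Agree",
     "q15_what_color_is_the_sky": "Green"},
]

def failed_checks(row: dict) -> int:
    """Count how many attention checks this respondent got wrong."""
    return sum(1 for q, expected in ATTENTION_CHECKS.items() if row.get(q) != expected)

# Keep respondents who miss at most one check; flag the rest as unreliable.
usable = [r for r in responses if failed_checks(r) <= 1]
flagged = [r["respondent"] for r in responses if failed_checks(r) > 1]
print("Usable respondents:", [r["respondent"] for r in usable])
print("Flagged as unreliable:", flagged)
```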
User interviews can be an ideal way to solicit actionable insights, especially if you use them in conjunction with other research tools, such as an online customer satisfaction survey.
The best user interview questions generally emerge organically from the conversation. As you listen, reflect, and follow up on your interviewee’s comments, you are likely to end up asking questions you never would have thought of beforehand.
That doesn’t mean, however, that there should be no method to your madness. In fact, if you want to get the most out of the interview, you need to go into it with a robust and thoughtful discussion guide in hand. This ensures that the interview has structure and that the questions have utility, even if you could not have anticipated the exact form those questions would take until the conversation with your interviewee led you there.
Developing effective user feedback questions is not easy. You need to invest time, effort, and a good deal of strategy if the questions you develop are to provide the actionable insights you want.
To do this, though, it’s imperative to understand who your end users are, how they behave, and what methods will be most effective in communicating with and learning from them. This may include online surveys, in-person interviews, or, ideally, some combination of methods and tools.
Half the challenge in carrying out effective moderated user research is asking the right questions.
Want to learn more? Make sure you check out our free moderated studies content bundle, a comprehensive guide covering everything you need to know about how and when to run remote moderated studies.