When you use survey questions in your user research, they should be well designed.
It’s really easy to make bad surveys, and plenty of companies do. Surveys are admittedly overused and often ineffective on their own — the self-reported data you pull from survey questions is biased and of questionable accuracy. It’s typically far more valuable to gather insights from data on actual behavior and from user interviews.
So, to start, remember that survey questions should be used as part of a bigger UX research strategy.
But surveys aren’t totally useless! Surveys can give you some quantitative data on user feedback and perceptions that you can track over time or use as comparisons. Survey questions can also give participants an opportunity to describe the behavior you may have witnessed during testing. The trick is you want to write surveys that will give you useful data and that people are more likely to fill out (without hating it).
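To make the "track over time or use as comparisons" idea concrete, here is a minimal sketch of comparing a self-reported rating across two survey waves. The question wording, the response data, and the `summarize` helper are all hypothetical — the point is just that repeated surveys give you numbers you can put side by side:

```python
from statistics import mean, stdev
from math import sqrt

# Hypothetical 1-5 Likert responses to a question like
# "How satisfied are you with the product?", collected in two waves.
wave_1 = [4, 3, 5, 4, 2, 4, 3, 5, 4, 3]
wave_2 = [5, 4, 5, 3, 4, 5, 4, 4, 5, 4]

def summarize(responses):
    """Return the mean and standard error for a batch of Likert responses."""
    m = mean(responses)
    se = stdev(responses) / sqrt(len(responses))
    return m, se

m1, se1 = summarize(wave_1)
m2, se2 = summarize(wave_2)
print(f"Wave 1: {m1:.2f} (SE {se1:.2f})")
print(f"Wave 2: {m2:.2f} (SE {se2:.2f})")
print(f"Change: {m2 - m1:+.2f}")
```

Reporting a standard error alongside the mean is one simple way to avoid over-reading small wave-to-wave wiggles in self-reported data.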
This is a huge topic and I’ll only cover a tiny fraction of how to write surveys, but here are a few things to consider so your survey questions aren’t a waste of your time — or your participants’!
The first thing to do, before putting together a survey, is to identify your goals for the research. For example, you might be testing the waters for a new product feature, trying to learn why people aren’t using a service you offer, or wondering why customer satisfaction is so low.
Personally, I also use surveys early in the discovery phase of a research project. Sometimes my research goal is to get a sense of what people think and feel about a topic to decide what to research more deeply.
What helpful self-reported metrics could you get from survey questions? Knowing your goals will help you figure out which kinds of questions will give you the information that matters most.
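One well-known example of a self-reported metric is the System Usability Scale (SUS): ten alternating positively and negatively worded statements, each rated 1–5, combined into a single 0–100 score. Here is a minimal sketch of the standard scoring (the sample responses are made up):

```python
def sus_score(responses):
    """Compute a System Usability Scale score from ten 1-5 Likert responses.

    Odd-numbered items (index 0, 2, ...) are positively worded: contribution is response - 1.
    Even-numbered items (index 1, 3, ...) are negatively worded: contribution is 5 - response.
    The summed contributions are multiplied by 2.5 to give a 0-100 score.
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 responses")
    total = sum(
        (r - 1) if i % 2 == 0 else (5 - r)
        for i, r in enumerate(responses)
    )
    return total * 2.5

# Hypothetical responses from one participant
print(sus_score([4, 2, 5, 1, 4, 2, 4, 2, 5, 1]))  # 85.0
```

A per-participant score like this is exactly the kind of number you can average across respondents and track between studies — which only works if your research goals told you SUS was the right thing to measure in the first place.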
Human attention spans are short. None of us enjoys filling out surveys at the best of times, let alone when they are long and tedious.
You don’t want people quitting your survey halfway through because it’s too tiresome. That leaves you with incomplete data, and those participants won’t be too happy either.
I have three recommendations for keeping surveys short.
In scientific research, we always pilot test our experiments before running full-fledged studies. The pilot test helps us double-check that our equipment works, our instructions make sense, and everything is ironed out before we collect our real data.
I suggest that you pilot test your survey questions before you send out the survey. Give it to a few friends or family members, or send it out to just a few pilot participants.
Your goals with pilot testing are to:
If everything looks good from your pilot test, it’s time to send out your survey and start collecting the data for better UX decisions.
(Also, if you are interested in diving deeper into survey question writing, I made a quick and dirty Skillshare course on this topic!)