Online Surveys 101: A User Researcher’s Guide
So much more than the premise of Family Feud.
As part of our UX 101 education series, where we discuss the different types of studies and research methodologies you can use with the UserZoom platform, we’d like to introduce our readers to the exciting world of online surveys.
What are online surveys?
I know, I know – but this is a 101 article after all so let’s go ahead and dive in with the introduction. Online surveys, simply put, are a series of questions that are used to gather quick information about consumer attitudes, preferences, sentiment, past experiences, or overall impressions of a brand or product.
They are often used for exploratory purposes or as the first level of inquiry that can be used to dictate larger research initiatives. Online surveys can also be used to gather descriptive information such as attitudes, preferences, brand perception and experience in order to better define and understand your customer or user base.
What are common use cases for online surveys?
Online surveys are used in a wide variety of ways. Examples include:
- Basic information gathering – Gathering simple data such as demographic information, general impressions or preferences.
- Voice of the customer (VOC) – Giving customers an avenue to leave feedback about your website and/or products.
- True Intent – Intercepting live visitors to your sites or apps in order to learn more about their intent and reasons for visiting.
- Usability scales and indexing – Questions that allow users to rate usability metrics like organization and appearance of the site, or trustworthiness and value of the content on the site, for example.
- Exploratory purposes or to inform larger research initiatives – Gathering in-depth information about users’ preferences, impressions, intent to purchase and persona building are some common examples of this.
As you can see from the above examples, online surveys can stand alone on your site or product, be embedded within a larger research study, or be run long before your website, app or product is anything more than a napkin sketch.
How do online surveys work?
The general process for setting up an online survey is the same as most research engagements.
First and foremost, outline what you want to learn in a research plan. Write out the purpose of your survey, define the concepts you want to measure, and come up with your hypotheses. This is an important step that will help you identify the types of survey questions to include and ensure that you measure what you actually intend to measure.
Once you have your research plan in place it’s time to decide on the length of the survey and the types of questions to include.
One benefit of online surveys is the flexibility with length. Depending on the purpose of the survey, it can be short and sweet (making it easier to get larger sample sizes) or it can be longer and more detailed.
VOC and True Intent surveys are typically short. Visitors didn’t know they were going to complete a survey when coming to your site and are more likely to drop off if there are too many questions. The shorter the better here.
Exploratory surveys tend to be longer and more detailed. Often exploratory surveys contain advanced logic that is dependent on previous responses. In some of these studies participants may see questions that only pertain to them. In this situation, the logic can help control the length of the survey for any given participant.
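This kind of branching is easiest to see in code. The sketch below is a minimal, hypothetical illustration of skip logic – every question ID and condition is invented for this example, not taken from any particular survey tool:

```python
# Minimal sketch of survey skip logic: a follow-up question is only
# shown when a condition on an earlier answer holds. All question IDs
# and wording here are hypothetical.

def questions_shown(answers):
    """Return the list of question IDs a participant would see,
    given a dict of their answers so far."""
    shown = ["q1_owns_pet"]
    if answers.get("q1_owns_pet") == "yes":
        shown.append("q2_pet_type")        # only pet owners see this
        if answers.get("q2_pet_type") == "dog":
            shown.append("q3_dog_breed")   # only dog owners see this
    shown.append("q4_overall_satisfaction")  # everyone sees the closer
    return shown

# A cat owner skips the dog-breed question entirely:
print(questions_shown({"q1_owns_pet": "yes", "q2_pet_type": "cat"}))
# → ['q1_owns_pet', 'q2_pet_type', 'q4_overall_satisfaction']
```

Because each participant only answers the branch that applies to them, a long exploratory survey can still feel short to any individual respondent.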
Overall we see the best results for online surveys that take no longer than 10-15 minutes to complete. Anything longer and you risk participant burnout (often leading to lower-quality responses) and higher rates of drop-off.
There are two main question types when it comes to online surveys: closed questions and open questions.
- Closed questions are questions with a predefined range or set of answers to choose from (rating scales, ranking, multiple choice, etc.). These will provide the bulk of your metrics.
- Open questions are typically open text fields or video questions where participants generate the response rather than select a response. These help provide context around the metrics.
What results do you get?
The results can largely be broken down into two types – quantitative and qualitative – depending on whether you chose closed or open questions, respectively.
Starting with the latter, qualitative results are going to be in the form of text or a video response. Either way you will get verbatim responses from your participants.
Quantitative results often depend on the types of questions included in the survey, however, counts and proportions are the most common output for most question types. Rating scales typically include a mean score in addition to the counts and proportions for each rating. More advanced researchers might want to look at the interaction between two or more variables in which case the output might result in a t or F value or plots on a line graph.
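To make those outputs concrete, here is a small sketch of how counts, proportions and a mean score fall out of raw rating-scale responses. The ratings below are made-up sample data, and the code uses only the Python standard library:

```python
from collections import Counter
from statistics import mean

# Hypothetical raw responses to a 7-point rating scale question.
ratings = [7, 5, 3, 6, 2, 7, 4, 5, 1, 6]

counts = Counter(ratings)               # responses per rating value
n = len(ratings)
proportions = {r: counts[r] / n for r in sorted(counts)}

print(f"Mean rating: {mean(ratings):.1f}")   # → Mean rating: 4.6
for r in sorted(counts):
    print(f"  rated {r}: {counts[r]} responses ({proportions[r]:.0%})")
```

For the more advanced comparisons mentioned above (t or F values), you would typically hand the same raw response lists to a statistics package rather than compute them manually.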
Tips for analyzing your survey results
The analysis of survey data will be dictated in part by your research plan and the intent of the survey, as well as the type of data you collected.
Insights derived from closed question types are often reported with percentages or mean ratings (or both).
- Single or multiple select questions are typically reported in percentages: “70% reported they use the filters when searching for a product on an ecommerce site.”
- Ranking questions are also typically reported in percentages: “50% ranked option 1 as their most preferred.”
- Rating scales are typically reported by the mean score; however, many researchers also like to include the percentage of those who gave a high or low rating: “Participants indicated they are less likely to buy option B (mean rating of 3.5 on a 7-point scale, with 40% providing a low rating of 1, 2, or 3).”
Insights derived from open question types are qualitative in nature and are most often used to support the quantitative metrics gathered in the survey. Sometimes, researchers will choose to quantify the comments by identifying common themes and reporting the percentage of participants whose comments fall into a particular theme.
For example, “60% of those who gave a low rating indicated that they didn’t care for the type of offer they received.”
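Once each comment has been manually tagged with one or more themes, turning those tags into percentages is straightforward. The comments and theme names below are invented for illustration:

```python
# Sketch of quantifying open-ended comments after manual theme tagging.
# Each entry is one participant's comment, tagged with invented themes.
tagged_comments = [
    {"id": 1, "themes": {"offer_type", "pricing"}},
    {"id": 2, "themes": {"offer_type"}},
    {"id": 3, "themes": {"navigation"}},
    {"id": 4, "themes": {"offer_type", "navigation"}},
    {"id": 5, "themes": {"pricing"}},
]

n = len(tagged_comments)
theme_counts = {}
for comment in tagged_comments:
    for theme in comment["themes"]:
        theme_counts[theme] = theme_counts.get(theme, 0) + 1

# Report each theme as the share of participants who mentioned it.
for theme, count in sorted(theme_counts.items(), key=lambda kv: -kv[1]):
    print(f"{theme}: {count / n:.0%} of participants")
```

Note that percentages can sum to more than 100% here, since one comment may touch on several themes.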
This concludes our introduction to online surveys. Thank you for reading and don’t forget to check out the rest of our UX 101 education series to help you on your way!
Sarah Greene is a UX Researcher at UserZoom with a background in psychology and education policy research who primarily supports customers in the e-commerce and travel industries. She is excited by the overall process of utilizing data-driven insights to inform business and policy decisions. When not busy designing studies and gathering insights, she can be found flexing her creative muscles creating jewelry or crafting one of her many DIY creations.