Get organized and ease your customers' frustrations.
In this practical guide we’ll show you how to improve the user experience of your Interactive Voice Response (IVR) system by using common user research methods.
Traditional information architecture testing has been used to test menus on websites and apps. These types of studies usually take the form of Card Sorts and Tree Tests.
However, we can also use these methods to organise the different options that users face when they call a customer service number and have to interact with a machine (also known as Interactive Voice Response, or simply IVR).
Everyone has experienced some level of frustration when using one of these systems: it often feels like the machine just can't understand the problem we're trying to solve. Often these frustrations come from poorly designed structure and messaging.
The issue with IVR systems is that, as with many other customer-facing functions, they're usually designed without involving users in the process. The different paths that a customer can take to solve a problem likely follow a structure based on how things are organised internally. This organisation doesn't necessarily make sense to your customers.
Even though there's no visual interface, user experience design should still be applied to optimise a 'non-visual' service like this. A key factor of UX design is research, and a lot can be done with users to help design an IVR service that is less likely to leave them frustrated and dissatisfied.
Think about the structure of the IVR system as the structure of a menu on your website. There are different levels, going from generic to specific. The first level would be where the user specifies broadly what they are looking for, like the main menu on a website.
For example, a customer who has problems with their Internet connection and calls their provider, could get something like:
Tell us the reason why you are calling:
Once they choose an option, they’ll get more specific ones related to that first option, and they’ll continue down a narrower route until they find what they’re calling for.
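To make this structure concrete, here's a minimal sketch in Python of how such a menu could be modelled. All the option names and end points are hypothetical examples invented for illustration, not taken from any real provider's IVR.

```python
# A hypothetical IVR menu modelled as a nested dict: each level's keys
# are the options read out to the caller, and the string leaves are the
# end points of a call (a queue, an agent, an automated service).
ivr_menu = {
    "Technical support": {
        "Internet connection": "Connection troubleshooting queue",
        "TV service": "TV troubleshooting queue",
    },
    "Billing and payments": {
        "Pay a bill": "Automated payment line",
        "Query a charge": "Billing agent",
    },
}

# The first level is the broad 'main menu'; each deeper level narrows
# the caller down towards a specific destination.
print(list(ivr_menu.keys()))
```

Modelling the menu as a tree like this is also what makes it testable with the same methods we use for website navigation.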
Just like when we’re redesigning our website’s menu, we can take two different approaches.
If we're at a more exploratory stage and want to see how our customers would arrange the different options that we're providing in our IVR service, we could run an open card sort to look into this.
We would give them the different options and ask them to group them in a way that makes sense to them.
That way we’ll see which options they would expect to see grouped together and we can begin to understand why they relate to each other.
They will also suggest a name for each grouping and, once they've finished the exercise, they can explain why they grouped certain items together.
We could then dig deeper with a follow-up questionnaire and check if they're struggling to understand any of the options. That way, we could explore renaming them and see whether that helps with comprehension.
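A common way to summarise open-card-sort results is a co-occurrence count: how often each pair of cards ended up in the same group across participants. A minimal sketch, using made-up participant data and group labels:

```python
from itertools import combinations
from collections import Counter

# Hypothetical open-card-sort results: each participant's groupings of
# IVR options, with the group names being the participants' own labels.
participants = [
    {"Problems": ["No internet", "Slow speed"],
     "Money": ["Pay a bill", "Query a charge"]},
    {"Faults": ["No internet", "Slow speed", "Query a charge"],
     "Payments": ["Pay a bill"]},
]

# Count how often each pair of cards was placed in the same group.
pair_counts = Counter()
for groupings in participants:
    for cards in groupings.values():
        for pair in combinations(sorted(cards), 2):
            pair_counts[pair] += 1

# Pairs with the highest counts are the ones users expect together.
print(pair_counts.most_common(2))
```

Pairs that frequently co-occur are strong candidates to sit under the same first-level option in the IVR menu.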
Comprehension of the different options is so important that it deserves its own separate study. We could run a survey, presenting the different options and asking participants what they would expect to be able to do in each one.
If you have a proposal for a structure, you could do a closed card sort instead to see if participants would classify the items in the same way as you would.
In a closed card sort, you give participants a fixed set of categories and they decide where each option belongs.
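Closed-card-sort results can be summarised as an agreement rate per card: the share of participants who placed it in the category you proposed. A sketch with invented data:

```python
# Hypothetical closed-card-sort data: our proposed category for each
# option, and where each of two participants actually placed it.
proposed = {
    "Pay a bill": "Billing",
    "No internet": "Technical support",
}
placements = [
    {"Pay a bill": "Billing", "No internet": "Technical support"},
    {"Pay a bill": "Billing", "No internet": "Billing"},
]

# Agreement rate per card: the share of participants matching our proposal.
agreement = {
    card: sum(p[card] == category for p in placements) / len(placements)
    for card, category in proposed.items()
}
print(agreement)
```

Cards with low agreement are the ones whose placement in your proposed structure deserves a second look.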
Once the design of the structure is complete, we can conduct a tree test to verify that users can complete each of the journeys.
The structure is usually divided into levels, so we could simulate the paths they could take when interacting with different options in the IVR system.
The test tree mirrors the levels of the IVR menu, going from broad categories down to specific options.
The tree test will help us verify if the users are taking the right paths to achieve what they want and if they are doing it directly. If they aren’t, we’ll be able to understand how confusion is arising.
Users having direct success (getting on the correct path the first time) is particularly important when it comes to IVR because going back is not as easy as it is on a website.
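Tree-test sessions are often classified into direct success, indirect success (reaching the target after backtracking), and failure. A minimal sketch of that classification, using hypothetical task paths:

```python
# Hypothetical tree-test data: the correct path for one task ('my
# internet stopped working') and each participant's actual selections.
correct_path = ["Technical support", "Internet connection"]
sessions = [
    ["Technical support", "Internet connection"],
    ["Billing and payments", "Back", "Technical support", "Internet connection"],
    ["Billing and payments", "Query a charge"],
]

def outcome(path):
    """Direct success means the first attempt followed the correct path."""
    if path == correct_path:
        return "direct success"
    if path[-len(correct_path):] == correct_path:
        return "indirect success"  # got there, but only after backtracking
    return "failure"

results = [outcome(s) for s in sessions]
print(results)
```

In an IVR context, the share of direct successes is the number to watch, since recovering from a wrong branch mid-call is far harder than clicking 'back' on a website.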
We can run a qualitative version of both the card sort and the tree test, recording participants thinking out loud while they're organising the different cards or looking for the right option in the tree.
This will help us understand the reasoning behind their decision and if there’s anything they don’t understand or are unsure about.
One of the main challenges when interacting with an IVR menu is the lack of visibility of the whole menu structure. You don’t first see all the options and then choose one, you get them sequentially and need to decide if the one you’ve heard is the right one or not, without knowing what else is coming.
We can test how the order affects the ability to reach the correct point successfully with a simple survey.
We would first present a specific scenario, for example: ‘Your internet has suddenly stopped working and you call your provider to get help’. Next we can give them the first option in the IVR system and ask them what they would do next (either choose that option or continue listening).
If they say they would select the option, we could then show them another question with its nested choices, with an alternative to go back if they can’t find what they are looking for.
That would help us determine if any of those first options have any ambiguity attached to them that would make users take the wrong path first and then need to recover.
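The logic of that sequential survey can be sketched as follows; the option names and the participant's choices are invented for illustration:

```python
# Options are revealed one at a time, as in a real IVR call: at each
# step the participant either selects the option just heard or keeps
# listening, without knowing what comes next.
options = ["Billing and payments", "Technical support", "Contracts and upgrades"]

def run_sequence(choices):
    """choices[i] is True if the participant selects option i on hearing it."""
    for option, selected in zip(options, choices):
        if selected:
            return option
    return None  # listened to everything without selecting anything

# A caller with a broken connection who waits for 'Technical support':
print(run_sequence([False, True, False]))
```

If many participants select an earlier, ambiguous option instead of waiting for the correct one, that's a signal to reword the early options or reorder the menu.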
I hope that all of the above shows that just because an interface isn't visual or 'tangible', it doesn't mean it can't be tested in a relatively straightforward manner, using just a few of the common research methods at your disposal.