The Task-Based Benchmark Study

UserZoom ran a quick task-based benchmark study between Humana and Cigna, two well-known healthcare providers, to compare the online experience of obtaining a dental insurance quote.

– We ran the unmoderated remote task-based benchmark study with 52 in-the-wild users on their own devices over the course of a single day

– We split participants equally between each website so that 26 went to Humana and 26 went to Cigna

– The Task: Get the monthly quote for Preventative Dental Insurance in the 94111 ZIP code

– We validated the task by asking them what the price was

Let’s Get Ready to Rumble!

Before participants visited the websites, we showed all 52 of them images of Cigna’s and Humana’s homepages and asked them to choose which site they associated with several UX attributes:

[Chart: participants’ ratings of the Cigna and Humana homepages across UX attributes]

The participants felt the websites were fairly even in most categories when based upon looks alone, with the two most noticeable exceptions being the Lively and Informational categories. Cigna was rated as being significantly more lively than Humana, while Humana scored much higher on appearing informational.

Brand Perception

We split the participants equally between brands and asked them to rate their perception of the brand before and after their experience with the website.

Brand Perception Pre-Task

In order to get a feel for the brand before their online experience, we showed participants an image of the homepage.

For the rating scale: 1 = Very Negative, 4 = Neutral, and 7 = Very Positive.

[Chart: pre-task brand perception ratings for both sites]

Before participants interacted with the websites, both brands had an average rating of about 5 out of 7, or slightly above neutral.

Brand Perception Post-Task

After participants interacted with the website we again asked them to rate their perception of the brand with the same rating scale: 1 = Very Negative, 4 = Neutral, and 7 = Very Positive.

[Chart: post-task brand perception ratings for both sites]

Both brands saw a slight decrease in their average brand perception rating after participants interacted with their websites. But where Humana’s average dropped by only 0.2 points, Cigna’s fell by 0.8 points, with 9 of 26 users rating the brand negatively versus Humana’s 5 of 26.

Task Outcomes

In order to validate whether users were successful at getting a quote for the monthly cost of preventative dental insurance, we asked them what the cost was. If users answered with the correct cost, they were labelled as Success.

Non Success meant that a user either Abandoned the task due to difficulties with the website or, worse, thought they had found the correct price but chose an incorrect answer, which we labelled as Error.
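This labelling rule is simple enough to sketch in a few lines. Note the price below is a made-up placeholder, not the study’s actual quote, and the function name is ours:

```python
# Illustrative sketch of the outcome labelling described above.
CORRECT_PRICE = "$17.99"  # hypothetical value; the study's real price isn't given

def label_outcome(abandoned, answer):
    """Classify a session: Abandon takes precedence; otherwise the
    stated price decides Success vs. Error."""
    if abandoned:
        return "Abandon"
    return "Success" if answer == CORRECT_PRICE else "Error"

# A participant who gave up mid-task:
print(label_outcome(True, None))   # → Abandon
```

Success and Non Success rates then fall out of counting these labels per site.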

When we rang the final bell and tallied up the points it was a split decision. Amazingly, both sites had exactly the same overall Success and Non Success rates.

Success

Despite the same number of users answering correctly on each site, there was a noticeable difference in the time, page views, and clicks it took users to succeed.

[Chart: time on task, page views, and clicks for successful users on both sites]

User Session with Audio

On average, Humana’s successful users got a quote almost a minute faster than Cigna’s.

Watching the following user sessions, which are representative of the average time on task for each site, we can see that Humana’s website has fewer forms to fill out in order to get a quote, which is likely why it took users less time and fewer clicks.

Humana user session (video only)

Cigna user session (video only)

Ease of Use

We asked users who successfully completed the task to rate how easy or difficult it was to accomplish, with 1 = Difficult and 5 = Easy.

[Chart: ease-of-use ratings for both sites]

Despite the differences in time on task, clicks, and page views, both brands received a similar average rating of just over 3 from their users.

Non Success

What’s interesting is that while both sites had similar abandon and error counts, and nearly identical time on task, page views, and clicks for users who answered incorrectly, Humana’s users abandoned the task after only 3 page views and just over a minute of browsing, whereas Cigna’s users stuck it out almost three times longer before giving up.

[Chart: time on task, page views, and clicks for unsuccessful users on both sites]

Watching the sessions of users who abandoned the task on both sites, it’s apparent that most abandoned after being asked for their date of birth and gender as required fields, even though a ZIP code was the only other information needed to get a quote.

Humana user session (video only)

Cigna user session (video only)

Net Promoter Score (NPS)

After participants interacted with the website, we asked them to rate how likely it was they would recommend the site to friends, family, or colleagues. Users who rate this likelihood low, 0–6 on the scale, are labelled Detractors; users who choose 7 or 8 are labelled Passives; and users who rate it 9 or 10 are Promoters.

[Chart: Net Promoter Scores for both sites]

In our seven years of experience, we have found that most brands have negative Net Promoter Scores and that average NPS differs by industry. What’s noticeable about these results is that Cigna’s users were half as likely to recommend the site as Humana’s users.
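The bucketing above reduces to a single formula: the percentage of Promoters minus the percentage of Detractors. A minimal sketch (the ratings below are invented for illustration, not the study’s data):

```python
def nps(ratings):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6),
    rounded to the nearest whole point."""
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return round(100 * (promoters - detractors) / len(ratings))

# Hypothetical batch of responses: 3 promoters, 2 passives, 3 detractors.
sample = [9, 10, 7, 6, 8, 3, 9, 5]
print(nps(sample))  # → 0
```

Because Detractors span seven of the eleven rating points, a score near zero (as in this sample) is common, which is why most brands land in negative territory.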

Conclusion

Both websites performed well, with similar success rates and feedback from participants. Overall, this group of participants preferred Humana slightly over Cigna, with Humana’s successful participants taking less time and fewer clicks than those who were successful with Cigna.

The main takeaway is that Humana’s website was more intuitive than Cigna’s, resulting in less time, fewer clicks, a higher ease-of-use rating, and a higher NPS from participants. It goes to show that success rates are just one piece of the user-experience puzzle – optimizing the user journey requires going beyond success and error rates and understanding, from start to finish, what’s working and what can be improved.

Want to learn more about UX Benchmarking with UserZoom?
