The Task-Based UX Benchmark Study

UserZoom ran a quick competitive UX benchmark study between BT and AT&T, two telecom service providers on either side of the Atlantic, to see how users experienced navigational and core tasks, as well as how they rated their overall experience on these websites.

  • We ran an unmoderated remote task-based benchmark study with 60 in-the-wild users on their own devices over the course of a single day
  • Participants were equally divided between the sites so that 30 went to each
  • They completed several tasks while on the site: a screenshot click test, a navigational tree test, and a task-based test where we asked them to find the mobile coverage in their area
  • We also measured their brand perception and how they rated their overall experience

 


First impressions are lasting impressions

We wanted to know users’ first impressions of both sites. We showed all participants the BT and AT&T homepages and asked them to choose which site they associated with several UX attributes.

 

Participants felt that AT&T was more organised, easier to use and more welcoming, while BT, though seen as more lively, was also perceived as overwhelming. AT&T was also seen as more trustworthy and helpful. A clear win for the US telecom giant in this round.

Screenshot Click Test

To understand if visitors were able to easily find important information, we split participants evenly and asked them to complete a simple task while looking at a static image of a homepage: “Where would you click to enter your account?”

BT

You can see that the page has a concentration of heat in the top right-hand corner, with 65% of participants correctly clicking on ‘My BT’ to enter their account. Other clicks focused on the search area or on areas such as ‘BT for the Home’.

AT&T

AT&T had a slightly higher dispersion of clicks, with only 60% of users selecting the correct option of ‘Log In’. The next-largest group of users opted for the shop and support burger menu. A narrow win for the UK’s largest telecoms company here.

Testing Navigation: Tree Test

To understand the usability of each company’s taxonomy and menu structure, we removed all design elements and asked users to navigate via the menu structure alone. We split participants evenly and asked: “Imagine you are interested in TV and internet bundles; where would you click to find them?”

AT&T achieved a slightly higher success rate, with 63% of their participants correctly identifying pages within their menu structure that would tell them about TV and internet bundles. However, the difference is not statistically significant, so at this stage we’ll call this round a draw.
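To illustrate why a small gap like this can fail to reach significance with groups of this size, here is a minimal sketch of a pooled two-proportion z-test. The counts are hypothetical (the write-up does not publish BT’s exact tree-test figure); only the 63% for AT&T and the 30-per-site split come from the study.

```python
import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Pooled two-proportion z-test; returns (z statistic, two-sided p-value)."""
    p_a, p_b = success_a / n_a, success_b / n_b
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF via math.erf
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical counts: 19/30 (~63%) for AT&T vs 17/30 (~57%) for BT.
z, p = two_proportion_z(19, 30, 17, 30)
print(f"z = {z:.2f}, p = {p:.2f}")  # |z| well under 1.96, p well above 0.05
```

With only 30 participants per site, even a several-point gap in success rates sits comfortably inside sampling noise, which is why we call this round a draw.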

Further analysis suggests that AT&T may be more successful in helping users navigate its menu by listing the same or similar products in several places within the menu. On average, AT&T users took 20 seconds to navigate the menu, compared with 28 seconds for participants on the BT site. A narrow victory for AT&T here.

Online Task-Based Test

When people consider buying a new phone or changing network, it’s not just the cost and the amount of minutes and data on offer that they look at. People are also concerned with which provider offers good coverage where they live and work, as this affects the quality of their daily lives. This is why we asked participants to find out about each provider’s mobile coverage in their area.

To evaluate the success of the task we used validation by URL: users who reached the correct page on mobile coverage were counted as a Success.

A Non-Success result meant that the participant either abandoned the task (marked as an Abandon) or declared the task finished before reaching the correct page about coverage (marked as an Error).
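The classification above can be sketched as a simple decision on each session’s final URL. This is an illustrative sketch only: the URL paths and function names are hypothetical, not UserZoom’s actual data model.

```python
# Hypothetical set of URLs that count as the "correct" coverage page.
SUCCESS_URLS = {"/mobile/coverage"}

def classify(final_url: str, declared_finished: bool) -> str:
    """Classify a session as Success, Error, or Abandon based on URL validation."""
    if final_url in SUCCESS_URLS:
        return "Success"   # reached the correct coverage page
    if declared_finished:
        return "Error"     # declared finished without reaching the right page
    return "Abandon"       # gave up before finishing

print(classify("/mobile/coverage", True))   # Success
print(classify("/service-status", True))    # Error
print(classify("/service-status", False))   # Abandon
```

The appeal of URL validation is that it needs no manual review to score each session; the trade-off is that it only credits participants who land on the exact page(s) you pre-register as correct.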

So, how did the two challengers do?

Success

BT’s users were marginally more successful at the task overall than AT&T’s participants. Let’s take a look at the breakdown in efficiency.

On average, AT&T’s successful participants completed the task a couple of seconds faster than BT’s users, while the two groups performed very similarly on page views, each requiring three unique pages on average to succeed in the task.

Non-Success

Two-thirds of BT’s non-successful participants erred, while a third abandoned. After filtering and watching back the sessions of those who abandoned, we found that most did so after getting stuck down a rabbit hole on BT’s service status pages, which tell users about service interruptions and technical issues rather than giving a guide to coverage.

Meanwhile, 80% of AT&T’s non-successful participants erred, while only a fifth abandoned. After watching their sessions it is clear that many users mistakenly thought they were on a mobile coverage page, when in fact they were on pages relating to the availability of AT&T’s other services.

For both companies, clearer labelling and communication would have helped more users to be successful.

User videos

BT User Session

AT&T User Session

Ease of Use

After the tasks we asked all users to rate how easy or difficult the task was to accomplish, with 1 = Very Difficult, 4 = Neutral and 7 = Very Easy.

Participants that experienced the AT&T site felt that the task was marginally easier than those who performed the task on the BT site.

Problems & Frustrations

We asked the users which, if any, of the following problems or frustrations they encountered while on the site.

BT

Respondents who chose “Other” were given an open-ended comments box. For BT, these were their responses:

  • The website overall look is messy.
  • Felt very difficult to get the info I was after
AT&T

For AT&T, the only feedback within ‘Other’ was:

  • Although I have a 21″ monitor, I had to scroll up and down and across to find information. It was frustrating

Brand Perception

We asked participants to rate their perception of the brand before and after their experience with the site.

For the rating scale: 1 = Very Negative, 4 = Neutral, and 7 = Very Positive. We also included the option for participants to say they weren’t familiar with the brand before the task.

Net Promoter Score (NPS)

After participants interacted with the sites we asked them to rate how likely it was they would recommend them to friends, family or colleagues.

This week’s group of participants were more likely to recommend AT&T’s site to friends, family and colleagues than BT’s.
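For readers unfamiliar with the metric, NPS is computed from 0–10 recommendation ratings using the standard formula: the percentage of promoters (9–10) minus the percentage of detractors (0–6). A minimal sketch, using made-up ratings rather than the study’s actual data:

```python
def nps(scores):
    """Standard Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(s >= 9 for s in scores)
    detractors = sum(s <= 6 for s in scores)
    return round(100 * (promoters - detractors) / len(scores))

# Hypothetical ratings from 10 participants:
print(nps([10, 9, 9, 8, 7, 7, 6, 5, 9, 10]))  # 5 promoters, 2 detractors -> 30
```

Note that 7s and 8s (passives) count toward the denominator but neither add to nor subtract from the score, so NPS can range from -100 to +100.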

Conclusion

This week’s UX benchmark battle between two of the largest telecoms providers on either side of the Atlantic showed that, despite their different markets, both providers offer similar products, users faced similar challenges, and both sites had similar usability problems. However, there is still a clear winner. The winner this week is AT&T!

AT&T swept a majority of the categories this round, including first impressions, findability, task efficiency, ease of use and NPS. While BT held its own by achieving the highest task success rate in the main task, a high number of its successful participants reported usability issues.

How would you stack up against your competition?

Let us show you how we’ve helped hundreds of companies unlock competitive user insights with our user experience testing platform.

Schedule a Call