The Task-Based UX Benchmark Study

UserZoom ran a quick task-based UX benchmark study between HP and IBM, two well-known technology corporations, to compare the experience of researching business solutions on each company's site.

  • We ran the unmoderated remote task-based benchmark study with 60 in-the-wild users on their own devices over the course of a single day
  • We split participants equally between each website so that 30 went to HP and 30 went to IBM
  • The Online Task: From the homepage, navigate to the page with an overview of all retail business solutions
  • Task completion was validated via URL validation


Let’s Get Ready to Rumble!

We showed all 60 participants an image of HP’s and IBM’s homepage and asked them to choose which site they associated with several UX attributes.

HP v IBM desktops

UX Battle of the Week: HP vs IBM brand attributes

Participants' positive initial reactions leaned more towards HP's homepage, particularly when it came to being seen as Welcoming, Easy and Lively. IBM was largely seen as Overwhelming. Interestingly, both websites practically tied when it came to being seen as Informational.

Screenshot Click Test

We split the participants evenly and asked where they would click if they needed support.


UX Battle of the Week: HP vs IBM: HP heatmap

On HP's homepage, 74% of participants correctly clicked on Support: 57% clicked Support in the header, while 17% clicked it in the footer.


UX Battle of the Week: HP vs IBM: IBM heatmap

On IBM's homepage, 70% of participants correctly clicked on Support: 43% clicked Support Portal, while 27% clicked Support in the footer.

Online Task Outcomes

To determine whether users successfully found the retail business solution overview page, we used URL validation. Users who reached the correct page were labeled as Success.

Non-Success covered two outcomes: a user either gave up due to difficulties with the website (Abandon) or said they were done before reaching the correct page (Error).
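The labeling logic behind URL validation can be sketched in a few lines. This is an illustrative sketch only, not UserZoom's actual implementation; the target URL and the session fields are hypothetical.

```python
# Illustrative sketch of URL-based outcome labeling; the target URL and
# the session fields are hypothetical, not taken from the actual study.

TARGET_URL = "https://www.example.com/solutions/retail"  # hypothetical target page

def label_outcome(final_url: str, abandoned: bool) -> str:
    """Classify a session as Success, Abandon, or Error."""
    if final_url.rstrip("/") == TARGET_URL.rstrip("/"):
        return "Success"   # participant reached the correct overview page
    if abandoned:
        return "Abandon"   # participant gave up due to difficulties
    return "Error"         # participant said they were done on the wrong page
```

For example, a participant who stops on a commerce page without abandoning, `label_outcome("https://www.example.com/commerce", False)`, is labeled `"Error"`.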

UX Battle of the Week: HP vs IBM success rates


Users were 40% more successful on HP’s site than on IBM’s.

UX Battle of the Week: HP vs IBM - Success Efficiency

On average, successful participants on both sites found the correct overview page with similar numbers of page views and clicks. Despite the difference in overall success rates, participants who succeeded on the IBM site did so in far less time than those on HP's site.

HP user session

IBM user session

Non-Success

UX Battle of the Week: HP vs IBM non-success ratios

Aside from their quantity, the non-successful participants on both sites had surprisingly similar metrics. Both sites had more participants err than abandon, meaning users felt they had reached the correct page and said they were finished rather than giving up on the website. When we watched the session videos, it was apparent that a majority of users on the IBM site mistook the Commerce page for the retail industry overview page they were looking for. For HP it was generally a product page, such as print services.

Users who abandoned did so after about two minutes of searching on both sites. Users who erred, however, tended to search longer on the IBM site than those on the HP site.

HP user session

IBM user session

Brand Perception

We split the participants equally between brands and asked them to rate their perception of the brand before and after their experience with the site.

For the rating scale, 1 = Very Negative, 4 = Neutral, and 7 = Very Positive. We also gave participants the option to say they were not familiar with the brand.

Brand Perception Pre-Task

UX Battle of the Week: HP vs IBM brand perception pre task

Both brands started with fairly high brand perception ratings from the participants, HP with 5.9 out of 7 and IBM with 5.5 out of 7.

Brand Perception Post-Task

UX Battle of the Week: HP vs IBM brand perception post task

Both sites saw a slight decrease after participants interacted with the website, but on the whole, brand perception stayed relatively stable. HP ended with a 5.6 out of 7, while IBM finished with a 5.1 out of 7.

Ease of Use

After the task we asked all the users to rate how easy or difficult it was to accomplish, with 1 = Very Difficult, 4 = Neutral and 7 = Very Easy.

UX Battle of the Week: HP vs IBM ease of use

HP had a slightly higher ease of use rating from participants, ending with a 4.9 out of 7. IBM ended with a 4.2 out of 7.

Problems & Frustrations

We asked the users which, if any, of the following problems they encountered while on the site.


UX Battle of the Week: HP vs IBM: HP Problems & Frustrations


UX Battle of the Week: HP vs IBM: IBM Problems & Frustrations

Participants who chose “Other” were given an open-ended follow-up question to explain. One IBM participant said:

  • “Took too long to find what I was looking for”

Net Promoter Score (NPS)

After participants interacted with the sites, we asked them to rate how likely they would be to recommend the site to friends, family or colleagues.

UX Battle of the Week: HP vs IBM NPS

HP’s participants were more likely to recommend the site than IBM’s participants.
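For readers unfamiliar with the metric, NPS is conventionally computed from 0–10 "likelihood to recommend" ratings: respondents scoring 9–10 are promoters, 0–6 are detractors, and the score is the percentage of promoters minus the percentage of detractors. A minimal sketch, assuming that standard scale (the example ratings are made up, not the study's data):

```python
# Standard NPS calculation on a 0-10 scale; example ratings are invented
# for illustration and are not the study's data.

def net_promoter_score(ratings):
    """NPS = % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return round(100 * (promoters - detractors) / len(ratings))

# 10 responses: 4 promoters, 3 passives (7-8), 3 detractors -> NPS of 10
print(net_promoter_score([10, 9, 9, 10, 8, 7, 8, 5, 6, 3]))  # 10
```

Note that passives (7–8) count toward the denominator but neither add to nor subtract from the score, which is why NPS can range from -100 to +100.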


UX Battle of the Week: HP vs IBM Winner

Both sites performed well, with similar ease-of-use ratings and with brand perception holding steady before and after the task. The winner of this week's battle of technology corporations, however, is HP, for having a higher success rate in finding business solutions, fewer reported problems and frustrations, and a higher overall Net Promoter Score.

Want to learn more about competitive UX benchmarking with UserZoom? Request a personalized demo.