Optimizing Ticket Reservations on Mobile:
Assessing Information Flow with Usability Testing and Eye-Tracking Insights
Background
Scope
Project Timeline: 6 Weeks
Role/Contribution: UX Researcher
Team: 4 UX Researchers, Project Advisor
Methods: Moderated Usability Testing and Eye-Tracking, Surveys, NN/g Severity Rating Scale, System Usability Scale
Tools: Figma, Miro, Tobii Pro Lab, iMovie
Participants: 8 Museum and Art Enthusiasts
Problem
The Cooper Hewitt Smithsonian Museum website serves as a gateway for visitors to explore the museum’s art collections and exhibitions, as well as plan their visits by reserving tickets. While the site offers a wealth of content, the primary focus of the project was to enhance user experience in order to drive conversion.
I worked in a team of 4 UX researchers to study user journeys and behaviors on mobile. We targeted ticket reservations, navigation of online exhibitions, and footer information as the primary areas of focus. This project in particular highlights our team’s ability to lean into scrappy research methods, balancing quality, speed, and prioritization for impact.
Goals
Streamline Ticket Reservations:
Identify and address any pain points to ensure an intuitive and user-friendly flow for ticket reservations.
Assess User Attention:
Ensure content is clearly presented on the site so that users are able to retain information after viewing.
Call-to-Action Emphasis:
Ensure call-to-action elements are well positioned within the website and accessible to users when needed.
Process
Summary
Our process began with a client kickoff meeting to establish clear objectives, ensuring alignment on goals and priorities, followed by crafting a research plan tailored to these objectives. Next, we recruited participants who matched our target user profiles and scheduled onsite sessions with them.
During each participant’s session, we calibrated our software, made sure they were comfortable, then gave out tasks and observed how participants navigated them.
Post-testing, we analyzed and synthesized the data, distilling key findings that highlighted user needs and pain points.
Finally, we developed actionable recommendations grounded in our research and presented them in a report and presentation, offering the client strategic insights to drive visitor conversion through a smoother user experience.
Key Questions
“ What is the order of information processed by participants on the site? Does it match the intended user flow? ”
“ Are there any design elements confusing to the user? Are they diverting participants from reserving tickets? ”
“ What design elements are often overlooked by users? How can we highlight elements to help guide them? ”
Participants
Art Enthusiasts who Frequent Museums
Since our clients were interested in exploring how they could increase conversion through engaging their users, we defined our user personas as individuals seeking resources on the website and those generally curious about the museum.
Consequently, we chose to recruit participants within the young adult age group, who are more likely to be interested in museum exhibitions and activities, with “have visited museums before” as a core requirement.
Methodology
Eye-Tracking Test
To figure out if anything was getting in the way of users finding information and reserving tickets, we decided to go with eye-tracking as our main testing method. Setting up Tobii eye-tracking software was a super effective way to see exactly where users look, how long they focus on different elements, and the path their eyes follow on the screen.
Once our participants were calibrated and comfortable, we guided them through 4 tasks. We encouraged them to take their time and simply raise their hand when they felt they had completed each task. Each task was thoughtfully designed to mirror real user flows that aligned with our research goals, focusing on how people navigate exhibitions, explore the museum's history, and locate ticketing information.
Afterwards, a post-test walkthrough, or Retrospective Think Aloud (RTA), was conducted: participants were asked to think and talk aloud about how they performed the tasks while watching a replay of their session.
Example of gaze replay: Participant looking through the contents of an exhibition.
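Under the hood, an eye-tracker reports gaze as a stream of timestamped (x, y) samples, and metrics like "how long users focus on an element" come from summing time spent inside each area of interest (AOI). Tobii Pro Lab computes this (plus fixation filtering) for you; the sketch below is only a simplified illustration of the idea, with hypothetical AOI names and a fixed sampling interval.

```python
# Simplified sketch: accumulate gaze dwell time per area of interest (AOI).
# Real pipelines (e.g. Tobii Pro Lab) first run a fixation filter; this
# just sums raw sample durations at an assumed fixed sampling rate.

from typing import Dict, List, Tuple

# An AOI is a named screen rectangle: (left, top, right, bottom) in pixels.
AOI = Tuple[int, int, int, int]

def dwell_time_ms(
    samples: List[Tuple[float, float]],   # gaze samples as (x, y) pixels
    aois: Dict[str, AOI],
    sample_interval_ms: float = 16.7,     # ~60 Hz tracker assumed
) -> Dict[str, float]:
    """Total time the gaze spent inside each AOI, in milliseconds."""
    totals = {name: 0.0 for name in aois}
    for x, y in samples:
        for name, (left, top, right, bottom) in aois.items():
            if left <= x <= right and top <= y <= bottom:
                totals[name] += sample_interval_ms
    return totals
```

Aggregating these per-AOI totals across participants is essentially what a heat map like the one above visualizes.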
System Usability Scale (SUS) Surveys
At the end of the eye-tracking test, we wanted to gather additional insights while participants still had their fresh impressions and feelings about the website. To do this, we deployed a 10-question survey designed to capture their overall experience.
The survey not only provided a chance to gauge their sentiments about key aspects of the site but also served as an extra layer of quantitative data to complement the qualitative insights we had already gathered from the eye-tracking sessions. By combining these two data sources, we could get a more complete picture of how users interacted with the website and how they felt about it as a whole, allowing us to uncover and further validate some points brought up during the testing sessions.
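The SUS score itself comes from a standard formula: odd-numbered (positively worded) items contribute (response − 1), even-numbered (negatively worded) items contribute (5 − response), and the sum is multiplied by 2.5 to land on a 0–100 scale. A minimal scorer (the function name is ours, but the formula is the standard one):

```python
def sus_score(responses):
    """Score one 10-item SUS questionnaire (1-5 Likert responses, in order).

    Odd-numbered items contribute (response - 1); even-numbered items
    contribute (5 - response). The sum is scaled by 2.5 to give 0-100.
    """
    if len(responses) != 10:
        raise ValueError("SUS has exactly 10 items")
    total = sum(
        (r - 1) if i % 2 == 0 else (5 - r)  # i is 0-based, so even i = odd item
        for i, r in enumerate(responses)
    )
    return total * 2.5
```

Averaging this score across all eight participants is what yields the study-level mean reported in the findings below.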
NN/g Severity Rating
After gathering ample data from testing and surveys, we transitioned into the synthesis phase to organize and analyze our findings. In addition to qualitatively sorting insights, we applied Nielsen Norman’s Severity Rating to systematically prioritize issues based on their impact and frequency. This method provided a clear framework for identifying the most critical problems, guiding our prioritization when addressing issues.
With 4 hours of research notes and observations to process, we divided the workload among team members, assigning each person 2 participants to analyze. This collaborative approach allowed us to work efficiently, cutting down individual analysis time before regrouping to consolidate insights.
The use of the Severity Rating not only streamlined prioritization but also made it easier to synthesize and manage the large volume of findings, helping us move from raw data to actionable recommendations.
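In practice this ranking boils down to tagging each issue with a 0–4 severity (0 = not a problem, 4 = usability catastrophe) based on frequency, impact, and persistence, then sorting. The ratings themselves are researcher judgment calls; the sketch below only illustrates the bookkeeping, and the data shown is hypothetical, not our actual findings.

```python
# Hypothetical sketch: sorting usability issues by an NN/g-style
# severity rating. Severity (0-4) is assigned by researchers based on
# frequency, impact, and persistence; this just ranks the results.

from dataclasses import dataclass
from typing import List

@dataclass
class Issue:
    description: str
    participants_affected: int  # out of 8 in our study
    severity: int               # 0-4 NN/g rating, assigned by researchers

def prioritize(issues: List[Issue]) -> List[Issue]:
    """Highest severity first; ties broken by how many participants hit it."""
    return sorted(
        issues,
        key=lambda i: (i.severity, i.participants_affected),
        reverse=True,
    )
```

Sorting by a shared, explicit key like this is also what made it easy to merge each researcher’s per-participant notes into one prioritized list.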
Findings
Overall Sentiment
After reviewing feedback from all eight participants, we found that cooperhewitt.org has a mean SUS score of 58.4, slightly below the benchmark of 68. Looking more closely at Usability and Learnability scores, the site performed well in learnability but lower in usability. This aligns with the overall impressions shared by our participants. We observed that the website’s language and information architecture were generally intuitive, allowing participants to navigate with relative ease. However, when asked to complete specific tasks, participants experienced minor difficulties.
This indicates that while the website supports initial exploration and understanding effectively, there are opportunities to improve task-specific workflows to make actions feel more seamless.
Content Overload, Unclear Titles
One of our tasks asked our participants to find information about the Cooper Hewitt Museum, focusing on the footer of the site. 7/8 participants failed to complete this task.
As seen from the heat map we rendered through Tobii, most participants’ attention within the footer section was concentrated at its very top. To combat this, repetitive or less important content should be removed to reduce the diversion of attention as users navigate the footer.
Furthermore, the correct destination for the task, labeled “Welcome to Cooper Hewitt,” did not match our participants’ mental models.
Changing the title to “About Cooper Hewitt” would follow a design pattern consistent with most websites and communicate more clearly to the user.
Call-To-Action Elements are Difficult to Find
We found that each exhibition page uses the same layout, where call-to-action buttons such as “Purchase Tickets” are placed at the bottom of the screen. Depending on the length of content on a page – and most exhibition pages are long – users might or might not see the call-to-action to reserve tickets. From our task results, only 3/8 participants scrolled to the bottom of the screen and used the “purchase tickets” call-to-action link. One participant even said, “I was looking for a button that would allow me to buy tickets but I couldn't find one,” not having scrolled to the bottom of the page.
To combat this, placing all important call-to-action elements above the fold will allow users to easily access them when needed.
Streamline Information Visually, Top Down.
When tasked to reserve a ticket, 5 out of 8 participants were not able to quickly decide on a date and time for their visit. They struggled to recognize that they had reached the correct page where they could select a date and time for their ticket purchase. The current layout of the "Reserve Tickets" page can seem confusing as it presents a lot of information at once, requiring users to swipe multiple times to reach the section where they can choose their desired time slots.
Additionally, the "Calendar" page does not follow the design pattern its name suggests. A top-down information architecture that features a traditional calendar will help streamline this process, as users will quickly understand that they are in the process of choosing a date and time for their visit.
Current view of the calendar view does not contain a visual calendar.
Users Struggled to Return to Main Site
A task given to participants asked them to find a digital exhibition on Cooper Hewitt’s Digital Exhibition microsite. For this task, 7/8 were able to find it using the navigation bar; however, once they had finished scrolling through the content, we asked them to return to the homepage of the main website. Without clear notification, users may become frustrated when redirected to a different site, especially if they want to get back to the previous page or home page.
To address this, adding some notification or banner at the top of the Digital Exhibition page can clearly inform users that they are in a microsite. The banner should also provide an easy way for users to return to the main Cooper Hewitt Museum homepage.
Participant rage clicking as they could not get back to the main website.
Impact
Implemented Changes on cooperhewitt.org! Updated Nov 2024*
1. Website footer is now streamlined. “Welcome to Cooper Hewitt” is changed to “About Cooper Hewitt”.
2. Call-to-Action is no longer hidden at the bottom of the page, and is placed after the exhibition description.
3. Ticket reservation page now includes a traditional calendar view that allows users to quickly pick a date and time.
4. Digital Exhibition microsites now redirect back to the main website, though a banner isn’t implemented.
Reflections
This project offered a great opportunity to strengthen qualitative methods with quantitative data. Using Tobii Pro Lab, our team gained valuable insights into eye tracking, including the technical aspects of setting up and conducting tests on mobile devices.
The team at Cooper Hewitt was impressed by the insights we presented and excited to dive deeper into them. It was especially rewarding to see the museum apply our research findings to improve the site.
Overall, this project was a successful and productive experience that enhanced both team collaboration and communication skills.