Pratt launched its redesigned site in the summer of 2022. Internal stakeholders were concerned that the application process for both undergraduate and graduate admissions was not making information easy to find across departments, and consequently was not encouraging prospective students to begin their applications. We used moderated usability testing with eye tracking to assess this concern. While the new site is visually attractive, we identified four areas of usability improvement to help prospective students find the information they need to start their application to Pratt.
We spoke with Sarah Hromack and Alex Weiss Hills, our client representatives from Pratt’s Digital Communication and Marketing Team. During that conversation we were able to understand the background of the project and the needs Pratt was facing.
We learned the site is meant to act as a recruiting tool that teaches prospective students about programs and encourages them to apply to Pratt. We conducted this usability study because internal stakeholders worried the newly designed site was hindering prospective students' ability to learn about programs and apply to them. This concern affects both undergraduate and graduate programs on the Pratt site. With that issue in mind, we began crafting our research study.
Understand how prospective graduate students interact with Pratt’s desktop website to identify the pain points experienced while navigating through the application process so that we can help them find relevant information that encourages them to apply to Pratt.
With that objective in mind we created four questions to guide our research:
Moderated usability testing allowed us to conduct a task-based test with participants at Pratt Institute's usability lab. We asked our participants to imagine themselves in a scenario and complete four tasks on the Pratt site, which let us observe their actions and difficulties navigating the site in real time.
Using Tobii eye tracking software, we recorded participants' fixations and saccades. The data helped our team understand where visual attention was focused on a page and track how it moved. The data collected included gaze replays, gaze maps, and heat maps, all of which were used to support the qualitative findings that arose during the usability sessions.
We ended our moderated usability testing with a retrospective think-aloud (RTA). RTAs let participants retroactively comment on their session by watching a replay and walking the moderator through their thought process. This method helps participants focus on completing the tasks as naturally as possible in a lab, and keeps their eyes on the screen during the session so the eye tracking software can capture every movement.
I volunteered to be on the research ops team. We were responsible for creating and distributing recruitment screeners along with managing the participant database for the study.
Two teams focused on recruiting prospective undergraduate students for the desktop and mobile sites, and the other two focused on recruiting prospective graduate students for the desktop and mobile sites.
Participant requirements included:
We received 49 initial responses and felt very hopeful about our participant pool. Unfortunately, 19 of the 49 respondents were unable to attend an in-person session.
Once it came time to schedule sessions, multiple potential participants canceled. Between the two graduate population groups, a total of five participants attended sessions as a result of our outreach efforts.
To stay on schedule for our client presentation date, the team eventually had to loosen our participant requirements so we would have time to run the sessions and analyze the data.
We attribute the poor recruitment experience to:
We were able to recruit 8 participants, 7 of whom matched our original criteria for the graduate desktop population. We secured these participants by reaching deep into our personal networks.
While recruitment was underway, we prepared our test materials: a consent form, a pre-test questionnaire, the scenario and tasks for our sessions, a System Usability Scale (SUS) form to be distributed after the task portion of the test, and our test script.
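For reference, the SUS yields a single 0–100 score from ten alternating positively and negatively worded Likert items. A minimal Python sketch of the standard scoring formula (the example responses are illustrative, not our study data):

```python
def sus_score(responses):
    """Compute a System Usability Scale score from ten 1-5 Likert responses.

    Odd-numbered items (1st, 3rd, ...) are positively worded: contribution = response - 1.
    Even-numbered items are negatively worded: contribution = 5 - response.
    The summed contributions (0-40) are scaled to 0-100 by multiplying by 2.5.
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 responses")
    total = 0
    for i, r in enumerate(responses):
        if not 1 <= r <= 5:
            raise ValueError("responses must be on a 1-5 scale")
        total += (r - 1) if i % 2 == 0 else (5 - r)
    return total * 2.5

# Example: a fairly positive response set
print(sus_score([4, 2, 5, 1, 4, 2, 4, 2, 5, 2]))  # → 82.5
```

Averaging these per-participant scores gives a quick quantitative complement to the qualitative session data.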
You are interested in attending graduate school and are exploring a couple of programs on Pratt Institute's websites to become familiar with the application process and figure out which program fits your interests.
Our team analyzed the data by creating a breakdown of notable information such as task completion times, observed difficulties, qualitative data from the RTAs, and participant quotes. We then placed all our insights onto a board and collectively created an affinity map, which helped us identify recurring issues. We discovered a total of 7 usability issues and assigned each a severity rating.
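To illustrate how a severity rating can be derived, here is a sketch of a common frequency-times-impact rubric (a hypothetical illustration, not necessarily the exact rubric our team applied; the issue names and numbers are made up for the example):

```python
# Hypothetical frequency x impact severity rubric for ranking usability issues.
def severity(frequency, impact):
    """frequency: fraction of participants affected (0-1);
    impact: 1 (cosmetic) to 4 (task-blocking).
    Returns a frequency-weighted severity, rounded to one decimal."""
    return round(frequency * impact, 1)

# Illustrative issues with (frequency, impact) estimates
issues = {
    "graduate tab hard to notice": (0.5, 4),
    "portfolio link looks like plain text": (0.63, 3),
}

# Rank issues from most to least severe
ranked = sorted(issues.items(), key=lambda kv: severity(*kv[1]), reverse=True)
for name, (freq, imp) in ranked:
    print(f"{severity(freq, imp)}  {name}")
```

Weighting frequency by impact helps separate widespread annoyances from rare but task-blocking failures when deciding what to report first.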
Two themes emerged from our analysis: element discoverability and misleading calls to action. While Pratt's new site is clean and seemingly well organized, participants displayed behaviors that matched our internal stakeholders' concerns. Most participants were able to complete the tasks on the site, but some did so with great effort. We decided to focus on our four most severe findings.
When searching for the Communication Design MFA, our participants navigated to the Program page, which lists all undergraduate and graduate programs on a single page under two tabs.
50% of our participants took 3+ minutes to find the Communication Design MFA program on the list. Most of that time was spent scanning, or navigating away and then looping back to the program list. Once they found the graduate tab, participants were able to quickly locate the information we asked them to search for.
"It's very small and it’s just at the top. It's easy to ignore" - Participant
To improve this issue, we recommend increasing the tabs' visibility and using standard tab styling that more closely matches users' mental models.
Changes we recommend:
We asked our participants to find out whether the Communication Design MFA application required a portfolio. 63% had difficulty locating the portfolio requirements. Participants would navigate to the Program page, make their way to the Graduate Admissions page, and spend time scrolling up and down that page attempting to locate the information. 38% of our participants navigated away from the Graduate Admissions page even after fixating on the portfolio header.
This behavior can also be seen in a heat map of the application requirements page: a lot of attention is given to the portfolio header, but much less to the link within the paragraph below it.
This indicated to our team that the header was pulling participants' attention, but the link did not stand out enough from the paragraph text for participants to recognize it as a link rather than plain text.
To improve this issue we recommend:
When we asked participants to begin their application, they mainly took two paths. On the first, they navigated to Graduate Admissions, made their way to the How to Apply page, and found where to submit their application on that page.
On the second, they went to their graduate program of choice, clicked the Apply to Pratt button, and were led to the Graduate Admissions page.
We witnessed our participants looping in circles between the Graduate Program page and the Graduate Admissions page.
Additionally, during the RTA multiple participants voiced their expectations when figuring out where to submit their application.
"I wish the button ‘apply to the program’ led you directly to the application form, I thought it would be helpful" - Participant
"I was taken to another ‘how to apply’ screen, I sorta had to navigate to try to find the application, to actually submit my application" - Participant
From the behavior witnessed and the RTAs, we identified a disconnect between where our participants expected to be taken when navigating the site and where they actually landed.
To match our users' mental model of where the Apply to Pratt page navigates to, we suggest simplifying the user flow.
To improve these issues we recommend:
The heat map and gaze map of the How to Apply page show that our participants relied heavily on the left navigation. The left navigation menu is full of actions, while the Submit Your Application section further down the page receives far less attention.
Our team interpreted this to mean the left navigation acted as an indicator of what users would expect to find on a page.
While Submit Your Application is located on the How to Apply page, it does not appear as an option in the left navigation. Participants relying on the left navigation did not scroll far enough down the page to find where they should submit their application.
To improve this issue we recommend:
Note:
We feel it would be beneficial to collect scroll map data for this page. The heat map suggests users are not scrolling far enough down the page to find where they start their application online. Confirming that a low percentage of users scroll past the fold would indicate a need to reorganize the information architecture of the page.
After not finding the information on the Program page, participants would search elsewhere on the site, which created a greater sense of effort.
Participants scanned the entire page for information, but typically read only the headers and skimmed large paragraphs of text.
Additionally, multiple heat maps of Program pages show that participants scrolled to the end of the page. This indicated to us that they were indeed interested in all the content on the page and in learning more about the program.
Our participants' behavior suggested they expected a Program page to be a one-stop shop for all information related to that program: not only an overview of the program, but also the materials needed to apply and more specific details, such as what their education in the program would be like.
To improve these issues we recommend:
"I really enjoyed this presentation, I thought the layout of it was really great." - Alex Weiss-Hills, Senior Web Developer at Pratt Institute
"I thought your negotiation with the apply user flow was really useful to us...it's interesting to take a look at data that kind of shows where that might have succeeded, and, moreover, where it might still require some work." - Sarah Hromack, Media Strategist at Pratt Institute
We presented our findings to our client via Zoom. Overall, our client was happy with the research. Because we presented after three other teams studying the same website, some clear usability trends had emerged across the four research teams. Being able to spot trends across different pages and interfaces will certainly help steer Pratt's Marketing and Communication team in the right direction when deciding which issues are most critical to address first.
Our biggest challenge during this study was participant recruitment. In theory, the fixes are simple: offer a higher incentive and run the study at a time that does not coincide with students' finals. The second fix is specific to this project, since a study's target population will only sometimes be students with these particular time constraints.
Lastly, the client had some thoughts about when users might want to read the site more thoroughly rather than skip those large paragraphs of text. As a next step, we suggested further research into exploratory users (users who are just starting to explore graduate school options). I would recommend a qualitative study method, as that user group's use of the Pratt site is not as heavily task-based.