CONTEXT

The OneStop Portal allows millions of Maryland citizens to submit state applications online. When the portal became the host for COVID-19 Relief Grant applications, the site saw a massive increase in users. As the number of users grew, so did the number of help tickets support received. With support working around the clock to respond to users' issues, they didn't have the time to identify the root causes of why users were reaching out. They contacted the UX team for assistance, and UX began an effort to fully understand these issues and identify solutions.

DISCOVERY

Usability study for first-time users of a government portal

The OneStop portal allows Maryland citizens to submit applications online, providing them a one-stop shop for their business, personal, and recreational applications. Over the course of three years, the site went from receiving 20,000 applications a year to 300,000, a number that grew even more as the portal became the hub for COVID-19 Relief Grants, COVID-19 vaccine registration, and medical marijuana licenses. With this influx of users, we saw an increase in support tickets. This was an opportunity for us not only to understand what issues our users were experiencing but to try to fix them.

The support team was at the forefront of user interaction and reached out to UX for help. The majority of users reaching out had never used the platform before and were running into a variety of issues. We saw this as an opportunity to focus on new users, so that we could make the portal easy to use the first time and encourage users to come back and use it again. With that, we decided to make first-time users the focal point of a usability study of the portal.

DATE:
DECEMBER 2020
TYPE:
USABILITY STUDY
ROLE:
RESEARCH LEAD

Research

We knew there were many different directions we could go in at this point, and we wanted to make an informed decision before selecting a path. We held information-gathering sessions with four client solutions teams and our customer support team, and analyzed help tickets.

Information-gathering sessions

These sessions aimed to collect information about the overall experience and pain points our end-user-facing groups had observed. The client solutions sessions focused on pain points for users submitting applications to each team's department, while the customer support session focused on end-users of the entire portal.

Client Solutions Guiding Questions
  • What recurring problems do you hear about from your clients?
  • Roughly how many users submit applications for this program?
  • What is the workflow, from start to finish, for this program's applications?
Customer Support Guiding Questions
  • What recurring problems do you hear about from your users?
  • Roughly how many help tickets do you receive a week?
  • What are the top reasons users reach out?

Help tickets

Customer support supplied us with over 100 emails and tickets from end-users. The three biggest pain points we saw related to application requirements and clarification, application status updates, and application corrections.

After our sessions and analysis, we consolidated our notes and created an affinity diagram. We voted on what we assumed to be the biggest risks for end-users when using the portal and used those assumptions to create our testing guide.

Using the generated assumptions as our guide, we created a research plan. We decided to conduct a usability study with first-time users of the site.
RESEARCH

Methodology

Participant outreach

Two groups of participants were identified for this test:
  • New employees: employees who had been working with us for less than two weeks and had not used the portal.
  • Friends and family of current employees who had submitted a state application: messages were sent in Slack asking whether any employees had family or friends who would be interested. Potential participants were sent a screener to confirm fit.

Screener criteria

  • Have submitted a license, permit, registration, or application within the past five years.
  • Not be an employee of the State of Maryland.
  • Be comfortable submitting secure information online.
  • Be comfortable being recorded.

Participant breakdown

Participants' employment
Participants' submission experience

Testing setup

Thirty-minute sessions were scheduled with each participant. Participants were told they needed to apply for a permit to rent a park for an event they were planning to host, and that they had found a link to the application on a state website.

Tasks

There were seven tasks total. Participants went through the initial submission process and gave feedback on the portal.
ANALYSIS

Results

General

We used the videos and notes to analyze the data from these tests. A task analysis was conducted for each session, categorizing each of tasks 2-7 into one of three outcomes: independent completion, completion without critical errors, or critical errors committed. Based on these results, we identified four areas for improvement:

  • Account verification message: One task had users verify their account before submitting the application. Verification is required by the system, and users cannot submit an application until they complete it. Of the 7 users who committed critical errors, 6 either did not notice the banner or assumed it was a success message.
  • Navigation after registering: Participants were taken to the homepage after registering; 55% of participants assumed they would remain on the application page. When trying to find the application again, 78% did not remember which agency they had come from.
  • Profile page layout: 66% of participants did not like the density of content displayed at the top of the form profile page, with one participant commenting that they "felt a lot of pressure when I first entered the page." 66% of participants also noted that the "Apply Online" button would be better in another location.
  • Submission page navigation and validations: After filling out the application, participants were navigated to a review page labeled "All Pages"; 77% of participants said this was unexpected. A min/max field validation was included on the application; 55% of users commented that they wished they had been told beforehand what the min/max value for a field was so they could have avoided the error (this information was included on the application's detail page).

Usability Metrics
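To make the task-analysis categories concrete, here is a minimal sketch of how the per-task percentages could be tabulated. The participant outcomes below are illustrative placeholders, not the study's actual data, and the task names are invented for the example.

```typescript
// A sketch of the task-analysis tabulation. The outcome categories mirror
// the three used above; the participant data is a placeholder, NOT the
// study's actual results.
type Outcome = "independent" | "non-critical errors" | "critical errors";

// One outcome per participant for each of tasks 2-7 (two tasks shown here).
const taskResults: Record<string, Outcome[]> = {
  "Task 2: register an account": [
    "independent", "critical errors", "independent", "non-critical errors",
    "independent", "critical errors", "independent", "independent", "independent",
  ],
  "Task 3: verify the account": [
    "critical errors", "critical errors", "independent", "critical errors",
    "non-critical errors", "critical errors", "critical errors", "critical errors", "independent",
  ],
};

// Count how many participants fell into each outcome for a task.
function tally(outcomes: Outcome[]): Record<Outcome, number> {
  const counts: Record<Outcome, number> = {
    independent: 0,
    "non-critical errors": 0,
    "critical errors": 0,
  };
  for (const outcome of outcomes) counts[outcome]++;
  return counts;
}

// Express each outcome as a percentage of participants, the form the
// findings above use (e.g. "55% of participants...").
for (const [task, outcomes] of Object.entries(taskResults)) {
  const counts = tally(outcomes);
  const summary = (Object.entries(counts) as [Outcome, number][])
    .map(([category, n]) => `${category}: ${Math.round((n / outcomes.length) * 100)}%`)
    .join(", ");
  console.log(`${task} -> ${summary}`);
}
```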

Recommendations

The recommendations for this effort were split into two categories: technical solutions, and content and visibility guidelines. Technical solutions included making the verify message more noticeable, moving "Apply Online" to after the instructions, and enabling WYSIWYG on profile pages to allow for visual hierarchy in content-heavy sections. Content and visibility guidelines included creating placeholders and tooltips on the application for things like dates, min/max characters, and min/max numbers, and writing clear validation errors that specify what is needed to pass validation.
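As an illustration of the validation guideline, here is a hypothetical sketch of a min/max check that states the allowed range before the user types (as a placeholder or tooltip) and repeats it in the error message. The field name and limits are invented for the example.

```typescript
// A hypothetical example of the validation guideline: surface the allowed
// range up front and echo it in the error, rather than only rejecting the
// input. The field name and limits below are invented for illustration.
interface FieldRule {
  label: string;
  min: number;
  max: number;
}

const attendees: FieldRule = { label: "Expected attendees", min: 1, max: 250 };

// Hint text shown as the field's placeholder or tooltip.
function hint(rule: FieldRule): string {
  return `Enter a number between ${rule.min} and ${rule.max}.`;
}

// Returns a specific error message, or null when the value passes.
function validate(rule: FieldRule, value: number): string | null {
  if (Number.isNaN(value)) {
    return `${rule.label} must be a number between ${rule.min} and ${rule.max}.`;
  }
  if (value < rule.min || value > rule.max) {
    return `${rule.label} must be between ${rule.min} and ${rule.max}; you entered ${value}.`;
  }
  return null;
}

console.log(hint(attendees));          // Enter a number between 1 and 250.
console.log(validate(attendees, 300)); // Expected attendees must be between 1 and 250; you entered 300.
```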

A/B Testing with prototypes

We created prototypes that updated the following areas:

  • Verification notification. This was the task with the lowest success rate. It was originally a green banner, which made some participants think it was a login success message. We changed it to a page with a message instructing users to check their email.
  • Collapse and expand of sections. Participants commented on the overwhelming amount of content when first navigating to a detail page. We redesigned these sections to display as collapsed by default.
  • Updated layout. Originally, the apply button was under the information section. Four participants said this didn't feel like a natural progression, so we moved the apply option to after the instructions.

We conducted unmoderated A/B testing with two prototypes. The flow, verification message, and scenario were exactly the same; one group had a sticky apply button, while the other did not. Overall, the results were positive. No users used or noticed the sticky button, so it was not included in the final design.

With sticky button
Without sticky button
IMPLEMENTATION

Validation: fewer support tickets

After testing we made changes to the prototype and began writing user stories. The solution was included in Q3 2020 sprint planning and implemented in Q4 2020. In Q2 2021, Customer Support notified us that help tickets related to account registration had declined 27%.

In February 2021 the site saw a massive influx of new users when it became the host for COVID-19 vaccine pre-registration. Traffic grew to over 2 million visitors a month, with about 1 million applications submitted in the span of two months. Even so, registration-related tickets declined as a share of total tickets compared to Q1 2020.

POST MORTEM

Bonus: team process improvements

This effort was one of the first large-scale usability efforts conducted by the newly expanded UX team and was a catalyst for many efforts going forward. Along the way, there were moments of miscommunication that caused tension. This led the team to reevaluate our communication, feedback, and expectations. While it was uncomfortable in the moment, it started an effort that set feedback guidelines, expectations, and communication tips: artifacts that became excellent guides as critiques expanded beyond our team.

The pool of participants did not represent our ideal audience. Recruitment was a struggle, and we did not get many participants who were not employees. Employees, while more readily available, showed a bias toward presenting the site in a more positive light. This led our team to discuss and set clearer guidelines on who to test with and what to test them on. This effort was important not only for the improvements to the site, but for helping set the foundation of the UX team's teamwork and communication.