The OneStop portal allows Maryland citizens to submit applications online, providing a one-stop shop for their business, personal, and recreational applications. Over the course of three years, the site went from receiving 20,000 applications a year to 300,000, a number that grew even more as the portal became the hub for COVID-19 relief grants, COVID-19 vaccine registration, and medical marijuana licenses. With this influx of users came an increase in support tickets. This was an opportunity for us not only to understand what issues our users were experiencing, but also to try to fix them.
The support team was at the forefront of user interaction and reached out to UX for help. The majority of users contacting support had never used the platform before and ran into a variety of issues. We saw this as an opportunity to focus on new users: if we could make the portal easy to use the first time, we could encourage users to come back and use it again. With that, we decided to make first-time users the focal point of a usability study of the portal.
We knew there were many directions we could take at this point, and we wanted to make an informed decision before selecting a path. We held information-gathering sessions with four client solutions teams and our customer support team, and analyzed help tickets.
These sessions aimed to collect information about the overall experience and the pain points our user-facing groups had observed. The client solutions sessions covered pain points for users submitting applications to each team's department, while the customer support session focused on end users across the entire portal.
Customer support supplied us with over 100 emails and tickets from end users. The three biggest pain points we saw related to application requirements and clarification, application status updates, and application corrections.
After our sessions and analysis, we consolidated our notes and created an affinity diagram. We voted on what we assumed to be the biggest risks for end users when using the portal and used those assumptions to create our testing guide.
We analyzed the data from these tests using the session videos and notes. A task analysis was conducted for each test, categorizing performance on tasks 2-7 as independent completion, completion without critical errors, or critical errors committed. Based on these results, we identified four areas for improvement.
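To make that categorization concrete, here is a minimal sketch, with made-up data rather than the study's actual records, of how per-task outcomes could be tallied into those three categories; the type names and sample results are hypothetical.

```typescript
// A minimal sketch (hypothetical data, not the study's records) of tallying
// task-analysis outcomes into the three categories described above.

type Outcome = "independent" | "no critical errors" | "critical errors";

// One participant's result for one task (tasks 2-7 in the study).
interface TaskResult {
  task: number;
  outcome: Outcome;
}

// Count how many participants fell into each category for every task,
// which makes the weakest tasks easy to spot.
function tally(results: TaskResult[]): Map<number, Record<Outcome, number>> {
  const counts = new Map<number, Record<Outcome, number>>();
  for (const r of results) {
    const row = counts.get(r.task) ??
      { "independent": 0, "no critical errors": 0, "critical errors": 0 };
    row[r.outcome] += 1;
    counts.set(r.task, row);
  }
  return counts;
}

// Example with made-up results for task 2.
const sample: TaskResult[] = [
  { task: 2, outcome: "independent" },
  { task: 2, outcome: "critical errors" },
  { task: 2, outcome: "no critical errors" },
];
console.log(tally(sample).get(2));
// -> { independent: 1, "no critical errors": 1, "critical errors": 1 }
```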
The recommendations for this effort were split into two categories: technical solutions, and content and visibility guidelines. Technical solutions included making the verify message more noticeable, moving "Apply Online" after the instructions, and enabling WYSIWYG on profile pages to allow for visual hierarchy in content-heavy sections. Content and visibility guidelines included creating placeholders and tooltips on applications for things like dates, minimum/maximum characters, and minimum/maximum numbers, and writing clear validation errors that specify what is needed to pass validation.
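As an illustration of that last guideline, here is a minimal sketch, assuming hypothetical field names and copy rather than the portal's actual code, of how placeholders, tooltips, and specific validation messages can be paired on a form field.

```typescript
// A minimal sketch (hypothetical field names and messages, not the portal's
// actual code) of the placeholder/tooltip/validation pattern recommended above.

interface FieldRules {
  placeholder: string; // example input shown in the empty field
  tooltip: string;     // short explanation of what the field expects
  minLength?: number;
  maxLength?: number;
}

// Clear validation errors state exactly what is needed to pass,
// instead of a generic "invalid input" message.
function validate(value: string, label: string, rules: FieldRules): string | null {
  if (rules.minLength !== undefined && value.length < rules.minLength) {
    return `${label} must be at least ${rules.minLength} characters (currently ${value.length}).`;
  }
  if (rules.maxLength !== undefined && value.length > rules.maxLength) {
    return `${label} must be no more than ${rules.maxLength} characters (currently ${value.length}).`;
  }
  return null; // passes validation
}

// Example: a business-name field with guidance surfaced before the user types.
const businessName: FieldRules = {
  placeholder: "e.g., Chesapeake Consulting LLC",
  tooltip: "Enter the legal name exactly as registered, 2-80 characters.",
  minLength: 2,
  maxLength: 80,
};

console.log(validate("A", "Business name", businessName));
// -> "Business name must be at least 2 characters (currently 1)."
```

The design choice worth noting is that each error message names the rule that failed and the field's current state, so users know exactly how to correct it.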
We created prototypes that updated the following areas:
We conducted unmoderated A/B testing with two prototypes. The flow, verification message, and scenario were identical; the only difference was that one group had a sticky apply button while the other did not. Overall, the results were positive. No users used or even noticed the sticky button, so it was not included in the final design.
After testing, we made changes to the prototype and began writing user stories. The solution was included in Q3 2020 sprint planning and implemented in Q4 2020. In Q2 2021, customer support notified us that there had been a 27% reduction in help tickets related to account registration.
In February 2021, the site saw a massive influx of new users when it became the host for COVID-19 vaccine pre-registration. Traffic grew to over 2 million visitors a month, with about 1 million applications submitted in the span of two months. Tickets related to registration declined as a percentage of the total compared to Q1 2020.
This effort was one of the first large-scale usability efforts conducted by the newly expanded UX team and was a catalyst for many efforts going forward. Along the way, there were moments of miscommunication that caused tension. This led the team to reevaluate our communication, feedback, and expectations. While it was uncomfortable in the moment, it sparked an effort that set feedback guidelines, expectations, and communication tips, artifacts that became excellent guides as critiques expanded beyond our team.
The pool of participants did not represent our ideal audience. Recruitment was a struggle, and we did not get many participants who were not employees. Employees, while more readily available, were biased toward portraying the site more positively. This led our team to discuss and set clearer guidelines on whom to test with and what to test them on. The effort was important not only for the improvements to the site, but for helping set the foundation for the UX team's teamwork and communication.