Overhauling a legacy registration process
Earlier this year, I conducted usability testing in preparation for a redesign of the event registration system we've had in place for over five years.
Despite high satisfaction scores (90%+), our registration journey had real problems: a high drop-off rate on the first step of the process, and user complaints about long, drawn-out forms on the pages that followed. I set three goals for the redesign:
- Find and fix the drop-off rate issue on the first page
- Reorient the new design toward a more guided, journey-like process
- Uncover any areas of difficulty during registration sessions
I started by working with our developers and digital marketing team to set up Hotjar on our registration forms. Due to some legacy code, I wasn’t able to use the form analysis feature. I set up recordings and watched a sample of about 100 user interactions, which ended up being more helpful.
Users had a difficult time understanding what was needed on the first page because every field was marked “not complete,” with the label changing only once the field was filled in. The fields were also lined up side by side down the entire page rather than grouped by category. Users struggled to fill out the page efficiently and often had to go back to fix errors.
Wireframing a solution
I presented these findings to our registration, customer experience, and technology teams. I shared examples of each issue and drafted potential solutions. They immediately saw the potential of each improvement and provided feedback based on their experiences.
Organizing form fields
Users breezed through the form but missed many required fields. I believed the issue was two-fold: some fields sat off to the right where they were easily overlooked, and fields were not grouped by category. I addressed this by grouping related fields and setting each group apart visually, for instance with a gray background behind the “name” fields.
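The grouping idea can be sketched in TypeScript. This is a minimal illustration, not the production code; the field names and categories here are hypothetical, not taken from the actual registration form.

```typescript
// Hypothetical field model; names and categories are illustrative only.
interface Field {
  name: string;
  category: string; // e.g. "name", "contact", "event"
}

// Collect a flat field list into one section per category, so related
// fields render together in a single column instead of side by side.
function groupByCategory(fields: Field[]): Map<string, Field[]> {
  const groups = new Map<string, Field[]>();
  for (const field of fields) {
    const bucket = groups.get(field.category) ?? [];
    bucket.push(field);
    groups.set(field.category, bucket);
  }
  return groups;
}
```

Each resulting group can then be rendered as its own visually distinct section, which is what the gray backgrounds accomplished in the wireframes.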
Removing visual noise
Every text input carried a “required” asterisk, even though all but two fields were required. I suggested marking those two fields as optional and leaving the rest plain to remove the clutter.
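The labeling rule is simple enough to state as a one-line helper. A sketch in TypeScript, with hypothetical field labels:

```typescript
// Since nearly every field is required, flag only the exceptions:
// required fields stay plain, optional ones get a suffix.
function fieldLabel(label: string, required: boolean): string {
  return required ? label : `${label} (optional)`;
}
```

Inverting the convention this way removes an asterisk from almost every field on the page while still telling users exactly which fields they may skip.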
Improving navigation
The existing process had a progress bar that showed only the percentage complete, and navigation consisted of nothing more than a “continue” and a “back” button at the bottom of the page. I proposed a progress bar at the top that listed the titles of forthcoming pages and highlighted the active one. This would also let users jump between pages in the process rather than navigating one step at a time.
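The proposed progress bar implies a small navigation model. The sketch below is an illustration under one added assumption: users can jump only to pages they have already reached, so later pages stay locked. The page titles are hypothetical.

```typescript
// Minimal model of the proposed step navigation.
interface Progress {
  pages: string[];  // titles shown in the progress bar
  active: number;   // index of the page currently displayed
  furthest: number; // highest index the user has reached so far
}

// Assumption: jumping is allowed only to pages already visited.
function canJumpTo(progress: Progress, target: number): boolean {
  return target >= 0 && target <= progress.furthest;
}

// Return updated state, ignoring jumps to locked pages.
function jumpTo(progress: Progress, target: number): Progress {
  if (!canJumpTo(progress, target)) return progress;
  return { ...progress, active: target };
}
```

With this model, the progress bar renders every title, links the ones at or below `furthest`, and leaves the rest inert, replacing the blind percentage bar with a map of the whole journey.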
Delivering to developers
Based on the feedback from partner groups, I shared wireframes with our event team and the association management staff involved in our test event. After incorporating another round of feedback, I made final updates to the wireframes and handed them off to our developers to implement.
Early comparisons with prior, similar events show a 7% increase in total completion rate and a 26% decrease in drop-offs on the problem page I set out to fix. Time on that page also decreased by 22%, indicating better efficiency for users.
Further A/B testing will give us a better understanding of how these changes affect user behavior, and ongoing satisfaction scores alongside other user feedback will help us gauge their impact on customer experience.