With limited time and resources, we followed a lean version of the double diamond to keep the project focused.
The first step was to fill in the knowledge gaps around the current experience and understand the problem in detail. How accurate were the analytics? Where were the gaps? Were there anecdotal issues we weren't aware of?
Essentially, we looked for evidence that could provide us with a hypothesis for why users were tapering off.
With a broad set of questions framed, we started with the data. A quantitative analysis via Heap revealed a key area of concern: only 60% of users who entered their email continued on to the next step of the modal.
With a limited timeframe, sourcing platform users for interviews was out of the question, so we turned to Fullstory for a qualitative analysis, reviewing screen recordings of users signing up. We screened a pool of 10 new users and quickly identified several task hurdles and a mountain of missed opportunities.
In summary, the current sign-up process had a clunky UI, misled users with poor communication, and in some cases frustrated them into rage clicking on even basic tasks. We took note of these roadblocks and pushed on.
Before converging our findings, we looked outward. It was important to get a baseline before acting on our research. What were the biggest and best doing? What could we learn from them?
This research was vital in identifying opportunities we never would have considered and served as inspiration.
With research complete, it was time to converge our findings. While we identified many specific problems, we grouped them into two insights.
We found that the current UI design was creating a poor experience for sign-ups and, in some cases, blocking users from completing tasks. For example, when a password was entered, error messages were presented one at a time and only as they occurred, instead of presenting all password conditions up front. This meant users could go round after round of errors, resulting in rage clicking.
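As a rough sketch of the fix we had in mind (the rules and names here are hypothetical, not the product's actual validation), evaluating every password condition in one pass lets the UI show the full checklist immediately:

```ts
// Hypothetical password rules; the real product's conditions may differ.
type Rule = { label: string; test: (pw: string) => boolean };

const rules: Rule[] = [
  { label: "At least 8 characters", test: (pw) => pw.length >= 8 },
  { label: "Contains an uppercase letter", test: (pw) => /[A-Z]/.test(pw) },
  { label: "Contains a number", test: (pw) => /\d/.test(pw) },
  { label: "Contains a symbol", test: (pw) => /[^A-Za-z0-9]/.test(pw) },
];

// Return every unmet condition at once, so users see the whole
// checklist up front instead of discovering errors one at a time.
function validatePassword(pw: string): string[] {
  return rules.filter((rule) => !rule.test(pw)).map((rule) => rule.label);
}
```

With this shape, the UI can render all conditions as a live checklist and tick them off as the user types, rather than trading error messages back and forth.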
Another example was poor mobile design that made it extremely difficult to select a field without accidentally selecting others.
While the task flow was theoretically simple, the experience and delivery of these tasks was disjointed, with users being thrown back and forth between web pages. A good example of this was completing the sign-up, where the confirmation/thank-you page opened in a new browser tab. While not completely damning, this still left some users clicking wildly and unsure of their next steps.
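A minimal sketch of the alternative (the route name is hypothetical): navigating in place keeps the user's context intact instead of orphaning the original tab:

```ts
// Keep the confirmation step in the same tab. Opening a new tab
// (e.g. via window.open) leaves the original page hanging and users
// unsure which window to follow; an in-place navigation avoids that.
function goToConfirmation(): void {
  window.location.assign("/signup/confirmation");
}
```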
With the problems clearly defined and many opportunities identified, we needed to choose what would give us the highest value for the lowest effort.
We wanted a high-level view of what could feasibly be achieved before committing to a design scope, so we created several task flows: one of the existing flow; another that only solved the current problems, no bells or whistles; and an ideal flow with everything we wanted to achieve.
As always, the answer was somewhere in between these flows. However, the exercise gave us a clear direction and we moved straight into mid-fidelity designs. It's here where we started to look at details, e.g. changing the user type selection from a dropdown to a more visual interaction to increase engagement and improve the experience of personalisation.
Another example was exploring dividing the form fields into their own individual steps, to keep each task as simple as possible and mobile friendly.
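A minimal sketch of that stepped structure (step names and fields are hypothetical):

```ts
// Break the sign-up form into small, single-purpose steps so mobile
// users face one focused task at a time rather than one long form.
type Step = { id: string; fields: string[] };

const steps: Step[] = [
  { id: "account", fields: ["email", "password"] },
  { id: "profile", fields: ["name", "userType"] },
  { id: "plan", fields: ["subscriptionPlan"] },
];

let current = 0;

// Advance only after the current step's fields validate (validation
// itself is omitted here); return null when there is nowhere to go.
function next(): Step | null {
  return current < steps.length - 1 ? steps[++current] : null;
}

function back(): Step | null {
  return current > 0 ? steps[--current] : null;
}
```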
With the UI structure completed, we started applying our design system and workshopping content. This involved creating illustrated icons to represent our user segments and redesigning our pricing modal (a parallel project).
With the design completed and a Figma prototype created, we started internal user testing and made some informative discoveries. Pricing stood out as a surprising roadblock. Our initial designs reflected a value-driven approach, where we tried to move attention away from a direct comparison of subscription plans. There was a lot of information for a user to digest in the context of a sign-up, and we felt it would only slow users down in getting onto the platform.
However, testing showed that our first design solution was confusing to users and didn't convey the value we intended. This led us back to a more traditional price comparison layout, but we pulled back some content to reduce cognitive overload and the risk of users not engaging.
In summary, we tested across several teams, including development, sales, and marketing. Working remotely, we simply asked testers to sign up and talk us through their thoughts, recorded via Loom.
As well as internal testers, we asked some external users who had never interacted with our product, for a more unbiased view. All testing revealed valuable insights that were taken into account and applied to the final outcome.