My role in the project
A bit of background…
legalsuper is a Melbourne-based superannuation fund that predominantly serves the legal industry.
GMG was appointed to redesign their website, and I was assigned as UX Lead on the project. I was 100% on the tools for the entire wireframing stage, and I was tasked with getting across all research and analysis conducted by the GMG Strategy Team.
As per the project scope, GMG was tasked with prototyping and user testing the wireframes before the UI and development phase. My role was to create the prototypes and conduct user testing.
WHAT WE SET OUT TO ACHIEVE
We wanted to test our wireframes and ensure the functionality was going to serve customers adequately.
We had to record all feedback from a range of existing legalsuper customers (young, middle-aged and close to retirement) and flag any potential issues before the UI and development phase. At this point in the project, I'd been guided by insights from the client and customer research documentation. I had yet to speak to any customers face-to-face, so I was eager to see if my wireframes would hold up under the scrutiny of actual users.
PREPARING FOR TESTING
I referred to a lot of material by Steve Krug on conducting usability testing, in particular his book "Rocket Surgery Made Easy". Some of the critical things I needed to consider were:
- Identify what I was going to be testing
- Create a list of tasks
- Decide what kinds of users we needed
- Figure out a methodology to record and analyse the results
I worked closely with legalsuper's Manager of Insights and Strategy, who was in charge of recruiting the users for testing and dealing with them directly. The main breakdown of users was between employers and members. We also wanted to make sure that the various life-stages were represented:
- Members considering retirement
- Members wanting to get started with super
- Members wanting to grow their super
CREATING THE PROTOTYPE
All wireframes were built in Sketch, which meant I had to choose which prototyping software to use to bring them to life. I had prior prototyping experience with InVision, so I decided to stick with it to avoid complicating the process. I gave direction to GMG designer, Brooke, who did a lot of the heavy lifting in building out the interactive menus and pages.
We also created a mobile prototype to cater for mobile users. According to Google Analytics, traffic was split roughly 50/50 between mobile and desktop.
IDENTIFYING WHAT WAS GOING TO BE TESTED
I worked closely with GMG Strategist, Grace, who was in charge of creating the test questions. We collaborated to come up with the most important aspects of the website that needed to be tested.
Test questions were prioritised based on how frequently the average user would need to perform the task. This included:
- Accessing the home page
- Logging in to their account
- Checking fund performance
- Researching how much super is required to retire
- Contacting legalsuper
- Finding a form
RECORDING THE TESTS
Recording and analysing the results of user tests can be difficult. If you're not recording the video or audio of the test and you're only taking notes, you might miss something important. If you are recording video and audio and not taking notes, it can be very time consuming to watch each test and provide annotations.
I was going to be conducting the tests on my own at the client's office on William St, Melbourne. I decided to focus entirely on the reactions of the customer rather than trying to multitask by taking notes at the same time.
I used my laptop to record the screen and audio, so I could watch it back later and take notes in a more controlled environment. This method proved successful, as nothing competed for my attention during the sessions. I have seen it work with a note-taker alongside a testing facilitator, but I prefer to have fewer people in the room to make it feel less formal and remove unnecessary pressure on the user.
MAKING USERS FEEL COMFORTABLE IN A STRANGE ENVIRONMENT
I learnt some lessons about how to make users feel more comfortable in a potentially uncomfortable and inconvenient situation:
- Be clear with all expectations and requirements. I ran over what was required with each user in a high level of detail to put their minds at ease and align expectations from the start.
- Allow them to ask questions (no matter how dumb or seemingly insignificant). I avoided using terminology or abbreviations that only UX people would understand.
- Allow them to go on tangents. Part of empathising with users means listening to what they want to talk about — no matter how off-topic it may be. I found this to be a great way of building trust with the user over a short period of time.
ANALYSIS AND SORTING
I collaborated with GMG Strategist, Pola, on the best way to sort and categorise all feedback. I wanted a solution that would enable her to work as efficiently as possible and allow for collaboration and input from the broader team.
I recommended Miro, and we used sticky notes and artboards to sort the responses recorded in the video screen captures.
We initially grouped each comment or piece of feedback under its related section using affinity diagrams. We then colour-coded the sticky notes to mark negative comments (red), positive comments (green) and neutral comments (white). We also added each user's name to their sticky notes so we could identify any persona trends.
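The sorting scheme described above can be sketched as a simple data structure. This is a hypothetical illustration only (the actual work was done visually in Miro, and the sample comments below are invented for the example):

```python
from collections import defaultdict

# Sentiment-to-colour mapping matching the sticky-note convention
COLOURS = {"negative": "red", "positive": "green", "neutral": "white"}

# Hypothetical sample of recorded feedback: (user, section, sentiment, comment)
feedback = [
    ("Alice", "Login", "negative", "Couldn't find the login button"),
    ("Bob", "Login", "positive", "Logging in felt straightforward"),
    ("Alice", "Fund performance", "neutral", "Chart was fine, nothing special"),
]

# Affinity-diagram grouping: comments bucketed by site section,
# each note tagged with its colour and the user's name
board = defaultdict(list)
for user, section, sentiment, comment in feedback:
    board[section].append({
        "colour": COLOURS[sentiment],
        "user": user,
        "comment": comment,
    })

for section, notes in board.items():
    print(section)
    for note in notes:
        print(f"  [{note['colour']}] {note['user']}: {note['comment']}")
```

Keeping the user's name on every note is what makes persona-trend spotting possible later: you can filter the board by person as well as by section or sentiment.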
At this point, GMG hadn't created a standardised document or process for user testing. To improve efficiencies for future projects, I decided to work on a document structure that would be easily understood and utilised by the client. Research is only as good as the final presentation, and I didn't want to undermine our research by presenting a sloppy report.
The structure I shaped went as follows:
- Purpose & methodology. It's essential to be inclusive of all people who may be viewing the final report. Assuming that everyone knows the purpose and methods for user testing is a sure way to lead to confusion.
- Participants. We had to state the key demographics involved in the user testing to make sure all customer groups were being well represented.
- Key issues: overview. Not everyone would be interested in going through the 79-page report to identify the key issues and takeaways that needed attention. We highlighted the key issues and provided them in a 5-page overview.
- Tasks for members. We needed to state what the tasks were so we could be clear on what was successful and what needed reviewing.
- Tasks for employers. Same as the previous point.
- What we found. This was where we provided suggestions that the client could take or leave.
- Appendix (personas). As a bonus, we decided to create personas from each user we tested so they could be referenced if someone wanted to know a bit more about their background. Personas are often assumption-based; these were based on actual responses from real users.
THE RESULTS
- The client was pleased with the results and the validation of the design decisions we had made during the wireframing phase.
- Some usability issues related to future-state functionality that wouldn't be implemented in phase 1 of the website launch. The document we produced flagged these so they could be revisited in later phases.
- There were a few aspects of the wireframes that required minor adjustments. We were able to make these changes and prepare the final wireframes for the UI phase.
- GMG had never conducted such an in-depth user testing piece before and could now use the document templates and proven processes for future projects.