My role in the project

Interaction Design: 80%
Wireframing: 20%

A bit of background…

Forty Winks is an Australian speciality bedding retailer that offers customers an impressive range of bedding-related furniture and accessories. They sell their products both in-store and online, and they were in desperate need of an online store refurb.

With a firm foundation of UX research conducted by our XD team, I was given the task of creating "delightful" animations for each key user interaction for the desktop and mobile versions of the online store. I was also tasked with assisting the XD team with wireframing and flagging potential user interaction issues.

 

 

The Challenges

This was a super exciting project to work on, mainly because in my past roles, adding "delight" was viewed as an afterthought rather than a necessity. The main challenges were as follows:

  • Figuring out precisely what makes an interaction "delightful" as opposed to "annoying" or "for the sake of it".
  • Bridging the gap between the XD team and development team to ensure interactions were applied to the end product as intended.
  • Learning what the development team needed from the design team to provide animations/mockups in a format that reduced rework and confusion.
 

 

The inspiration

I really love what Google has done with their Material Design documentation, in particular around motion. I used this as a basis for most of my decisions for the motion design on the Forty Winks website, simply because I trust Google's motives. They know what makes a good website (desktop and mobile) because it's their job to know.

An excerpt from the motion section of the Google Material Design documentation. It's full of great examples of what to do and what not to do.

 

 

 

Planning

The Forty Winks online store incorporated thousands of products and over one hundred wireframe screens across desktop and mobile, so knowing which user interactions required motion design was a task in itself. I needed a good planning tool where I could lay out screens and annotate why I thought each one did or didn't require motion design. The planning tool would also be used to prioritise the most important user interactions so I could allocate an appropriate amount of time to each task. Mark Seabridge (Head of UX at Tribal Worldwide Melbourne) suggested RealtimeBoard, which he described to me as "an online version of a whiteboard".

RealtimeBoard is very impressive.

I often see UX designers scattering post-it notes all over a whiteboard, or covering a wall, and then praying the cleaners don't mistake their work for trash and destroy it overnight. Don't get me wrong, I can see the benefits of putting ideas down on paper for group workshops, but a lot of time can end up wasted on scanning, sorting and taking photos with your iPhone. That's where RealtimeBoard comes in handy:

  • Digital post-it notes that you can easily resize no matter how much text you need to add to them (you can't do that with paper).
  • Lines and arrows connecting screens and ideas that can be changed or added to at any time.
  • The ability to save, share and collaborate on your ideas digitally at any time (perfect for busy teams that don't have the luxury of all huddling around your pile of post-it notes while you explain your ideas!)
 
 

Post-it notes can come in handy, like when you need to run a group workshop or when you want to make a Super Mario stop-motion animation. They can also be inefficient, not to mention not-so-friendly to the environment. So I generally try to avoid using post-it notes wherever possible and use RealtimeBoard instead.

 

 

Sorting interactions

I copied wireframes from Sketch into RealtimeBoard and began sorting them into key groups.

NOTE: There's now a RealtimeBoard plugin for Sketch that provides a quick and easy way to get images from Sketch straight onto the board. It would have been handy at the time!

To prioritise my work, I organised the screens vertically, with the most frequently encountered user interactions at the top and the least frequently encountered at the bottom. I then worked left to right, with the screens closest to the left being the highest priority and those closest to the right the lowest.


Here's a zoomed out view of the interactions I organised in order of priority.

After receiving feedback from the XD team, I began linking screens at the point of interaction and annotating possible animations/transitions.

 
 
RealtimeBoard user interactions with annotations

I really love the resizable post-it notes feature. You can see the full board here.

 

 

Finding a better way

Before I started animating, I spoke to the development team to see what would be the best way to supply the mockups to reduce confusion and rework. Here were some of the options:

  • Option 1: Use software that utilises JavaScript or HTML5 (Principle and Framer) to build animated mockups that (in theory) could be pulled apart by the dev. team and more easily integrated into the site build.
  • Option 2: Use video-based software (After Effects) to show snippets of animated interactions and provide the dev. team with the timings, easings and delays.
  • Option 3: Try to find a way of handing over animations that could be seamlessly integrated into the site build.

Trying to avoid frustrating the development team.

Designers can often be a source of frustration for the development team simply because we don't communicate as well as we should.

Due to the sheer scale of the site and the detail of the interactions, my challenge was to pave the way for a more efficient approach to handing over designed interactions to the development team. I ended up using a mix of options 2 and 3 (more on that below).

 

 

Bringing motion to life

By this stage in the project, the wireframes had been turned into high-fidelity mockups, and I was able to import these into After Effects to begin testing some of my ideas from my RealtimeBoard planning phase. I took each user interaction and created a short animated video imitating the interaction (like a screen recording). Then I got feedback from the XD team and, once approved, continued with the other interactions. The approach worked well, and I used our Forty Winks Slack channel to post the animated GIFs for feedback.

 

The focus here was on how the hamburger icon seamlessly and subtly transformed into the close icon.

Some additional thought had to go into the transformation from the search icon to the close icon.

When a user changes their search preferences, it's important that they can see their changes occurring gradually to avoid the confusion of a sudden drastic change to the page.


 

A subtle lift and shadow are applied to product thumbnails when the user hovers. This is inspired by the real-world experience whereby a person will lean in close to, or pick up, a product they're interested in.
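To give a sense of how this kind of interaction translates into a build, here's a minimal sketch of a hover lift using the Web Animations API. The selector, offsets and timings are hypothetical placeholders, not the values from the actual Forty Winks build.

    // Minimal sketch of a product-thumbnail hover lift (TypeScript, browser DOM).
    // The selector, offsets and timings are illustrative only.
    const LIFT: Keyframe[] = [
      { transform: 'translateY(0)', boxShadow: '0 1px 2px rgba(0, 0, 0, 0.15)' },
      { transform: 'translateY(-4px)', boxShadow: '0 8px 16px rgba(0, 0, 0, 0.2)' },
    ];
    const TIMING: KeyframeAnimationOptions = { duration: 200, easing: 'ease-out', fill: 'forwards' };

    document.querySelectorAll<HTMLElement>('.product-thumb').forEach((thumb) => {
      thumb.addEventListener('mouseenter', () => thumb.animate(LIFT, TIMING));
      thumb.addEventListener('mouseleave', () => thumb.animate([...LIFT].reverse(), TIMING));
    });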

 

 

The user can see changes occurring to the product listings based on their selection.

 

 

I took a playful approach with the favourite button; it was an opportunity to inject a lively element into the page.

 

 

The add to cart button needed different states to inform the user that their order had been added to the cart. The rotation of the loader shows the user that something is happening in the background, and the drop-down drawer emphasises that the product is now in the cart.
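As a rough sketch of the logic behind those states (the attribute hooks and endpoint are hypothetical, not the shipped code), the button simply moves from idle to loading to added while the request runs:

    // Hypothetical sketch of the add to cart button states described above.
    type CartButtonState = 'idle' | 'loading' | 'added';

    function setCartButtonState(button: HTMLButtonElement, state: CartButtonState): void {
      button.dataset.state = state;          // the data attribute drives the loader/drawer animations
      button.disabled = state === 'loading'; // prevent double submissions while the loader spins
    }

    async function addToCart(button: HTMLButtonElement, productId: string): Promise<void> {
      setCartButtonState(button, 'loading'); // rotating loader: something is happening in the background
      await fetch('/cart', { method: 'POST', body: JSON.stringify({ productId }) });
      setCartButtonState(button, 'added');   // triggers the drop-down drawer confirming the product is in the cart
    }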

 

 

Handing over to the devs

Now that I had created videos demonstrating the animations for each user interaction, I needed to give the dev. team something that could be incorporated seamlessly and efficiently into the website build. I decided to include parameter panels within the videos, detailing the timings, easings and delays that could be applied directly to assets in the site build. The feedback from the dev. team was that this info was extremely beneficial and made their jobs a lot easier.
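For illustration, the kind of information a parameter panel carries could be captured in a small spec object that a developer applies more or less directly. The field names and values below are hypothetical, not the actual handover format:

    // Hypothetical shape for the values shown in a parameter panel.
    interface MotionSpec {
      element: string;     // which asset the values apply to
      durationMs: number;  // how long the animation runs
      delayMs: number;     // wait before it starts
      easing: string;      // CSS timing function
    }

    const cartDrawerSpec: MotionSpec = {
      element: '.cart-drawer',
      durationMs: 300,
      delayMs: 100,
      easing: 'cubic-bezier(0.4, 0, 0.2, 1)',
    };

    // A developer can feed the spec straight into the Web Animations API.
    document.querySelector<HTMLElement>(cartDrawerSpec.element)?.animate(
      [{ transform: 'translateY(-100%)' }, { transform: 'translateY(0)' }],
      { duration: cartDrawerSpec.durationMs, delay: cartDrawerSpec.delayMs, easing: cartDrawerSpec.easing },
    );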

NOTE: There is now a tool for After Effects called Inspector Spacetime that injects motion specs into reference videos automatically. It would have saved me a lot of time, considering I had to do this manually.

The difficulties arose with the more intricate animations, e.g. the progress loader button. Fortunately, I'd come across a fantastic new plugin called Bodymovin' that exports animations from After Effects as usable code (in the form of a JSON file), making it an absolute breeze to build highly complex animated SVGs. If it weren't for this tool, I'd have needed to sit with one of the developers (for hours) and explain the timings, flow, feel and aesthetics of the loader, then watch them struggle to get it looking exactly the same.
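On the development side, the Bodymovin' JSON export is typically played back with the lottie-web library. The container id and file path below are placeholders, but the loadAnimation call is the standard way to render the export as an animated SVG:

    // Rendering the Bodymovin' JSON export as an animated SVG with lottie-web.
    // The container id and file path are placeholders for illustration.
    import lottie from 'lottie-web';

    const loader = lottie.loadAnimation({
      container: document.getElementById('add-to-cart-loader')!,
      renderer: 'svg',
      loop: true,
      autoplay: false,                  // only spin while the request is in flight
      path: '/animations/loader.json',  // the file exported from After Effects
    });

    // The play/stop calls can then be wired to the add to cart request lifecycle.
    loader.play();
    // ...once the request resolves:
    loader.stop();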

A developer will never have the time to recreate complex animations, nor will they have the patience to get it looking perfectly schmick.

It's just not their job, so it's up to us animators to make their jobs easier. Designers aren't expected to code (well), so we shouldn't expect developers to animate (well). That's why Bodymovin' was an absolute godsend. In my humble opinion, it's by far the most revolutionary plugin to be added to After Effects in recent times, and the possibilities are endless.

 

 

Measuring delight

Measuring delight is not an easy thing to do. Sometimes it can be clear-cut, like A/B testing an animated version of a website against a static version. But most of the time the delightful details are so small and subtle that it's a combination of animations adding to the user's delight at each touchpoint, rather than a singular moment of delight in one interaction.

 
 

This "delightful" little loader was built in After Effects and exported via Bodymovin'.

 
 

Perhaps the best way to measure delight with a digital product would be to engage in user testing/interviews and ask questions like: "How do you feel when you click on this button?" or, "What does this animation communicate to you?" This would provide some excellent insight into whether or not users experience "delight".