User research

With any product, an important — if not the most important — thing to consider is the users: the people who will actually use your website, product or service. When I was first introduced to the concept for Mosaic, it was exciting, but it posed two big challenges from a user experience perspective.

First, it was an idea that the team had come up with. It wasn’t based on prior research or on insights into audience needs gathered from any of our existing (Wellcome Trust) digital products. It was brand spanking new.

Right at the beginning of the project, I wanted to find out if there was an audience for the product that was being proposed. Working with the research agency System Concepts, I started to look at who the potential audiences for long-form, non-news articles about science topics would be.

We recruited some participants based on their media and reading habits, and approached members of the public who were out and about in venues (such as the Science Museum in London) that we thought would attract the sorts of people likely to be interested in Mosaic-like content. We used a variety of methods, including diary studies and interviews, to get an understanding of how interested people would be in Mosaic, and as a result we established some primary audience groups for consideration. When we analysed the results, these potential audiences narrowed down roughly into three groups:

  • People who have an enduring interest in and enjoyment of science, perhaps informed by studying science subjects at A-level or degree level, but who weren’t science professionals;
  • Academics, researchers and those working in a science field;
  • People who were curious about specific topics, perhaps a particular condition, due to personal contact with someone affected.

The next step was to find out more about these groups and how they read articles online. Do they? Are they likely to spend time reading an in-depth article, and if so, how and where would they go about it? Would they set aside a block of time to do it, or would it be more casual? Would they read on a desktop computer, or would they be more likely to use a tablet or a smartphone? Were they interested in extra information about an article? Was engaging in conversation with their fellow readers something that appealed, and if so to what extent? System Concepts recruited a set of participants based on the more detailed criteria from our initial research, setting up in-depth diary studies and group discussion sessions. We used these to explore the behaviour of the groups and how relevant Mosaic would be for them.

Our key finding was that it would be very difficult to create a product that worked for all three groups. At the Wellcome Trust we already aim a great deal of our content at people who are well established in the world of biomedical science. The team decided that Mosaic was an opportunity to branch out and offer exceptionally good quality science writing to people who were outside this group. Our research findings showed that if we created a product for people who had an enduring interest in science, it should also appeal to those who were driven by an interest in a specific topic, so we decided to aim Mosaic at this first group as a priority.

Taking all the findings from the research, I was able to build up a picture of the users we expected Mosaic to appeal to and the features we should include to meet their needs. All of this was shared with the project team, initially in the form of personas as well as reports and presentations. However, I was wary of getting too bogged down in documentation, which brings me to the second challenge of the Mosaic project.

The project team were proposing that the development of Mosaic, including its proof-of-concept prototype, would be run using an Agile project management methodology. This is a great way to create a product, but it can sometimes be difficult to fit user experience into such a rapid, iterative approach.

Thankfully, the upfront research we’d done helped mitigate one of the common issues development teams face when attempting to integrate user experience into the Agile process: finding time for user research. As we’d already done this before development started, I didn’t need to carve out time during the design and development stage, so I didn’t hold up the rest of the team. I already had an accurate picture of our initial users and was able to share this verbally at any point. I was basically a walking, talking repository of research information.

I shared a lot of this upfront with the team before we started working on the structure of the site. It helped us shape our responsive approach, create the content model and think about how users were most likely to travel through the site.

This information was also useful in managing ‘scope creep’. While there were lots of interesting ideas arising throughout the development of the project, our knowledge of the users — what they really wanted and how they were likely to behave — helped us concentrate on creating a high quality, simple product (a minimum viable product in Agile parlance) and park some of the other concepts for investigation later.

I wanted to make sure at this point that we didn’t get too complacent about the product we were proposing, especially as this was the first time any of us had created a product where users were likely to read long-form content on handheld devices.

Working with participant recruiter Criteria, I set up a small usability testing panel consisting of five potential users selected to be very close matches to Mosaic’s personas. These people came into our offices throughout our first prototyping phase and we tested all elements of the Mosaic site with them as we created it. This helped us refine features, understand interactions, see how people responded to various concepts and get a good grasp of how people were going to use our product. We repeated this for our second phase of prototyping.

It was a really interesting approach. The panel built up a great rapport with the development team, and discussing the actual product with them, and watching their behaviour as they used it, was incredibly valuable. However, as is so often the way with usability testing, after a while each panel became very familiar with our goals and started to say what they thought we wanted to hear rather than behave naturally. So while I definitely recommend this repeat-panel approach, I’d do it in short bursts with lots of varied tasks, and change panel members as regularly as you can afford to.

A great deal of effort has gone into trying to understand the users for Mosaic and building the product around their behaviour and needs. However, until Mosaic launches, all of our findings are simply highly detailed hypotheses; we won’t know how our users will behave until they’re actually using it. I’m excited for Mosaic to be unleashed so that I can test these hypotheses and spend time observing how real users use the real product, allowing us to continue to improve it.

Nancy Willacy

Nancy Willacy heads up the User Experience team at the Wellcome Trust.