Challenge: Differing Opinions on How to Improve Designs

80% of Alzheimer’s research studies are delayed because too few people sign up to participate. The Banner Alzheimer’s Institute has been on a mission to change this equation. Their recruiting goals are particularly challenging because the most promising Alzheimer’s studies are “prevention studies” that need participants with certain types of genes.

Over the last few years, the Banner team has come a long way. They launched the Alzheimer’s Prevention Registry and signed up over 270,000 members. They developed partnerships with top Alzheimer’s research organizations. And they built a Find a Study portal to connect Registry members with Alzheimer’s research studies across the U.S. and online.

Find a Study portal

The Banner team wanted to see more members engaging with this portal and connecting with study sites. But they, along with their design and development agencies, were struggling to agree on which parts of Find a Study required improvement, and on how to improve them. Fortunately, they had a tool to guide them toward consensus.

Solution: Regular Collaborative User Research

In early 2016, Marketade worked with Banner and its design agency, Provoc, to implement a program of quarterly user research studies. The program was built around 2 premises:

  • A deep understanding of users and their problems, gained through observation and interviews, is the best foundation for improving the customer experience.
  • All stakeholders on the team (including designers, developers, product owners, management) must observe the research and participate in the process of identifying findings and opportunities.

The centerpiece of the program was what we’ve called the “participatory research session”: a 2-hour collaborative workshop where stakeholders review the research and then go through a process to reach consensus on problems and opportunities.

This approach stands in sharp contrast to the “report-driven approach” used for most UX research — where the researcher conducts and analyzes the research sessions, and then presents findings and solutions to the team in a report or PowerPoint, perhaps with some quotes from users or video clips.

The report-driven approach is better than nothing, and can produce big gains if no research has been done before. But over and over again, we’ve seen the team-based approach drive the biggest ROI, especially when built into a recurring process. That’s because seeing is believing. There’s nothing like directly watching users interact with a product and struggle to achieve their goals — especially after you’ve heard those goals in their own words and learned why each person is such a great fit for your product. Reading about their struggles and seeing highlights doesn’t compare to extended observation of their full experience.

Here’s how the process works for each quarterly study:

  • We meet with stakeholders to review upcoming design projects and decide on a topic for the study.
  • We team up with Banner to recruit representative participants for the study. The Registry has an engaged email list and monthly newsletter; we used those channels, along with its social media accounts, to build a user research panel of more than 125 potential participants. For a typical study, we recruit 5 participants who meet our criteria.
  • We schedule and conduct 1:1 research sessions with participants. Most often these are a combination of open-ended user interviews and usability tests.
  • We condense the full research video footage – typically 2 to 3 hours’ worth – down to 1 hour.
  • Finally, we run the collaborative session with the extended team. We spend the 1st hour watching the research. In the 2nd hour, we run an affinity mapping session to reach consensus on findings and to identify initial solution directions.

For affinity mapping, we use a modified version of the KJ Method, as popularized by Jared Spool. Since the team includes people in Arizona, Oregon, DC and elsewhere, our approach is remote and uses collaborative mapping software like Stormboard or GroupMap.

Before showing the research, we give team members a focus question — such as “What are the biggest barriers to finding and participating in Alzheimer’s research studies?” — along with instructions on things to look for and the types of findings to capture.

In tightly timed segments over the next 30 minutes or so, team members 1) post their findings individually, then 2) group them, then 3) label the groups, and then 4) vote on top findings. Nearly all of this is done in silence to ensure all voices are heard, avoid groupthink and political debates, and keep things moving quickly.

By the end of this phase, we typically see 3 barriers or findings emerge as the clear favorites. We capture all of the other findings and return to many of them in the future. But unlike typical meetings, we don’t waste precious time in this session debating peripheral problems or pet projects. The process keeps us focused and marching toward consensus on the biggest issues.

With the top 3 barriers confirmed, we ask people to brainstorm solution ideas individually for a few minutes, and then post their favorite ideas on the board under the appropriate barrier. If we have time, we do some quick voting on these ideas. Then we briefly summarize what we’ve accomplished, touch on next steps, and call it a day.

Result: Less Time Debating, More Time Solving Problems

In the year since we started working with Banner and Provoc to run these collaborative studies, the team has been able to reach consensus on strategy and design for a variety of initiatives and concepts. These include a dashboard for study site researchers to manage their leads and participants, and a dashboard for Registry members to track potential studies.

The Find a Study section is about to be relaunched with a new design focused on solving the 3 barriers identified in last year’s collaborative session. Prior to that research session, the list of issues to tackle with Find a Study felt endless. Watching users go through the search experience allowed the group to quickly reach consensus on the biggest problems worth solving.