At Marketade, one of the first questions we ask a new client is “What data do you have that sheds light on the problem?” That data can come from many sources: analytics about site usage, conversion numbers, usability insights, and customer feedback. A successful redesign starts and ends with data that reveals opportunities and ensures the website or app meets users’ goals and expectations.

A few years back, a client came to us wanting to increase leads from their site, and they specifically asked for help with SEO. “Yes,” we said, “there are some SEO changes you should make. But the bigger opportunity we see is in improving the user experience and your conversion rate.” Their metrics told them WHAT was happening, so we suggested an approach to understand WHY:

  • Know your users by observing them using the site or app and talking to them about what they want to do and what they expect
  • Turn the insights from that user research into new designs
  • Use analytics to validate the new designs and prioritize the work most likely to radically improve the business metrics

We went to work implementing an iterative program:

To kick off our project, we first got buy-in from the project team to be active participants in gathering and understanding the data. The key to success was collaborating with stakeholders throughout the research. There is no substitute for being there!

Collaborative User Research

Internal interviews inform. One of the first things we did was interview some key customers, along with customer support staff who worked for the company. We quickly gained an understanding of their customers: their behaviors, pain points, questions, and concerns.

Personas are powerful. We identified two primary personas, along with two secondary personas, that we would refine over time as we talked to actual customers. We first worked with subject matter experts to flesh out the attributes of these personas, which gave us an understanding of the types of users we were targeting and informed the recruiting and participant profiles for the usability testing sessions.

Test. Test. Test. We did usability testing of the site with users who matched the primary personas. Stakeholders had an active role in the testing sessions, which was key. We have learned that when stakeholders actively participate by watching users try to perform tasks, they gain a much deeper understanding of the issues. The marketing director and CEO, along with their design and development team, all observed testing sessions and participated in defining the findings and recommendations.

Through this process, we gained a lot of clarity. In fact, over the course of multiple rounds of testing, the research team identified over fifty problems with the site and web app.

Web Analytics

Where do we start? To further understand the issues, we turned to Google Analytics to validate them and gauge the size of each problem. The data showed which areas of the site were getting the most engagement and which were not, and, combined with the testing insights, it let us refine a prioritized list of issues that everyone could agree on.
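
To make that concrete, here is the kind of back-of-the-envelope sizing that analytics data makes possible. Every number below is a hypothetical placeholder for illustration, not our client’s data:

  # Rough sizing of a usability issue from analytics numbers (Python).
  # All figures are hypothetical placeholders for illustration.

  monthly_sessions = 40_000     # sessions reaching the affected page
  drop_off_rate = 0.35          # share of those sessions abandoning at that step
  lead_conversion_rate = 0.04   # baseline visitor-to-lead rate
  avg_lead_value = 150          # dollars per lead

  lost_sessions = monthly_sessions * drop_off_rate
  lost_leads = lost_sessions * lead_conversion_rate
  lost_value = lost_leads * avg_lead_value

  print(f"~{lost_leads:.0f} leads/month (~${lost_value:,.0f}) at stake")

A fix that recovers even a fraction of that drop-off is easy to defend in a prioritization meeting.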

Updated Design and Content

Opportunity knocks. We collaborated as a team to identify initial solutions for most of these problems through ideation work sessions. To help prioritize the work, we gave each solution an “Opportunity Score” of 1 to 10, which combined separate sub-scores for Impact and Effort.
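
The exact formula matters less than scoring every idea the same way. As a sketch (the equal weighting here is an assumption for illustration, not a fixed formula), here is one simple way to combine 1-to-10 Impact and Effort sub-scores into a single 1-to-10 Opportunity Score:

  def opportunity_score(impact: int, effort: int) -> float:
      """Combine 1-10 Impact and Effort sub-scores into a 1-10 score.
      Higher impact raises the score; higher effort lowers it."""
      assert 1 <= impact <= 10 and 1 <= effort <= 10
      return (impact + (11 - effort)) / 2

  # Hypothetical backlog items, scored and sorted best-first
  backlog = {
      "Simplify the signup form": opportunity_score(9, 4),  # 8.0
      "Rewrite the pricing page": opportunity_score(6, 7),  # 5.0
      "Redesign the dashboard": opportunity_score(8, 9),    # 5.0
  }
  for item, score in sorted(backlog.items(), key=lambda kv: -kv[1]):
      print(f"{score:>4.1f}  {item}")

High-impact, low-effort items float to the top, which is exactly where the quick wins live.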

New design, new content. For the areas with high priority scores, we iterated design and content. These included the most ambitious changes, as well as some that we hoped would be quick wins. But we needed user feedback on the proposed designs before we knew for sure they would be good solutions.

A/B Testing

How to test without IT mutiny. Asking already-busy developers to build every candidate solution is not the best use of resources. So we ran our A/B tests with Optimizely, an online experimentation platform. This let us bypass the developers and get the data we needed on the proposed designs. We were able to test a couple of concepts every few weeks before finalizing designs, and we were confident that the changes we handed to the dev team would deliver significant improvements.
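
Optimizely reports statistical significance for you, but the comparison behind that verdict is simple enough to sketch. Here is a basic two-proportion z-test in Python on hypothetical conversion counts (the visitor and conversion numbers are made up for illustration):

  import math

  def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
      """Two-sided z-test: does variant B's conversion rate differ from A's?"""
      p_a, p_b = conv_a / n_a, conv_b / n_b
      p_pool = (conv_a + conv_b) / (n_a + n_b)        # pooled rate under the null
      se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
      z = (p_b - p_a) / se
      p_value = math.erfc(abs(z) / math.sqrt(2))      # two-sided p-value
      return p_a, p_b, z, p_value

  # Hypothetical test: 5,000 visitors per variant
  p_a, p_b, z, p = two_proportion_z_test(200, 5000, 255, 5000)
  print(f"A: {p_a:.1%}  B: {p_b:.1%}  z = {z:.2f}  p = {p:.3f}")

With a p-value well under 0.05, you can hand the winning design to the dev team with real confidence rather than a hunch.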

Do not forget the users. Though it’s exciting to launch A/B tests and get quantitative results, you can’t drop the qualitative user research. Observing users interacting with the new designs offers three opportunities:

  1. Learn why a new design is failing.
  2. See ways that a winning new design could be tweaked to perform even better.
  3. Uncover a completely new solution.

And the cycle continues. Ah, this is what they mean by “data-driven design.” Data from both A/B testing and prototype UX testing gave the team a new idea, and this idea was the real game-changer! It led to a design change that really moved the KPIs in the right direction.

Contact us to help drive change with data-driven design.