Ongoing Research Case Study:

How a Mixed-Methods Research Program Tripled Leads in 1 Year and Won Over Executives

A 4-step qual/quant research process became a habit and a success story across a 4,000-person company. A once-skeptical VP became the company’s cheerleader for user research, helping to spread user-centered design to new business units. Along the way, a team increased leads by nearly 2000%.

Challenge: Skepticism of Small Sample Sizes

Kimberly kept surprising us.

She was using an expert marketplace website, trying to find the right specialist based on her needs. The bare-bones homepage had one call to action: a large search box followed by a button, with some supporting text.

Kimberly ignored it and clicked the navigation at the top. After a few minutes, she scrolled down another page, found an advanced search option, and used the last of 4 possible filters — “expert location” — to start her search.

“Let’s see who they have near me,” she thought aloud.

“Huh,” muttered one of the stakeholders in the virtual meeting.

At first, it was easy to write off Kimberly as an outlier. That is, until User #3 started. He behaved just like Kimberly, quickly finding the advanced search and filtering by location.

User #6 followed the same pattern.

In the end, half of our 6 participants used the marketplace in the same, surprising way: mostly ignoring the default path and using what seemed like a workaround.

* * *

A year earlier, product owners had designed the site based on assumptions about what matters to target users, including:

  • When searching for this type of expert, users care most about the expert’s specialization.
  • Since they are unlikely to visit the expert in person, most users do not care about the expert’s location.

Now, as we analyzed and discussed the research, team members were saying things like “we were wrong about our users” and “we need to change how this works.”

“Not so fast,” said the marketing VP, who had been popping in and out of the meeting:

“This was only 3 people from a sample of 6 — in a research study. Are we really going to change the site based on this? I’d like to see some bigger numbers …”

Get Tips on Recurring UX Research

Want to build a scalable program of continuous user research? Sign up to receive a new case study each month featuring our best work and ideas in ongoing research.

 

Approach: Marry Qual & Quant

For research experts who understand the power of in-depth testing with small numbers, it’s easy to be annoyed at stakeholder skepticism around qualitative data — and at requests for “proof” or “bigger data.”

Over time, however, we have learned to embrace these requests and treat them as opportunities to win over new teams.

In the case of the marketplace site, we started by asking questions to better understand what was driving the marketing VP’s skepticism.

We learned about his bosses, their concerns, and how he shares information with them. We learned about past failures and how much was at stake for him. We learned about what it takes for him to secure additional funding for site redesigns.

After we listened and explored some options, we said:

“We get it. We’ll look at analytics data and see how it compares to what we just watched — and get back to you.”

We grabbed a member of the project team with strong Google Analytics experience and paired her with the research lead on this project. They spent an afternoon digging into the data and creating some custom reports.

The next day, we had our answer: 3 times as many users were searching by expert location as by specialty.

Graph of search behavior, from Google Analytics data
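The article doesn’t show the custom reports themselves. As a rough sketch of the kind of comparison the pair ran — with invented event records and field names, since the real Google Analytics reports aren’t reproduced here — counting filter usage might look like:

```python
# Hypothetical sketch: quantifying the qualitative finding with
# analytics event data. The event records and the "search_filter"
# field are invented for illustration, not the team's actual reports.
from collections import Counter

def filter_usage(events):
    """Count how many search events used each filter type."""
    return Counter(e["search_filter"] for e in events)

# Toy sample standing in for tens of thousands of real sessions.
events = [
    {"search_filter": "expert_location"},
    {"search_filter": "expert_location"},
    {"search_filter": "expert_location"},
    {"search_filter": "specialization"},
]
counts = filter_usage(events)
print(counts["expert_location"] / counts["specialization"])  # 3x more location searches
```

In practice the heavy lifting happened inside GA’s custom reports; the point is simply that a qualitative pattern (“people filter by location”) maps onto a countable event ratio.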

Far from an anomaly, Kimberly behaved just like most people on the site. We now had data from tens of thousands of sessions to prove it.

We shared this quantification of the usability finding with the workshop team. The skeptical VP was excited, partly by the specific finding but even more by what the exercise showed: that we could use real-world analytics data to validate messy “lab” research findings.

Result: Short-Term Lift, Long-Term Transformation

A few weeks later, the team sketched and implemented a simple change to the site’s homepage.

Based on the qual and quant research, they hypothesized that adding a location search option to the existing specialization search option would increase leads.

They were right: in an A/B split test, the new homepage converted visitors to leads 50% better than the original. It seemed like a small change — 2 CTAs instead of 1 — but it had a huge impact on the conversion rate. The team quickly made this the new default design, and monthly KPIs jumped.
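The article reports only the 50% relative lift, not the underlying traffic. As a hedged illustration with invented visitor and lead counts, a standard two-proportion z-test is one way a team could check that such a lift isn’t noise before making the variant the default:

```python
# Hypothetical sketch: significance check for an A/B conversion lift.
# Visitor and lead counts below are invented; the article reports
# only the 50% relative improvement.
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided two-proportion z-test; returns (z, p_value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p = (conv_a + conv_b) / (n_a + n_b)           # pooled conversion rate
    se = sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))  # pooled standard error
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# e.g. 10,000 visitors per variant; 200 vs. 300 leads (a 50% lift)
z, p = two_proportion_z(200, 10_000, 300, 10_000)
print(f"z={z:.2f}, p={p:.4f}")
```

With samples this large, a 2% vs. 3% conversion rate is decisive; with only a few hundred visitors per variant, the same relative lift could easily be chance.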

The VP was thrilled. He now could see a complete story of how qualitative and quantitative research can work together to drive UX improvements and business results.

  • Qualitative research uncovers an insight;
  • Analytics data quantifies the insight;
  • The quantified insight inspires a design change;
  • A/B testing validates the design change.
 

Had the research team brushed off the VP’s request for bigger numbers, they would have forfeited that short-term lift. The long-term loss would have been much greater.

What nobody knew at the time was that this one project and story would set in motion a series of events that led to an organizational transformation. It went like this:

  • The marketing VP shared the story and early results with his bosses and executive peers, paving the way for more UX investment in the marketplace.
  • The team turned the 4-step qual/quant process into a habit, developing a recurring research program.
  • The VP rallied the team around the mantra “evolution not revolution.” Their model: place small bets based on research, expand the winners, toss the losers, and repeat.
  • Within a year, marketplace leads tripled. A year later, they doubled. Two years later, they doubled again. Over the course of 7 years, leads increased by nearly 2000%.

Graph of annual leads
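The growth figures compound: tripling is a +200% change, and each later doubling multiplies the running total. A quick check of the arithmetic (the roughly 20x multiple behind “nearly 2000%” is inferred from the percentage, not stated directly):

```python
# Checking the compounding arithmetic in the growth figures above.
def pct_increase(multiplier):
    """Convert a growth multiple into a percent increase."""
    return (multiplier - 1) * 100

year1 = 3            # leads tripled within a year
year2 = year1 * 2    # doubled the next year -> 6x baseline
year4 = year2 * 2    # doubled again two years later -> 12x baseline
print(pct_increase(year1))  # 200 (tripling = +200%)
print(pct_increase(year4))  # 1100
print(pct_increase(20))     # 1900 -> "nearly 2000%" over 7 years
```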

The product and process became a huge success story across the company. The marketing VP became the company’s most vocal cheerleader for qualitative user research, helping to spread user-centered design to new business units.

In one meeting, executives from a different business unit invited the VP and asked for his advice as they prepared to launch a new product. As they peppered him with questions, he returned to the same response, eventually saying:

“You just have to watch your customers use the product. I don’t know what else to tell you. You will be blown away.”

Lesson: Embracing Requests for Bigger Data

Every week or two, we facilitate some kind of research workshop with a client team. While the details are always different, the starting point is the same: the team spends 1-3 hours watching in-depth qualitative user research.

Often there are qual newbies in the workshop — developers or product managers or marketers who have never watched a moderated usability test or a user interview. Over and over, we see this immersive process erase skepticism around small-scale research. After the workshop, we’ll hear things like “I didn’t think that 6 users would be enough to see clear patterns, but yeah, it was — I get it now.”

When it comes to convincing stakeholders that 5 or 10 participants are enough, there’s no substitute for getting them to sit down and watch a few full sessions.

 
 
And yet, executives remain the toughest sell. It’s much harder to get them in the workshop in the first place. When we do, it’s hard to get them to stay put for a morning. And even then, there’s often a remaining hurdle.

As with the marketing VP in our story above, many executives are surrounded by big-data metrics and dashboards every day; their bosses give them KPIs or OKRs and challenge their resource requests by asking for data.

Even when they watch mind-blowing qual research, executives often need quantitative backing before they sign off on design changes or UX initiatives.

For user researchers, this can be frustrating. But if we view these requests as an opportunity — to partner with new colleagues, to marry lab findings and real-world data, to win over critical stakeholders — the payoff can be huge. The once-skeptical executive might become the company’s most influential champion for user research, allowing it to scale in ways we never imagined.

 
