Financial Services Case Study:

How Holistic User Research Helped Remove Self-Service Barriers

Facing silos and a “stuck” KPI, a company embraces collaborative research, mixed methods, and rapid experimentation — and turns executive scrutiny into praise.

Challenge: Low Online Adoption for a Key Service

A Fortune 500 company with over 10 million customers was making steady progress on a key business goal: getting more customers to use its digital self-service channels rather than phone into its call centers. For years, across numerous product and service lines and transaction types (sales, support, etc.), the company had consistently reduced costs while improving the customer experience by making it easier to transact online.

And yet, for one critical service line, the online transaction share was stuck. Only 1 out of 10 customers completed the transaction online. Despite plenty of focus and investment — and data showing that many customers preferred not to call a support center — this service team struggled to move the needle.

The online transaction share was stuck at 10%.

Their lack of progress drew the attention of executives. As the company’s customer base grew, having 90% of these transactions handled by phone threatened to overwhelm support center staffing and drive up costs. Management also feared that customer retention and profitability were suffering as the company disappointed customers who wanted to transact digitally.

Approach: Collaborative Research & Iterative Testing

The company partnered with Marketade to conduct a mixed-methods, “team sport” research project. The project aimed to:

  • Give a diverse set of teams a deep understanding of this service’s customer experience and customer mindset.
  • Identify barriers to customers using the self-service channel, including: gaps between product positioning and customer perceptions; confusion around the product and how it works; and reasons that customers call rather than transact online.
  • Generate viable initiatives to combat these barriers and increase self-service adoption for this product line.
  • Test and launch the most promising solution ideas, and measure the impact.

Here are the key steps we followed:

To gain a wide and deep understanding of the service line’s challenges, we combined qualitative and quantitative methods in a multifaceted research initiative focused on the current state of the customer experience. Working with company stakeholders across multiple departments, we conducted the following research:

  • Interviews with 10 stakeholders representing a diverse set of teams across the organization (IT, business, marketing, etc.)
  • Review of existing user research
  • Analysis of 4 competitor products and their customer experience
  • Review of analytics data for digital transaction flows (mobile app and web) and phone vs. self-service results
  • 1-day field visit to a call center, observing/interviewing 15 phone reps
  • One-on-one interviews with 20 customers, many of whom had recently used the product
  • Usability testing with 12 customers, to identify friction points in digital flows
  • Card sorting study with 30 participants, to understand what broader product areas people associated with this service
  • Survey of 200 current customers, including a subset who had recently transacted by phone for this product

Based on our research, we identified 3 broad problem areas that were hurting self-service adoption.

Within each problem area, our researchers documented a number of specific barriers to online adoption. Yet we knew from experience that if we just presented these barriers in a report, our research would fail to drive action and results. So we kept the specific findings in our back pocket and turned to what we know works: team sport analysis and ideation.

We facilitated a virtual workshop with key stakeholders across 5 different teams. We structured the workshop as follows:

  • We outlined the 3 broad problem areas we’d identified in our current state research.
  • Stakeholders listened to in-depth excerpts of the research interviews and took structured notes that captured specific self-service barriers.
  • We guided the team through an affinity mapping exercise to reach consensus on the biggest barriers and opportunities. Stakeholders added their findings as notes to the virtual whiteboard, grouped similar findings together, and gave each group a name. They then voted to identify the 3 biggest barriers.
  • The team brainstormed solutions to the top 3 barriers, then voted on those solutions.

Prior to this project, these 5 teams had worked in silos as they tried to improve KPIs for this service. Yes, they shared their tactics and results by email and in meeting presentations. But they lacked a coordinated effort built on a shared understanding of problems and opportunities.

This workshop was the critical step in shifting from silos to alignment. By bringing everyone into the same (virtual) room to observe the same user research and to analyze and brainstorm together, the workshop magically got the team on the same page.

The workshop magically got the team on the same page.

It also motivated the team to act. At one point during the workshop, the VP of the service line leaned over to the UX team manager and said: “This is probably one of the best WebEx sessions I’ve ever sat through — very interactive and it’s keeping me engaged!”

Based on the alignment and excitement generated by the workshop, it was tempting to move straight into solution design and implementation. But we weren’t quite ready; the ideas generated in the workshop were broad, rough, and full of untested assumptions. To run with the solution ideas that had the best chance of moving the “stuck” KPI, we needed a few additional research outputs.

Based on the current state research and the workshop, we created a set of customer personas for this service. Each persona focused on a specific problem area identified in the collaborative analysis.

Next, we took the most important persona and mapped their existing customer journey, chronologically plotting the barriers the team had previously prioritized. This mapping exercise helped the team:

  • Visualize the most important obstacles in the context of a real customer’s experience with the service;
  • Appreciate how barriers that are small in isolation can combine to create a very poor experience over the entire journey;
  • Focus on a previously ignored gap that existed early in the journey: the lack of customer awareness that this service transaction could be completed online.

Again drawing from the research and the collaborative ideation process, we then created an ideal customer journey. In this exercise, we turned the barriers and frustration points into solutions and areas of delight. The visual demonstration of where we could gain the most by turning frustration into delight helped us to focus and prioritize our solution exploration.

Finally, using the lens of personas and journey maps, we conducted a content audit across a wide range of channels and digital properties. We knew from our previous steps that content, whether in the form of a marketing campaign, a call center script, or mobile app microcopy, was a major contributor to existing barriers.

Tie a content audit’s findings to personas and journey maps — which in turn are grounded in research and team collaboration — and you create a formula for action.

The journey maps helped the team to see how many channels a typical customer touches for this service — and how turning content inconsistencies into a cohesive experience could transform user uncertainty into confidence. Our audit also spotted many instances of confusing or missing content that were driving customers to call.

Too often, personas, journey maps, and content audits sit on stakeholders’ shelves and fail to drive change. By combining the tools and tying the content audit’s findings to personas and journey maps — which in turn are grounded in research and team collaboration — we created a formula for action.

We were now ready to regroup with the teams, design specific solutions, and run experiments.

At this point in the project, we had:

  • Conducted in-depth current state research, using a wide range of methods to interview/observe a diverse group of audiences;
  • Gathered stakeholders to observe and analyze the research — and prioritize the top problems and opportunities as a team;
  • Used the top problems/opportunities to generate personas and journey maps — which in turn provided visual, real-world context for a content audit.

By the end of stage 3, the team had generated plenty of solution ideas that they were excited to explore, design, and develop. And critically, they had a robust framework to pick the best of the best ideas — the ones that research indicated would give the most bang for the buck.

Over a couple of months, we coordinated 8 experiments across the following channels and products: Email, Website, Customer Service, Sales, and Mobile App.

Experiments included content and design changes to the following:

  • Confirmation emails
  • Customer login pages
  • Mobile app flows
  • Website product marketing pages

For many of these changes, we first tested the new copy and/or design through prototype UX testing with a small number of participants. We iterated based on what we learned before launching the new design. Where possible, we worked with the company’s metrics team to run an A/B test.
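
As an illustration only, here is a minimal sketch in Python, using entirely hypothetical numbers not drawn from this project, of the kind of two-proportion significance check a metrics team might run when comparing online completion rates between a control and a variant.

    # Illustrative sketch only: the figures below are hypothetical, not project data.
    # Two-proportion z-test comparing online completion rates for a control and a variant.
    from math import sqrt
    from statistics import NormalDist

    def two_proportion_z_test(control_successes, control_total,
                              variant_successes, variant_total):
        """Return (absolute lift, z statistic, two-sided p-value)."""
        p_control = control_successes / control_total
        p_variant = variant_successes / variant_total
        pooled = (control_successes + variant_successes) / (control_total + variant_total)
        se = sqrt(pooled * (1 - pooled) * (1 / control_total + 1 / variant_total))
        z = (p_variant - p_control) / se
        p_value = 2 * (1 - NormalDist().cdf(abs(z)))
        return p_variant - p_control, z, p_value

    # Hypothetical example: 10% vs. 12% online completion, 5,000 visitors per group.
    lift, z, p = two_proportion_z_test(500, 5000, 600, 5000)
    print(f"lift={lift:.1%}, z={z:.2f}, p={p:.4f}")

In practice, the company’s metrics team would have used its own tooling and thresholds; the point is simply that even a small copy or design change can be evaluated with a lightweight statistical check before a broad rollout.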

A researcher conducts an in-person qualitative observation session with a participant.

Result: Measurable Impact, Team Alignment & Executive Excitement

The experiments produced exciting wins for the organization. Of the 8 tests, 5 showed meaningful improvement in online adoption.

One of the results was particularly exciting. Based directly on what we’d observed in our current state research and what we documented in the personas/journeys/content audit, we spotted an opportunity to add one line of copy to one page of the mobile app. It was a point where analytics data showed lots of users were switching from online to phone, and where qualitative research told us why. The new copy we wrote directly addressed the uncertainty we’d uncovered in the research, and produced a notable shift in the percentage of users completing the transaction online.

One line of copy was all it took to cause a big lift.

In combination, the content, design, and service changes helped ignite a significant increase in the percentage of customers completing this service transaction online.

Online adoption: 4 quarters of flat performance followed by 4 quarters of rapid growth

The project also laid the foundation for long-term improvements to the customer experience with this service. In particular, the analysis workshop and follow-up sessions brought a diverse team together and, through the user research, gave them common ground for discussions.

To empower these teams to continue innovation, iteration, and experimentation, we built content guides, led interactive copywriting training, and created call center talking points.

A few months later, online adoption for this service was back in the spotlight with the company’s executives. This time, however, the attention was welcome. Senior management joined a meeting to hear the positive results and to learn about the research-based process behind them. As the meeting was ending, one of the execs thanked the core project team and said, “This is exactly what I was hoping for.”

About the Project

  • Industry: Financial Services
  • Platform: Mobile app and web applications
  • Audience type: Consumers
  • Methods: Interviews, user testing, card sort, journey mapping
  • Stakeholder teams: Product, design
  • Organization size: Over 20,000 employees

More Case Studies

 

How 1-800-PACK-RAT Used Journey Maps to Start a Customer Experience Transformation

A top moving company interviews customers to understand their journeys — and collaborates to identify big innovations and small wins.

Home Buyer Journey Research for PenFed Credit Union

To guide a new PenFed initiative, Marketade conducted in-depth 1:1 interviews with 10 recent home buyers. We then defined 11 key home-buying journey steps and identified pain points and highlights.