Challenge: “This is Not What I Expected”

For the first time in 10 minutes, the bulldog lover in our usability test was speechless.

She’d been browsing the PetSmart Charities website, learning about the non-profit and its efforts to end pet homelessness. She was impressed by the stats, and moved by the stories of pet lovers adopting dogs and cats. And she was thinking aloud as she went.

Then she saw a signup box for an email newsletter and proceeded to enter her email address and click “Submit”. Up came this screen:

After a long pause, she said, “This is not what I expected. It’s asking for my address? Why do they need that?” She went on to complete the form, with various hesitations, questions, and frustrations along the way.


This session was part of a quarterly user research and testing cycle that we helped PetSmart Charities implement. Each quarter, we worked with them to conduct usability testing in parallel with Google Analytics analysis, focusing on one application flow or content area at a time. We then shared and discussed the findings, along with KPI results, with their “Website Committee” — a cross-section of stakeholders from communications, IT and other teams — at the end of the quarter. Where possible, we’d propose an A/B test for the next quarter based on the research, then report on its results at the next meeting.

Unlike many organizations that start a process like this and abandon it after a couple of rounds, the PetSmart Charities team stuck with it quarter after quarter, no matter how busy they were with redesigns or other projects. The following is just one example of how their discipline paid off.


The bulldog lover wasn’t the only participant in that quarter’s usability study to be confused and frustrated by the newsletter signup process. We quickly saw a pattern across the pet lovers in our study. Most frustrated were the users on their phones, who had to work that much harder to answer all of the form’s questions.

We jumped over to Google Analytics. Was our study group representative of the larger population of PetSmart Charities visitors when it came to signing up for the newsletter? Yes, said the quantitative data: three out of four people who started the newsletter signup process abandoned it before completing the flow. And many of those people were also leaving the site at that point.
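For concreteness, here is the arithmetic behind that finding. The counts below are hypothetical stand-ins, not PetSmart Charities’ actual analytics data; only the 75% rate matches the result described above.

```python
# Hypothetical Google Analytics-style funnel counts; the real numbers
# came from the site's analytics, these just illustrate the math.
signup_started = 1000    # visitors who submitted an email address
signup_completed = 250   # visitors who finished the full signup form

abandonment_rate = (signup_started - signup_completed) / signup_started
print(f"Abandonment rate: {abandonment_rate:.0%}")  # prints "Abandonment rate: 75%"
```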

Next stop was the team at PetSmart Charities. “What happens on the backend when someone does this?” we asked, showing them video of a usability test participant submitting their email address and then stopping on the “Signup” page without completing the process. “Do they still get added to your newsletter list?”

“Hmmm … we don’t know. We’ll find out,” they responded. And off they went to the fundraising team that manages the email list.

Days later, they had the answer. None of the “signup page abandoners” were getting captured, representing a major lost opportunity for engaging hand-raisers and bringing them back to the site to donate, find adoptable pets, and more.

Solution: A/B Testing of Simplified Form

Usability testing had not only exposed a problem; it had also pointed us toward a solution.

By hearing representative users think out loud as they interacted with the site, we had some clear ideas for how to overhaul the form and provide a better newsletter signup experience.

The PetSmart Charities team told us which pieces of information were must-haves. With that, we went to work and proposed the changes below.

After some back-and-forth with PetSmart and their design agency, Provoc, we arrived at a new Signup page:

Using Google’s A/B testing tool, we then launched an experiment that sent 50% of newsletter hand-raisers to the original page and 50% to the new page.
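Google’s tool handled the traffic split for us, but the underlying mechanic is simple enough to sketch. The function below is a minimal illustration of deterministic 50/50 bucketing — it is not how Google implements its splitting, and the experiment name and visitor IDs are hypothetical.

```python
import hashlib

def assign_variant(visitor_id: str, experiment: str = "newsletter-signup") -> str:
    """Deterministically bucket a visitor into one of two variants.

    Hashing the visitor ID (rather than flipping a coin on every page load)
    keeps a returning visitor in the same bucket for the whole experiment.
    """
    digest = hashlib.sha256(f"{experiment}:{visitor_id}".encode()).hexdigest()
    return "original" if int(digest, 16) % 2 == 0 else "simplified"

# A given visitor always lands in the same bucket:
assert assign_variant("visitor-123") == assign_variant("visitor-123")
```

Because the hash is effectively uniform, the population splits roughly in half, while any individual visitor sees a consistent experience.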

Result: Completion Increase from 27% to 71%

Over the course of the experiment, the original page converted 27% of Signup flow visitors, in line with historical analytics data. The new page converted 71% of visitors — representing a 167% increase in the newsletter signup completion rate, with 100% confidence.
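The confidence figure came from Google’s testing tool, which uses its own statistical model. As an independent sanity check, a standard two-proportion z-test reaches the same verdict. The per-variant sample sizes below are hypothetical; only the 27% and 71% conversion rates come from the experiment.

```python
import math

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """One-sided two-proportion z-test: is variant B's rate higher than A's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 1 - 0.5 * (1 + math.erf(z / math.sqrt(2)))  # upper tail of normal CDF
    return z, p_value

# Hypothetical sample sizes (500 visitors per variant); 135/500 = 27%
# and 355/500 = 71% match the experiment's reported conversion rates.
z, p = two_proportion_z(conv_a=135, n_a=500, conv_b=355, n_b=500)
```

With a gap this large, even modest traffic produces an overwhelming z-score and a p-value indistinguishable from zero, which is why a testing tool would round its reported confidence up to 100%.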

Other KPIs went up as well, including adoptable pet searches and interest in adoption events. The new page was keeping visitors on the site and giving them opportunities to take other actions.

“This is perfect timing!” declared the manager at PetSmart Charities when she first saw the results. As the person who oversaw the website, she had been trying to build support for applying the same approach to their online donation forms: usability testing supported by web analytics, followed by design changes validated through A/B testing.

Now she had proof that the approach could work.

A week later, we presented the test results in the quarterly web committee meeting, then turned it over to the manager, who pulled up this slide:

The buy-in was quick. A few weeks later, we were off and running with a new round of usability testing, spotting problems and opportunities with the donation process.