B2B Software Case Study:
Usability Testing with IT Security Managers for Malwarebytes
Before starting a redesign of its B2B software product, a cybersecurity company wanted a deeper understanding of its users and their pain points. Marketade led qualitative research and a stakeholder workshop that helped product and UX team members align on the biggest opportunities.
Cybercriminals often enter corporate networks through employee desktops, laptops, and mobile devices — or what IT security professionals call “endpoints.” The rise of remote work during the Covid-19 pandemic increased the number of endpoints that could be hacked, heightening the risk of data breaches and other cyber threats.
Malwarebytes Nebula is a leading cloud-based platform for endpoint security. Small, mid-size, and large businesses use the product to manage the security of employee devices and to prevent cyber attacks.
A Malwarebytes team was recently considering a UX redesign of part of the Nebula product. To help guide decisions around this redesign, the team partnered with Marketade to conduct contextual interviews and usability testing of the current product, with a focus on the deployment flow.
Their specific research questions included:
- How do IT admins/managers deploy their endpoint security products?
- What are their reactions to Malwarebytes’ deployment process? What pain points do they experience?
- How successfully can they accomplish their goals using Malwarebytes’ process?
We used our standard 5-step approach for this project:
- Discovery: Aligned with the project team; planned the project; designed the research.
- Recruiting: Sourced, screened, selected, and scheduled research participants.
- Research: Moderated 1:1 remote interviews.
- Workshop: Facilitated a remote workshop with the Malwarebytes team.
- Report: Delivered a summary report with prioritized opportunities and solution ideas.
To start the project, we facilitated a 1-hour kickoff meeting with the Malwarebytes project team. We followed this up with a couple of product walk-throughs.
We then wrote a recruitment plan that defined the target audience and provided 2 rounds of screening questions: one for our online survey and one for follow-up screening via phone calls.
Next, we wrote a discussion guide for the research sessions that included:
- 25+ potential questions for the 1st part of the session: a contextual interview
- 2 scenarios — each with 10+ potential tasks, sub-tasks, and questions — and 6 potential follow-up questions for the 2nd part of the session: usability testing
Excerpt of our interview guide for 1:1 interviews. Moderators used this as a guide only and adapted their questions based on the discussion flow.
We recruited 10 participants who were:
- Responsible for deploying and managing endpoint security software at their company
- At a small or mid-size organization (100 to 500 employees)
- In a Windows or hybrid environment
- Currently using a cloud-based security product
- Familiar with a specific Microsoft network security service
Beyond this, we recruited participants who represented a mix of deployment approaches (e.g., manual vs. tool-based).
Finding these participants was difficult given the specificity of the target profile and the nature of the subject matter: people who manage cybersecurity for their company are reluctant to participate in interviews where they need to talk about their security practices and processes. Fortunately, we have conducted plenty of recruiting in the digital security space and were able to address concerns and build the necessary trust with these participants.
Excerpt from our online screener used to narrow down research candidates sourced on LinkedIn and other places. Later, we conducted phone screens to further narrow the pool.
We conducted one-on-one 60-minute interviews with the 10 participants.
In each session, we interviewed the participant about their current role and how they handle endpoint security deployment. Then, we asked the participant to share their screen, log into a virtual machine, and use Nebula to complete a deployment task and provide feedback.
We use a team-based approach for synthesizing user research because it brings teams to a consensus much faster than other methods. It also results in more solution ideas going live.
7 Malwarebytes team members participated in this half-day workshop. Their roles included:
- Head of UX
- UX researcher
- VP of corporate products
- Lead product designer
- Manager, software development
To capture and prioritize user feedback, workshop participants:
- Watched 3 research session recordings and took structured notes on what they observed prior to the workshop.
- Prioritized the notes most relevant to the research questions and put 15 notes on a virtual whiteboard.
- Reviewed the shared notes and sorted them into groups based on common themes.
- Voted on which themes/groups were most critical to address in the workshop based on the project goals.
The workshop team identified 20 finding themes. Of those, 3 were prioritized as "quick win opportunities" and 3 as "big opportunities": the themes with the greatest potential to improve the UX of this product flow.
For each of the 3 quick-win opportunities, the full team explored and prioritized solution ideas, and assigned owners to each viable solution. One of the solutions was simply to remove a certain section of content for first-time users of the product.
For the 3 big opportunities, we broke into 3 small groups and assigned each group an opportunity.
Each group used a document template to capture the problem and possible solutions. We followed the steps below:
- State the users' problem(s) and capture the underlying issues in 2-3 bullets.
- Pull specific references to the issue from the research notes on the virtual whiteboard.
- Brainstorm solutions, or paths to solutions, and note risks to success.
Then, each group presented their solutions to the larger workshop group and discussed any questions or comments on their solution(s). Based on the discussion, the team agreed on the solutions to pursue further and assigned the next steps for each.
A solution poster created by a breakout group during the workshop. Team members began by describing the problem. They then explored and captured solution ideas. Finally, they identified actions and owners to move the most promising ideas forward.
After the workshop, we wrote and delivered a 26-page report that included:
- 1-page summaries of the 10 research sessions, covering each participant's current deployment process and their pain points with the Nebula process.
- Documentation of the findings, solution ideas, and next steps generated in the workshop.
- Interview guide, participant information, and other research process artifacts.
Tables of contents from our summary report.
By the end of this research project, Malwarebytes had gained a deeper understanding of its target audience, users’ current security deployment processes and challenges, and user pain points with Nebula’s deployment flow.
Based on these user research insights and our collaborative process, the Malwarebytes team:
- Identified 1-2 specific solutions for each of the 3 “quick win opportunities”.
- Generated, discussed, and prioritized among 14 specific solutions in their 3 “big opportunity” areas.
- Identified and discussed 6 potential risks to success for these solutions.
- Agreed on 13 specific next steps and owners to move the most viable solutions forward.
After the project, we received the following feedback from Malwarebytes team members:
Collaboration was successful and focused us in on key priorities to fix.
— VP of Product
Excellent research and workshop activities. The participants were true potential customers. And the affinity clustering was a great tool.
— Manager, Software Development
We appreciate your excellent service and look forward to future partnerships with you.
— Director, User Experience
Out of 20 finding themes identified during the research, the Malwarebytes team used workshop voting to reach alignment on the biggest opportunities. NOTE: this graph generalizes the actual findings for confidentiality reasons.
About the Project
- Industry: Software & Technology
- Platform: Software
- Audience type: B2B: Mid-size and small businesses
- Audiences: IT security managers, system administrators
- Methods: Usability testing and contextual interviews
- Length: 2 months
- Stakeholders: Product team and UX team
- Company size: 875 employees
More Case Studies
Usability Testing with Interior Designers for Phillip Jeffries
To guide a B2B website redesign, Phillip Jeffries hired Marketade to lead its first-ever usability testing study. The collaborative project led to “team alignment on how customers are using the site” and a concrete “action plan.”
Website Usability Testing with Consumers & B2B Users for BankFinancial
A bank’s marketing team was preparing to launch a new version of its public-facing website for consumers and businesses. They partnered with Marketade to lead pre-launch usability testing on their staging site.