CASE STUDY
Partner Admin Dashboard Usability Tests


Overview


Role: User Researcher
Timeline: Jan 2025
Skills: usability testing, remote moderation, interactive Figma prototypes, data synthesis
Problem: Partners needed to efficiently manage and monitor their platform licenses at scale, but struggled with complex workflows and limited usage insights.

The team needed to validate whether the new partner dashboard design could streamline license management and provide actionable data to justify institutional investment.

Contribution: As the user researcher, I led the end-to-end usability testing process.

I designed the research protocol, created interactive prototypes, conducted think-aloud sessions, analyzed patterns, and delivered prioritized recommendations for the product team.



Methodology


I conducted a focused qualitative study with 5 partner admins. Each 45-minute session consisted of three phases:



  1. I asked participants to interpret license analytics dashboards and explain how they would use these analytics for decision-making.
  2. I created interactive Figma prototypes to test core admin workflows. Using a think-aloud protocol, I tasked participants with realistic scenarios: adding new students to classrooms, updating educator assignments, and removing licenses.
  3. I guided participants through the educator and classroom management interfaces, asking them to evaluate the usefulness of different metrics and identify missing features needed for their workflows.



Key Findings and Improvements


Through usability testing, I uncovered three critical insights about the partner dashboard:


1) Bulk Addition Workflow 
Four out of five participants failed to notice that they could add multiple learners at once and instead added each learner manually. This discoverability failure was the most significant pain point in the study.

Based on these findings, I proposed enhancing the interface's clarity by making the bulk addition feature more prominent. My redesign recommendations included a banner and upfront preview boxes to demonstrate the capability.



2) License Overview Charts

Usability testing revealed that the license overview dashboard failed to provide actionable insights, particularly for large organizations, where aggregated data obscured site-specific patterns. Natalia, a partner admin, voiced this frustration directly in her session.


I recommended redesigning the interface to include site-level filtering and flexible time controls while replacing the underutilized trend graph with a straightforward active license count. 

These changes focused on giving users direct access to current and historical license data at the level of detail they needed.



3) Activity Metric Evolution

Initially, we used chat message count as the primary metric for monitoring learner engagement with Coach. However, all five participants dismissed chat count as a meaningful metric, a point that Mark, a program administrator, made explicit in his session.


Understanding that administrators need metrics tied to concrete units of engagement, such as completed activities rather than message counts, we made the following changes to the metrics shown:

  • Replaced "learner chat count" with "number of activities started by a learner"
  • Replaced "educator chat count" with "number of activities assigned by an educator"


Impact and Next Steps


Through this research, I identified and addressed three critical admin pain points, delivering targeted improvements to both operational efficiency and retention.

The insights from this project drove immediate design changes and informed future direction: the team is now developing an embedded, customizable Metabase table that lets admins perform granular analysis, and launching a new research stream to understand how admins measure learning success.

Through this work, I learned that well-scoped research can both unlock short-term, tactical improvements and surface strategic opportunities.
last updated 12/2025