
Quantified MVP

Quantified Communications

Quantified Communications uses machine learning to give feedback on 24 different communication skills from a 3-to-10-minute video of a presentation. I worked with a product manager and a team of local and remote developers to deliver a new MVP experience.

Role: UX Lead
Timeframe: September 2018 to January 2019


The Problem

According to LinkedIn data, communication is the #1 skills gap in the United States. High-level executives might get expensive communication coaches later in their careers, but there wasn't a scalable solution for employees and students to get personalized, measurable feedback. To improve communication, we needed to give users a way to assess, develop, and improve their skills.

The Quantified Solution

Quantified Communications had a panel of communication PhDs quantifiably assess the largest communications database in the world. Using that data, they developed proprietary algorithms and trained a machine learning model to think like a communication expert by connecting the panel's assessments to the data those algorithms extract.

model.png

With the model trained, someone can upload a video of themselves giving a presentation, and the model fills out the survey of 24 different communication skills as if a communication expert were watching the video.

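To make that output concrete, here's a minimal sketch of what one skill's result might look like as a data structure. The shape and names are hypothetical, not Quantified Communications' actual schema.

```typescript
// Hypothetical shape of one of the 24 skill results; the names are
// illustrative, not Quantified Communications' actual schema.
interface SkillResult {
  skill: string;       // one of the 24 assessed skills
  score: number;       // the model's rating for the skill
  benchmark: number;   // the comparison point surfaced per skill
  resources: string[]; // an example done well, steps to improve
}

// The full assessment is the model's answers to the same survey
// a human communication expert would otherwise fill out.
type Assessment = SkillResult[];
```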

Getting to MVP

When I got to Quantified Communications in September 2018, the machine learning models were in place, but the experience of consuming that much deeply personal information was overwhelming. I focused on what I identified as the core workflows: recording a video, understanding the assessment, and finding resources to improve.

Research

Given the short timeframe for releasing the new experience, I started with two rounds of user sessions. The first round was with users who had never heard of the product, to learn what expectations people had around AI for communication feedback. The second round was usability testing of the new results page designs with existing users, to find any gaps in presentation or understanding. I also ran heuristic evaluations of the existing user experience to identify key problems and set a baseline. Additionally, I had the opportunity to sit in on an in-person workshop one of our coaches ran for a client, where I was able to see how users were interacting with the product.

Key research findings from the designs we tested

  • How does machine learning evaluate communication? Users had many questions around how a computer could give such detailed feedback from a video. The results tutorial didn’t address those concerns in a simplified way, so I created a more visual explanation limited to the most vital information.

  • Are my results good or not? Benchmarks weren't visible for each skill, and the color scheme made it hard to tell good scores from bad, so the new visual design pulls that information forward.

  • How do I improve? Although all users found the additional information about each skill, the amount of information and the order of presentation were overwhelming. The skill detail page was rearranged to reflect the order of resources that resonated with most users: an example of the skill done well, followed by steps to improve.

  • A need for results on mobile. At the in-person workshop, I found people were so eager to see how they scored that they checked their results on their phones when they didn't have their laptops with them. I made mobile workflows a requirement for the MVP and continued to monitor device usage through Google Analytics.

Design system and style guide

For development, we decided to keep Angular Material as the base for the design system. It was the easiest way to address accessibility needs, and it gave us a foundation for the custom components of a larger design system. We adjusted the theme to match the new style and tone of the product: serious, casual, respectful, and enthusiastic.
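As a rough illustration of how custom components sit on that base, here's a minimal sketch of a design-system wrapper around an Angular Material button. The selector and names are hypothetical, not the project's actual code, and it assumes MatButtonModule is imported in the host NgModule.

```typescript
// Hypothetical design-system component built on Angular Material;
// the selector and names are illustrative, not the project's code.
// Assumes MatButtonModule from @angular/material is imported in the
// host NgModule.
import { Component, Input } from '@angular/core';

@Component({
  selector: 'qc-button',
  template: `
    <button mat-flat-button [color]="color">
      <ng-content></ng-content>
    </button>
  `,
})
export class QcButtonComponent {
  // Restricting color to the Material theme palettes keeps custom
  // components consistent with the adjusted theme.
  @Input() color: 'primary' | 'accent' | 'warn' = 'primary';
}
```

Because the wrapper leans on Material's theming, adjusting the theme restyles every custom component built on it at once.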

Release

After four months of development, we released a brand-new end-to-end user experience for the core workflows of the product. Since then, we've transitioned to releasing every three weeks.

The recording workflow allows users to see the prompt as they work on a video, learn how to be successful, and record in the platform or upload a file from their computer.
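For the in-platform option, the browser's standard MediaRecorder API is a natural fit. Here's a minimal sketch of a record-then-upload flow; the actual implementation may differ.

```typescript
// Minimal sketch of in-browser recording with the standard
// MediaRecorder API; the real implementation may differ.
async function recordPresentation(maxMs: number): Promise<Blob> {
  // Ask for camera and microphone access.
  const stream = await navigator.mediaDevices.getUserMedia({
    video: true,
    audio: true,
  });
  const recorder = new MediaRecorder(stream);
  const chunks: Blob[] = [];
  recorder.ondataavailable = (e) => chunks.push(e.data);

  return new Promise((resolve) => {
    recorder.onstop = () => {
      stream.getTracks().forEach((t) => t.stop()); // release camera/mic
      resolve(new Blob(chunks, { type: recorder.mimeType }));
    };
    recorder.start();
    setTimeout(() => recorder.stop(), maxMs); // e.g. cap at 10 minutes
  });
}
```

The resulting Blob can go through the same upload path as a file chosen from the user's computer, so both recording options share one submission workflow.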

The results page uses clear benchmark comparisons per skill, bold colors for skill categories, and neutral colors for scores to avoid confusion around good or bad scores.

The results page on mobile, so users can check their assessment as soon as they get the email on their phones.