Laurel Geddes
Product Design Manager — Laurel.Geddes@gmail.com
 

User Experience Design

Project Poirot – NEW Product Launch

Client Industry: Web Service & Cloud Computing
Client Revenue & Footprint: Big 5, Worldwide
Project duration: 10 months

 
 

Define and design the user experience for a high profile new offering, a fully managed machine-learning service that detects outliers in business and operational data.

 

My Role

User Flows
Information Architecture
User Experience
Prototyping

 
 

OVERVIEW

Partnered with the client team to plan and execute the user experience for a new machine-learning service that detects outliers in business and operational metrics.

The intuitive experience includes a wizard that guides users through detector set-up and activation, as well as an outlier dashboard that provides real-time insights and detailed visualizations on trends and enables users to take quick and targeted action based on the findings.

 
 

DISCOVERY

Users

“The Mechanic” (Developer/IT)

Oversees the functionality “under the hood” of the detector. Tasked with detector setup and management, cleaning data and connecting data sources, defining metrics (KPIs to monitor for outliers), and setting up alerts.

Needs an intuitive wizard to guide them through the process of setting up a detector. Clear instructions would need to be included in the UI, as the Help Panel feature was out of scope.

Business User (Marketing Manager/Analyst)

Tasked with reviewing results and providing feedback (indicating whether results are true outliers) to tune the detector.

Needs concise and intuitive outlier results that provide them with at-a-glance insights, enabling them to pinpoint possible causes and react quickly to remediate outliers.

Use Cases

  1. The customer provides historical data to train the model and sets up live data to be collected automatically on a recurring schedule for outlier detection.

  2. No historical data is provided to train the model. Only live data is provided, so outliers cannot be detected until enough data has been collected to train the model.

  3. Single use, for experimentation and testing out the system.

 
 

INFORMATION ARCHITECTURE

 

Terminology & Mental Model

During our initial conversations the use of interchangeable terms caused confusion and communication gaps. To remedy this issue, I created diagrams to visualize the relationships between the elements and gathered a list of terms and their variants. After several iterations, we collectively agreed on the mental model, the preferred variant, the definition of each term, and validated maximum/minimum limitations (e.g., outlier versus anomaly, detector versus model, measure versus metric). This quick and effective exercise ensured that the team had a common understanding of the terms and their relationships.

 

User Flow

I created a basic wireframe flow diagram to provide a holistic view of the user experience. It was a useful reference for guiding conversations with stakeholders about user needs and goals, product scope, and business requirements. It also underwent many revisions.

Blue: navigation. Grey: wizard. Magenta: needs clarification. Red: out of scope.


 
 

PROTOTYPING

 

Visual design was predetermined by the client design system, so we were able to quickly create layouts to review with the project team. We used these to clarify questions and validate the UX in a full-day “Wall Walk” review with leadership and key stakeholders.

Wall walk review


Wireframes


 

Looking for Ways to Simplify Complex Concepts

“How it works” instructional panels for the Define Metrics step

A major challenge was how to communicate the concept of ‘metrics’ to users. Users don’t actively create metrics themselves; rather, they assign the measure and dimension fields that the system combines to create metrics on their behalf.

The complexity lies in that there is no dedicated resource page for users to view metrics, or the measures and dimensions that comprise them. Instead, the measures and dimensions are displayed on the resource page of the datasets they are selected from, and the detection results are displayed by metric. Yet users are charged by metric count, so it is imperative that they be aware of the total count when assigning and mapping data fields.
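To make the billing implication concrete, here is a minimal sketch of how a metric count might be derived. This is an illustration only: the field names, dimension values, and the one-metric-per-(measure, dimension-combination) rule are assumptions, not the product’s actual logic.

```python
# Hypothetical illustration: each metric is one (measure, dimension-value
# combination) pair, so the billable metric count grows multiplicatively
# as fields are assigned. All names below are invented.
from itertools import product

measures = ["revenue", "order_count"]        # numeric KPI fields
dimension_values = {
    "region": ["us", "eu", "apac"],
    "channel": ["web", "mobile"],
}

# Every combination of dimension values, e.g. ("us", "web").
combos = list(product(*dimension_values.values()))

# One metric per measure per combination.
metrics = [(m, dict(zip(dimension_values, c)))
           for m in measures
           for c in combos]

print(len(metrics))  # 2 measures x 3 regions x 2 channels = 12 metrics
```

Under this assumption, adding a single dimension value multiplies the count rather than incrementing it, which is why a running total in the UI matters so much.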

We first established a hub that indicates the user’s progress in the setup process and provides quick access to available actions. Having primed users, we then used the same visual language for collapsible instructional panels in each Create Flow. This provided a space to explain relevant concepts in context and in proximity to the action.

 

The Assign and Map Fields step posed the greatest challenge, requiring many rounds of revisions. We explored versions with stacked vs. tabbed datasets, versions that displayed the fields table separately for each field type vs. all fields together in a single table, and even versions that combined field assignment and mapping into a single action. The most elegant solution we devised is on the right.

Luckily, the scope was later restricted to a single data source and a single dataset, which greatly reduced the complexity of the flow (though the solution must remain scalable to accommodate the “fast-follow” ability to create multiple resources).

Users can assign fields from any dataset up to a specific max. To prevent users from losing track as they tab between datasets, we provided a summary section to tally the total number of assigned fields.


To make a selection, users must assign a field type, so we removed the redundant checkbox selector.


Explorations included using color and/or icons to distinguish between field types, so users could track the tally at a glance.

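The running-tally behavior described above can be sketched as follows. The dataset names, field names, and the maximum are all invented for illustration; the point is that the summary must aggregate assignments across every tab and enforce one hard limit.

```python
# Minimal sketch (invented names) of the cross-dataset tally: users
# assign fields across tabbed datasets, and the summary section must
# track the running total against a single maximum.
MAX_FIELDS = 25  # illustrative limit, not the product's actual max

assigned = {
    "sales_dataset": {"revenue": "measure", "region": "dimension"},
    "ops_dataset": {"latency_ms": "measure"},
}

def field_tally(assignments):
    """Return (total, per-type counts) across all datasets."""
    per_type = {}
    for fields in assignments.values():
        for field_type in fields.values():
            per_type[field_type] = per_type.get(field_type, 0) + 1
    return sum(per_type.values()), per_type

total, by_type = field_tally(assigned)
assert total <= MAX_FIELDS, "cannot assign more fields"
print(total, by_type)  # 3 {'measure': 2, 'dimension': 1}
```

The per-type breakdown is what made the color/icon explorations viable: the same aggregation can drive both the numeric tally and an at-a-glance visual count.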

 

Constraints

Due to API constraints, the complex and confusing tasks of creating datasets, connecting to data sources, and defining metrics could not be completed individually. Rather, these resources had to be created all at once, necessitating an unusual wizard-within-a-wizard-within-a-wizard solution.

Wizard-within-a-wizard flow
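A hedged sketch of what that constraint looks like from the UI’s side: because the (hypothetical) API exposes only a single create call, the wizard must collect the data source, dataset, and metric definitions before anything can be submitted. The `create_detector` function, payload shape, and every field name below are invented for illustration.

```python
# Hypothetical single-call API: the nested payload must be complete
# before submission, so each nested resource needs its own sub-wizard,
# but creation happens only once, at the very end.
def create_detector(payload: dict) -> None:
    # Stand-in for the real API call; checks the nested payload shape.
    assert {"data_source", "dataset", "metrics"} <= payload.keys()

payload = {
    "data_source": {"type": "s3", "uri": "s3://example-bucket/metrics/"},
    "dataset": {"name": "daily_orders", "interval": "P1D"},
    "metrics": [{"measure": "order_count", "dimensions": ["region"]}],
}
create_detector(payload)  # everything supplied in one request
```

Hence the nesting: each sub-wizard edits one branch of the payload, and only the outermost flow can commit it.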

Curve Ball

The team had to pivot swiftly when the scope of the product expanded significantly mid-stream. The UX was modified to accommodate 23 connectors, versus the original 2.

As a team, we quickly and efficiently assessed the repercussions of the change in scope on the user experience and development, amended the business requirements accordingly, and identified and executed effective changes to the features and functionality.

Connector dropdown
 
 
 

Collaboration & Efficiency

Design and development timelines were nearly parallel, so quick iterations and timely hand-offs were crucial. The UX was subject to a rigorous and time-intensive internal design review and approval process, as well as multiple executive reviews, prior to hand-off to the engineering team. Thus, it was imperative to maintain close communication between all parties in order to meet the stringent deadlines.

The UX team designed the complete first- and second-run detector setup flow and provided the create, edit, delete, and error states for each screen.

We also maintained InVision projects that served as a repository for all feedback and as the “source of truth” for the engineering team once designs were approved. We created one InVision project for the Beta and a second for “fast-follow” features.

 
 

SUMMARY

 

As the Lead Experience Designer, I worked with key stakeholders to plan and execute the customer experience for this new service. I provided design leadership and direction for a multi-disciplinary client team, advocating for UI best practices across UX and design development.

The multi-disciplinary project team was not familiar with UX design methodology or best practices, so I shared templates and resources to aid in gathering business requirements and creating a design backlog. I also developed and maintained a responsive master component library (consistent with the existing client design system) to enable efficient and consistent design development.

We collaborated closely and each took ownership of specific pages and flows. We rapidly iterated on interactive wireframes under extremely tight deadlines and provided developer-ready files on schedule. I provided UX leadership through a rigorous internal design review and approval process, in addition to multiple executive reviews.

UX design and hand-off was completed on schedule. Once in Beta, I assisted the client team in preparing and conducting user validation testing, synthesizing results, and creating a prioritized implementation plan.