Build a due diligence analysis tool
with Machine Learning

Wells Fargo Innovation Group • 10 months (Jan–Oct 2018)

Overview

Wells Fargo was among the early adopters of emerging technologies in financial services. The Innovation Group and the Financial Compliance Team partnered to explore how Artificial Intelligence (AI) could help their business. This Proof-of-Concept (POC) project focused on the heavy lifting: automating the time-consuming, manual process of gathering and aggregating data so the compliance analysts could put their expertise to better use.

My role

I was the lead designer of the due diligence analysis tool POC from January 2018 to October 2018.

During the first 5 months, I interviewed 20+ Line-of-Business (LOB) stakeholders on the Financial Compliance Team, including compliance managers, team leads, analysts, and Global Product Team members.

I also worked alongside 2 Product Managers (PMs) and 1 Product Owner (PO) while the IT team and 5 Data Scientists constructed the data sets.

During the last 5 months, my process was iterative: brainstorming and sketching via video calls, reviewing designs with the PMs and PO, refining requirements, and conducting User Acceptance Testing.

The POC laid the groundwork for a promising enterprise AI solution. Wells Fargo’s AI Enterprise Solutions Team picked up the POC for production in October 2018.

The goal

The intent was to design a web-based application to help analyze international wire transactions. It would use Machine Learning (ML) capabilities to identify unique financial events or behavioral patterns, a.k.a. “anomalies.”

Using ML could help detect and re-prioritize potentially suspicious transactions, greatly reducing the number of high-risk cases handed to the compliance analysts for investigation. The analysts could then focus on matters requiring a closer look and do their jobs faster and more accurately.
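To make the idea concrete, below is a minimal sketch of anomaly-based re-prioritization, assuming scikit-learn's IsolationForest and a toy feature set. The features, values, and model choice are illustrative stand-ins, not the data science team's actual pipeline.

```python
# A minimal sketch of anomaly-based re-prioritization, NOT the production
# model. The features and values below are hypothetical illustrations.
import numpy as np
from sklearn.ensemble import IsolationForest

# Hypothetical features per wire transaction:
# [amount_usd, hour_of_day, corridor_risk_score]
transactions = np.array([
    [ 1_200.0, 10, 0.2],
    [   950.0, 11, 0.1],
    [98_000.0,  3, 0.9],  # large amount, odd hour, risky corridor
    [ 1_500.0, 14, 0.2],
])

model = IsolationForest(contamination=0.1, random_state=42)
model.fit(transactions)

# score_samples returns lower scores for more anomalous points, so an
# ascending sort puts the most suspicious transactions first in the queue.
scores = model.score_samples(transactions)
priority_queue = np.argsort(scores)
print(priority_queue)  # the outlier (index 2) should rank first
```

Re-ranking rather than filtering keeps every transaction reviewable while surfacing the likeliest anomalies at the top of the analysts' queue.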

Ultimately, the POC could be the foundation of larger-scale applications with more robust functionality. For example, a similar approach could be applied to fraud detection, anti-money laundering (AML), and other compliance areas.

In the long term, the POC could help lower the risk of financial losses and reputational damage.

The process

Discover pain points

Starting the project by interviewing the LOB stakeholders was critical. My PM and I learned how the Financial Compliance Team members worked together. We picked up a lot of AML terminology and acronyms. We were overwhelmed by the amount of data the analysts had to dive into on a regular basis. We documented their working routines and rhythms. We felt their pain and formed hypotheses about where they could use the most help.

In other words, we defined the opportunities where this POC could be most meaningful to the analysts.

User journey map documenting the existing workflows of financial compliance analysts.

Define focus

Mapping out the user journey helped ensure we considered the end-to-end experience.

We found that repetitive, routine tasks, e.g., data sorting, done by humans alone, were time-consuming. The analysts could better spend their expertise on value-added activities, e.g., analyzing data and writing reports.

Create persona

We identified the compliance analysts as the primary users of the tool. I created a proto-persona based on the stakeholder interviews. Our subject matter experts kept us honest, confirming that the fictional character was a proper representation of them.


Proto-persona as a reminder: “always walk in the analysts’ shoes.”

Design with people top of mind

The proto-persona ensured all stakeholders on the project were aligned: “We don't design for you (executives). We design for the analysts.” It was crucial to remind the product and design teams to always walk in the analysts’ shoes.

Build Machine Learning into existing behavior

Introducing the concept of a “confidence level/score” to the analysts was a challenge. I explored different options for injecting this ML element into the analysts’ existing data-manipulation workflows, determined to find the least intrusive way to add it to their current way of working.

Iterations in low fidelity kept me and the product managers focused on the design elements instead of the visual design.

Classify and recommend with certainty

Our goal was to deploy an automated experience in two areas (see the sketch after this list):

Classify and correct legacy data. Think of sorting clothes by care instructions before washing. If ML could pre-sort the items with clear care instructions, the analysts would only need to confirm the false positives.

Recommend when data is not eligible for auto-correction. To use the laundry metaphor again: if one is unsure how to clean certain clothing items, ML could offer care recommendations. The analysts could then accept or reject the recommendations using their tribal knowledge, “teaching” the system for smarter processing in the future.
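Below is a minimal sketch of that classify-or-recommend routing, assuming the model emits a label with a confidence score in [0, 1]. The threshold value, field names, and route labels are hypothetical, not taken from the production system.

```python
# A minimal sketch of confidence-based routing between auto-correction
# and analyst review. All names and the threshold are hypothetical.
from dataclasses import dataclass

AUTO_CORRECT_THRESHOLD = 0.95  # hypothetical cutoff, tuned with analysts

@dataclass
class Prediction:
    record_id: str
    label: str         # e.g., a corrected category for a legacy record
    confidence: float  # model's confidence in the label, in [0, 1]

def route(prediction: Prediction) -> str:
    """Decide how a record flows through the tool."""
    if prediction.confidence >= AUTO_CORRECT_THRESHOLD:
        # High confidence: classify and correct automatically; analysts
        # only need to confirm the false positives.
        return "auto-correct"
    # Lower confidence: surface as a recommendation the analyst can accept
    # or reject; their decision is fed back to "teach" the model.
    return "recommend-for-review"

print(route(Prediction("wire-001", "payroll", 0.98)))  # auto-correct
print(route(Prediction("wire-002", "unknown", 0.62)))  # recommend-for-review
```

Keeping the low-confidence path in the analysts' hands mirrors the laundry metaphor above: the system only acts alone where the “care instructions” are clear.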

Low fidelity to brainstorm

Low-fidelity wireframes were essential during the exploratory design phase. I was able to make changes on the fly in brainstorming sessions with the product managers and the product owner.

The fidelity of the mockups improved as design clarity increased.

Iterate to refine

As the fidelity of wireframes improved, I checked in with stakeholders and data scientists regularly. 

Seeing the mockups in a clickable prototype made it easy for everyone involved to visualize the end-to-end experience. 

We also realized the original intent of building a comprehensive reporting tool was too ambitious. We narrowed the project goal to a much simpler tool focused on cleansing data faster and “smarter.”

Final iteration

High-fidelity mockups of the final iteration.