Designing a $350M Benefits Platform
Designing a platform used by 100+ partners to deliver $350M+ in public benefits.
2019 / 0→1 / Web and Mobile
Lead Designer
Project Overview
My Responsibilities: Led the end-to-end design process, spanning research, ideation, customer validation, and rapid prototyping and experimentation through launch
Team: PM, Lead Engineer, Visual Designer, Customer Support Representative
Timeline: May - November 2019
Disclaimer: Certain visual assets have been intentionally blurred or omitted entirely due to the proprietary nature of this technology
What is Edquity?
Edquity is a technology provider that helps colleges support students’ basic needs and financial security by increasing access to emergency resources and funding. To help students, including the 3 million who drop out every year due to a time-sensitive financial crisis of less than $500, Edquity provides partner colleges with a white-labeled mobile app and web platform where students can apply for emergency cash grants, discover emergency resources, receive personalized financial guidance, and more. I joined Edquity in 2017 as the first design hire and led our early research and design work as the sole designer on the team.
The Problem
In early 2019, the Dallas Community College District Foundation approached us to help them launch their first digital Emergency Aid Fund. Emergency aid is “just-in-time” financial assistance for students at risk of dropping out of college due to a financial hardship, like medical bills, car repair, a trip home to help an ailing parent, etc.
The traditional emergency aid process, as illustrated below, is high-touch and highly manual. Unsurprisingly, it is riddled with inefficiencies and with human error and bias, which have serious compounding effects at both the student and administrator levels. In the current experience:

We asked ourselves how we could design an end-to-end process that was more equitable, efficient, and user-centered. In our re-imagined experience:

Constraints
A fixed launch date and limited resources meant we had to work backwards from the deadline to fit the project's aggressive scope into the timeline
Product architecture limitations meant we had to work within the existing architecture of our mobile-first product that was already being piloted at institutions across the country
Constantly changing requirements, from new business needs to last-minute customer requests. My design process had to be flexible enough to work in an environment where uncertainty and change are constants, and where timely decisions need to be made even when not all the variables are known
0-to-1 product development challenges like making decisions in the absence of analytics on app usage and user activity
Research
The tight timeline and the nuances of recruiting and conducting research with financially vulnerable populations precluded us from conducting the upfront, exploratory research we would have liked. We still, however, wanted to ground our design process in real user needs.
Luckily, we had a wealth of qualitative, empirical insights from extensive foundational research we’d conducted the previous year with the same target user population(s).

After spending a week revisiting past primary research, expanding on existing user personas, and conducting secondary research, we summarized the key pain points for our end-users (students) and our customers (the college administrators). In order for our design to be effective, we needed to solve for pain points on both sides.

Defining and Measuring Success
A critical part of my early design process is establishing a shared definition of success. I led a workshop with our stakeholders to collectively figure out how we would define and measure success. Based on our understanding of our users, the problem, and the space, we defined a set of core design principles. Defining design principles isn't something I always include in my process, but it felt important here: it made the factors behind my design decisions more visible and galvanized the team around a shared vision of 'good' design. In setting success metrics, our goal wasn't just to design a marginally better experience; we wanted to set the bar high.

Systems Planning
Because this was the ground-up design of a new system, I needed to plan the system before I could start pushing pixels. I began by constructing a diagram of the key flows, tasks, and processes.
Step 1: Blue-sky thinking. I outlined the envisioned end-to-end experience in a rough flow diagram without much regard for limitations, constraints, or feasibility.

Step 2: Fast feedback cycles. I worked with our PM and lead engineer to figure out how the emergency aid experience fit into the existing user journey, identify where things might break, and flesh out edge cases.

After many iterations, our rough flowchart evolved into a final, single source of truth for the parameters, constraints, and flow of the system. Once we knew how all the pieces fit together, we segmented the system into three parts: pre-application, application, and post-application.
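To make that segmentation concrete, here is a minimal sketch of how the three segments could be modeled as states in code. The segment names come from our diagram, but the individual steps and the transition logic shown are illustrative assumptions, not our actual system:

```typescript
// Hypothetical model of the emergency aid lifecycle, segmented the same way
// we segmented the flowchart: pre-application, application, post-application.
type AidStage =
  | { segment: "pre-application"; step: "onboarding" | "eligibility-check" }
  | { segment: "application"; step: "in-progress" | "submitted" }
  | { segment: "post-application"; step: "decision" | "disbursement" | "closed" };

// Submitting an application moves a student from the application segment
// into the post-application segment, where the decision process begins.
function submit(current: AidStage): AidStage {
  if (current.segment === "application" && current.step === "in-progress") {
    return { segment: "post-application", step: "decision" };
  }
  return current; // other transitions follow other paths in the flowchart
}
```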
Going through the diagram with a fine-tooth comb, we started teasing out the nuances of the experience using the design principles we'd established earlier. I posed How Might We questions as a starting point for feature/functionality brainstorming and design exploration.

Narrowing Scope
Because of our limited engineering bandwidth and tight timeline, we had to develop a phasing strategy. I worked with stakeholders to run prioritization exercises to determine what would be included in the MVP and what would be prioritized in our product backlog.
‘Nice-to-haves’ and items for follow-up exploration were captured in the backlog to be prioritized for future sprints
Features without clear user or business value were tabled, to be revisited once we had sufficient data to weigh ROI against level of effort
Design Explorations
I focused my efforts on mocking up the highest priority flows to test. I typically identify 'confidence levels' to gauge what should be prioritized for testing. In this project, we were the most unsure about the flows for onboarding, application, and claiming funds, so that was where we focused our prototyping and testing efforts.


One of my design explorations was the layout for the home page. I explored numerous options before narrowing to two: a horizontal tab layout and a card-based carousel.

After evaluating the pros and cons of each, my initial preference was the card option, but there were lingering questions about whether it offered enough affordances for users to intuitively know to swipe to see the other cards. Because navigating to the emergency aid feature from the home page is a critical action, I ultimately decided to test this with users.

After several rounds of sketching and iterating, I partnered with our visual designer to convert the whiteboard sketches and lo-fidelity mocks into high-fidelity screens for our prototype. I typically prefer to prototype in mid-fidelity, but moving directly to high fidelity was more time-efficient here because we could pull existing patterns and components from our component library and quickly mock up screens.
User Feedback
We wanted to explore many design options without pressure to perfect the layouts and visuals up front. Because multiple rounds of testing were a key part of our process, we expected our designs to continually evolve. This benefited us in two ways: (1) we didn't waste time making early versions pixel-perfect, and (2) time to test and iterate was built into our project timeline from the beginning. Our goals were to test the application's layout, language and tone, and length, and to gauge ease of use and levels of comfort and trust.

I created the clickable prototype below to use for our first round of testing sessions.


Through our testing we identified three themes that informed the following design iteration and led to a more effective design:
1. Striking a balance between our internal data-collection needs and the information essential for funding determination. Participants said the length of the application would be a deterrent to starting and completing it, which reinforced our design principles.
"I don’t feel like I’m not being asked any questions, but not too many. But if it was any longer, I'd probably be less motivated to finish it."
2. Given the high-stress mindset users are in when completing the application, it's crucial to build in guardrails and checkpoints to minimize information-entry errors.
"When my kids are screaming in the background, it's hard for me to focus. So I like how there's only one question at a time, it makes it easier for me to focus when I can take things one step at a time."
3. Designing for the in-progress experience is critical because start-stop workflows are the norm for our user base. Participants expressed that given academic, work, and familial obligations, it was unlikely they would be able to complete the application in one sitting.
"There's no way I could finish this all in one sitting. I'd probably have to stop to go make dinner for the kids or something. How will I know where to pick up when I return to it?"
As a result of our findings, we made some critical design decisions (illustrated in the sketch below):
Progressive disclosure to let users focus on one step at a time
Staged, conditional questions, limited to 1 per screen, to minimize demands on mental bandwidth
No write-in fields on the application; only discrete answer options
Progress indicators to show remaining sections and how close users are to completion
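The sketch below shows how these four decisions could fit together in code: one conditional, discrete-answer question per screen, persisted state so start-stop users can resume, and a simple progress label. It is a hypothetical TypeScript illustration; the question content, function names, and storage key are all assumptions, not Edquity's production code:

```typescript
// Hypothetical question model: one discrete-answer question per screen.
interface Question {
  id: string;
  prompt: string;
  options: string[];                        // no write-in fields
  next: (answer: string) => string | null;  // conditional branching; null = done
}

interface ApplicationState {
  answers: Record<string, string>;
  currentQuestionId: string | null;
}

// Illustrative questions only; the real application content differed.
const questions: Record<string, Question> = {
  need: {
    id: "need",
    prompt: "What kind of emergency expense are you facing?",
    options: ["Housing", "Food", "Transportation", "Medical"],
    next: (answer) => (answer === "Housing" ? "housingRisk" : "amount"),
  },
  housingRisk: {
    id: "housingRisk",
    prompt: "Is your housing situation at immediate risk?",
    options: ["Yes", "No"],
    next: () => "amount",
  },
  amount: {
    id: "amount",
    prompt: "Roughly how much do you need?",
    options: ["Under $250", "$250 to $500", "Over $500"],
    next: () => null, // end of flow
  },
};

// Record an answer, advance to the next screen, and persist the state so a
// student interrupted mid-application can pick up exactly where they left off.
function answerCurrent(state: ApplicationState, choice: string): ApplicationState {
  if (state.currentQuestionId === null) return state;
  const q = questions[state.currentQuestionId];
  const nextState: ApplicationState = {
    answers: { ...state.answers, [q.id]: choice },
    currentQuestionId: q.next(choice),
  };
  localStorage.setItem("aid-application", JSON.stringify(nextState));
  return nextState;
}

// Progress indicator: answered screens out of the screens defined in the flow.
function progressLabel(state: ApplicationState): string {
  const answered = Object.keys(state.answers).length;
  const total = Object.keys(questions).length;
  return `Step ${answered} of up to ${total}`;
}
```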
UI Considerations
I worked with our visual designer to convert our designs to high fidelity. As new components were designed (for example, buttons, lines, type, hover styles, and tiles), they were added to a master component stylesheet. This made creating specifications for developers efficient because there was no redundancy: we named each component on the page layouts, allowing developers to reference the detailed spec in the master component stylesheet. If a component was subsequently changed in the stylesheet, the change didn't need to be manually carried over to every layout.
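As a rough illustration of why this "name once, reference everywhere" approach eliminates redundancy, here is a hypothetical sketch of the idea in code; the component names and values are invented for illustration, not taken from our actual stylesheet:

```typescript
// Hypothetical master component stylesheet: each component is specced
// exactly once, and page layouts reference components by name only.
const componentSpecs = {
  "button/primary": { height: 48, radius: 8, fill: "#2D5BFF", label: "type/body-bold" },
  "input/text": { height: 44, radius: 4, border: "1px solid #D0D5DD" },
  "card/tile": { padding: 16, radius: 12, shadow: "0 1px 2px rgba(0,0,0,0.08)" },
} as const;

// A layout names components rather than restating their details, so changing
// "button/primary" above updates every layout that references it.
const applicationScreen: Array<keyof typeof componentSpecs> = [
  "card/tile",
  "input/text",
  "button/primary",
];
```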

Post-Launch
An analysis examining the short-term results of distributing emergency grants to students at a partner college in spring 2020 found that students who received emergency aid via Edquity were twice as likely to graduate by August as similar students who did not receive aid.
Impact Metrics
Disbursed $55M in critical cash assistance
Processed over 200,000 discrete applications
Funded over 100,000 students across the country
Partnered with over 50 institutions
Success Metrics
Time-to-completion (on average): 7 minutes
Time-to-decision (on average): Less than 24 hours
Time-to-disbursement (on average): Less than 48 hours
Increased retention: early data shows that the Edquity emergency aid app has increased the overall student retention rate in one of our partner colleges by 5 percent

Key Learnings
Brand identity and positioning are critical considerations in B2B2C products
During the design process we spent a lot of time thinking through how we would position the Edquity brand. Luckily, we'd gone through an extensive rebrand prior to this project's rollout, so we had clear guidelines for the Edquity brand identity, vision, and tone. Having a strong, cohesive brand helps inspire trust in users, which was especially significant for this workstream because we were prompting users to share sensitive and deeply personal information with us.
Standardizing the experience may mean pushing back on customers
We had to balance one-off customization requests from customers with the need to build a scalable, standardized product and experience. We solved for this by setting clear parameters around what can and cannot be customized. For example, onboarding language and payment options can be customized; the application questions, structure, and scoring cannot (to ensure equitable outcomes across different colleges).
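One way to make such parameters enforceable is to encode them in a partner configuration schema, where anything absent from the schema simply cannot vary by college. The sketch below is a hypothetical illustration; the field names and option values are assumptions, not Edquity's actual configuration:

```typescript
// Hypothetical partner configuration schema. The customizable surface is
// explicit, and the equity-critical pieces are deliberately absent from it.
interface PartnerConfig {
  onboarding: {
    welcomeMessage: string; // customizable onboarding language
    fundName: string;
  };
  paymentOptions: Array<"direct-deposit" | "check" | "prepaid-card">;
  // Application questions, structure, and scoring are intentionally NOT here:
  // they stay identical across every partner college.
}

const examplePartner: PartnerConfig = {
  onboarding: {
    welcomeMessage: "Welcome to your college's Emergency Aid Fund.",
    fundName: "Student Emergency Aid Fund",
  },
  paymentOptions: ["direct-deposit", "prepaid-card"],
};
```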
B2B2C designers must truly embody what it means to be a user advocate
The Edquity app functions as the intermediary layer between customer (the college) pain points and end-user (the student) needs. As designers, the onus is on us to relentlessly advocate for our users. In practice, this meant weighing customer requests against their implications for user impact and experience, and clearly communicating the tradeoffs back to our customers. It took us a while to find the right cadence, but eventually we engendered enough trust with our customers that they deferred to us on what was 'best' for their students.