Enterprise AI · UX Case Study

Me@Walmart:
Building the Corporate Super-App

How I independently led UX strategy to transform 200 bookmarks and 12 fragmented HR systems into a single, AI-powered people experience — designing for trust, measuring outcomes, and defending research against org pressure at Fortune #1 scale.

Role: Lead UX Designer
Platform: Mobile + Desktop (iOS, Android, Web)
Domain: Enterprise AI · HR Tech · Super-App
Team: 3 Sr. UX Designers · 1 Researcher · 3 Scrum Teams
Duration: 10+ months · Multiple releases
01 · Project Brief

A Fortune #1 company, drowning in 200 bookmarks

Walmart's Home Office associates — managing teams, approvals, onboarding, and career development — did it all across a fractured landscape of 12+ disconnected enterprise tools. The cost was measurable, the frustration was real, and nobody had solved it.

The Scale of the Problem

Every manager started their day not by managing their team — but by figuring out which of 12 systems held the information they needed. Workday for HR tasks. ServiceNow for IT. Concur for expenses. DocuSign for approvals. GuardianVantage for benefits. ULearn for training. Talent Marketplace for hiring. Plus SharePoint, Teams, OneWalmart, and a folder of 200+ bookmarks.

2 hrs
productivity lost daily per associate
12+
disconnected systems in daily use
−100
initial NPS (Drop 0 baseline)
67%
initial CSAT — needs improvement

Managers across the organization are busy and have a difficult time navigating complicated systems to be their very best selves for their teams.

— Me@Walmart Product Problem Statement

"People are just operating on information they learned from their manager... I need to know how to protect myself and Walmart, and if there are gaps in my knowledge to do that, that's my biggest concern."

— Home Office Manager, User Research Interview

"I created a spreadsheet to make sure I have quarterly feedback sessions, career discussions, one on ones... I don't want anybody coming back at the end of the year saying you haven't even met with me."

— Eric Burnett, Home Office Salaried Manager (Persona Research)
My Role

Principal UX Designer

I led UX strategy and vision independently for Me@HomeOffice — Walmart's SPIE (Single Integrated People Experience) super-app — including the flagship My Team mini-app, the AI Chat system, the Integrated Inbox, and the POI (Point of Interest) Framework. Three senior designers supported execution; the strategy, research program, AI workflow architecture, and design system principles were mine to define end-to-end.

UX Strategy · AI Workflow Design · Research Program · Design Systems · 4-in-the-Box Leadership · AI Governance
🏗️

4 Enterprise Systems Consolidated

Workday · ServiceNow · Concur · DocuSign — plus 8+ additional mini-apps — all unified under one AI-powered hub with shared data layer and single sign-on.

Estimated Business Value

At Walmart's scale, recovering 2 hours of daily productivity per associate across even a fraction of the 1.6M+ associate base represents billions in annual enterprise value. This wasn't a UX project — it was a business transformation with design at the center.

92.5
SUS Score — "Excellent" (My Team)
2 hrs
Daily productivity recovered per associate
62.5%
AI assistant adoption rate (Ask Sam)
19
Workday use cases automated via AI chat
73→92.5
SUS score progression across drops
02 · User Personas & Real Voices

Designing for real people, not user types

Every design decision was grounded in the lives of two distinct archetypes surfaced through 1:1 interviews, diary studies, and card sorting. Their frustrations weren't hypothetical — they were documented, quoted, and used to defend every design choice in stakeholder reviews.

👩‍💼
Kelly / Eric — The People Leader
Sr. Manager · Home Office · Manages 8–15 directs · x5–x9 level
Bio

A new or established people leader dedicated to enabling their team's growth. But too many systems, too many meetings, and too little time get in the way of actually doing the job they were hired to do.

"I am big on enabling my team to go execute so I have the ability to go design processes and run interference. But too many places to go for resources get in the way."
What's In It For Them

A simple, connected experience to manage My Team, My Wellbeing, and My Career.

Daily Pain Points
Processes a transfer in 20+ steps across 3 systems — for one person
Builds Excel spreadsheets to track 1:1s because no tool does this
Receives approval requests from Workday, ServiceNow, Concur, and DocuSign — all separately
Spends first 45 mins of the day just figuring out what needs their attention
Goals
Build leadership brand · Support team growth · Identify mentors organically · Keep leadership aligned
Me@ Impact
My Team hub — all people, one view · AI Inbox — one priority queue · 2-step transfer via AI Chat
👨‍💻
Jason — The Individual Contributor
Associate · Home Office · New hire · Individual Contributor
Bio

Motivated, ambitious, and immediately overwhelmed. Onboarding checklist in an email. Benefits in Guardian. Career goals in a Workday field nobody told him about. There is no compass, and he doesn't know how to ask for directions.

"Where should I even begin??? I have Office 365, ULearn, OneWalmart, Guardian, Workday, Enboarder, ServiceNow, Talent Marketplace — all open at once."
Daily Pain Points
No centralized onboarding progress view — doesn't know what he's missed
Forgot PIN code; VPN not working on personal device → locked out entirely
Thinks the AI chatbot is Zoom or Teams — never uses it for actual tasks
Career and learning content siloed in ULearn and Talent Marketplace with no integration
Goals
Navigate onboarding confidently · Find career paths · Access benefits easily · Connect with right people
Me@ Impact
Guided onboarding journey · Digital compass for career · SSO — no more PIN reset
❌ Before Me@ — Kathy's Story

Having a baby? Navigate this yourself:

🔴 Opens Office 365. Searches for maternity leave policy. Gets a 40-page PDF.
🔴 Opens Guardian for benefits. Different login. Session times out.
🔴 Opens OneWalmart for HR contact. Finds 3 phone numbers.
🔴 Opens Workday to file leave. Doesn't know which form to use.
🔴 Asks her manager. He doesn't know either. Asks his manager.
🔴 40+ minutes later — still not sure if leave is correctly filed.
✅ After Me@ — The Same Journey

One app. Proactive. Personalized.

🟢 Day 1: "You have 9 weeks of paid maternity leave by your due date — plus 6 optional weeks of paid parental leave."
🟢 Day 15: "Would you like help finding in-network healthcare providers in your area?"
🟢 Day 180: "Would you like to set up a monthly wellness check-in with HR?"
🟢 Day 210: "I've marked you as OOO from your due date. All approvals are routed to your backup."
🟢 Total time: under 2 minutes. Zero system-switching. Full confidence.
03 · UX Strategy & Design Breakthrough

From "Portal of Links" to Transactional Intelligence

The breakthrough wasn't adding a new interface. It was fundamentally rethinking what a people platform could be — from a passive directory to a proactive AI assistant that anticipates needs, executes transactions, and guides associates through their most vulnerable moments.

1

Discover · Find · Do — The POI Framework

I invented the Point of Interest (POI) Framework — a "Discover, Find, Do" model for campus navigation. Instead of just showing a meeting room, the app anticipates the user's journey: notifying when to leave, providing turn-by-turn directions, and letting associates order coffee on the way.

2

AI-Powered Conversational Assistant (Me@ Chat)

Transformed 20-step Workday processes into 2-step mobile flows. Designed a conversational AI that executes real transactions across 19 Workday use cases — with human-in-the-loop verification gates and NLP exception handling via Google Dialogflow.

3

Integrated Inbox — Cross-Platform Priority Queue

Designed a unified approval hub consolidating Workday, ServiceNow, Concur, and DocuSign. AI surfaces and prioritizes the most pressing items, eliminating context-switching and cognitive overload.

4

My Team — Associate-Centered Manager Hub

Rebuilt the manager experience around an associate-centered mental model (not task-centered), allowing managers to understand, support, and act on behalf of their team members from a single view.

5

Scalable Mini-App Architecture

Designed a modular framework so new experiences (Learning, Career, Health, Financial Well-being) can be deployed seamlessly without disrupting the core ecosystem β€” true platform-level design thinking.
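The modular mini-app idea above can be made concrete with a small sketch. Everything here is illustrative (Python, with invented names like `SuperAppShell` and `MiniApp`); the actual shell implementation is not part of this case study:

```python
from dataclasses import dataclass
from typing import Callable, Dict

# Hypothetical sketch only: illustrates the registration idea behind
# the mini-app architecture, not the real Me@ shell.
@dataclass
class MiniApp:
    app_id: str
    title: str
    entry_point: Callable[[], str]  # renders the mini-app's landing view

class SuperAppShell:
    """Shared container: mini-apps plug in without touching the core."""

    def __init__(self) -> None:
        self._apps: Dict[str, MiniApp] = {}

    def register(self, app: MiniApp) -> None:
        if app.app_id in self._apps:
            raise ValueError(f"mini-app '{app.app_id}' already registered")
        self._apps[app.app_id] = app

    def launch(self, app_id: str) -> str:
        return self._apps[app_id].entry_point()

shell = SuperAppShell()
shell.register(MiniApp("my_team", "My Team", lambda: "My Team hub"))
shell.register(MiniApp("learning", "Learning", lambda: "Learning home"))
print(shell.launch("my_team"))  # -> My Team hub
```

The design property the sketch captures: adding a new experience is a `register` call, never a change to the container.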

The Core Insight

Associates didn't need a better dashboard. They needed a digital compass — something that points them toward the right action at the right moment, whether that's their next career step, a pending approval, or directions to their next meeting.

I reframed the product from a "people portal" to a "personal assistant" — and designed every interaction to validate that promise with evidence.

🧭

The Full Manager Journey Framework

I mapped the complete people-leader lifecycle across 7 stages: Find (recruit talent) · Welcome (onboarding) · Connect (team communication) · Engage (goals, feedback) · Grow (learning) · Further (career growth) · Manage Me (personal wellbeing). Every Me@ mini-app maps to one of these stages.

Design Breakthrough

By shifting from a task-centered to an associate-centered mental model, I unlocked a design architecture where managers find everything about a team member in one spot — reducing decision latency from minutes to seconds, and driving the SUS score from 73 to 92.5.
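The associate-centered model can be illustrated with a toy aggregation. The per-system records and field names below are hypothetical; the point is the shape: one merged profile per person instead of four separate system views:

```python
# Illustrative only: records and field names are invented; the real
# integration is not shown in this case study.
workday = {"jason": {"role": "Analyst", "tenure_months": 2}}
servicenow = {"jason": {"open_tickets": 1}}
concur = {"jason": {"pending_expenses": 2}}
docusign = {"jason": {"awaiting_signature": ["offer-letter"]}}

def associate_profile(associate_id: str) -> dict:
    """Merge every system's view of one person into a single record."""
    profile = {"id": associate_id}
    for source in (workday, servicenow, concur, docusign):
        profile.update(source.get(associate_id, {}))
    return profile

print(associate_profile("jason")["pending_expenses"])  # -> 2
```

A manager opening "Jason" gets one record with everything actionable attached, which is the mental-model match the research pointed to.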

04 · Competitive & Analogous Research

What the best super-apps in the world taught us

Before defining the architecture, I conducted industry analysis benchmarking Me@ against leading consumer super-apps, enterprise HR platforms, and corporate composite apps. The goal: identify what "excellent" looks like in each dimension of the product so we could exceed it.

Workday
Best-in-class HR data depth but severely task-oriented UX. Managers hate navigating it. Insight: data richness without UX is a burden, not a feature.
Learned: What NOT to copy
WeChat
Defined the super-app concept — payments, messaging, services, maps in one container. Insight: modular mini-apps in a consistent UX shell are the winning architecture.
Inspired: Mini-app model
Grab / Gojek
Proactive suggestions based on time, location, and behavior. Insight: the interface that anticipates needs before the user asks is 10x more valuable than one that only responds.
Inspired: POI proactivity
Apple HR Apps
Best-in-class mobile-first people management. Insight: mobile isn't a reduced desktop — it's a different mode with different affordances and urgency levels.
Inspired: Mobile-first logic
Slack + AI Assistants
Users trust AI when it's scoped and transparent. Open-ended "ask anything" AI creates anxiety in enterprise. Insight: structured AI with clear capability boundaries builds adoption.
Learned: AI trust model

The best consumer super-apps in the world succeed because they eliminate the question "where do I go for this?" We applied that same principle to enterprise HR — and the 92.5 SUS score proved it works.

— Design Strategy Rationale, Me@HomeOffice SPIE
05 · Business Goals

Unlocking potential at enterprise scale

The business goals matched the ambition of the company — reducing friction for 1.6M+ associates while building an AI-driven platform that could grow modularly for years.

🚀

Unlock Associate Potential

Empower every associate to realize their potential through well-being support, career navigation, and proactive guidance embedded in their daily workflow.

⚡

Operational Efficiency at Scale

Recover 2 hours of daily productivity per associate — an enormous ROI at 1.6M+ scale. Reduce searching costs, switching costs, and cognitive load simultaneously.

🧩

Modular Scalability

A mini-app framework where Learning, Career, Health, and Financial Well-being experiences can be deployed without disrupting the core architecture.

🤝

Manager Enablement

A single anchor point for people leaders to support their teams through onboarding, transitions, feedback, and performance — without Workday navigation.

🧠

AI-Driven Intelligence

Leverage ML to surface the right information, the right approval, and the right career nudge before associates have to search — anticipate, don't just respond.

💡

Associate Listening Culture

Reinvigorate Walmart's feedback culture through MyIdeas — a direct channel for associates to contribute ideas, save time, and earn meaningful recognition.

06 · UX Priorities

What I focused on — and what I deferred

With 12+ systems to consolidate and hundreds of use cases to support, focus was everything. These were the non-negotiable UX priorities I established, ranked, and defended through every design drop.

1

Radical Cognitive Load Reduction

Consolidate all actionable items into a single AI-prioritized Integrated Inbox. Associates should never check five systems to know what needs their attention today.

2

Associate-Centered Predictability

Every manager finds everything about their team member in one spot. Research revealed managers operate with people — not tasks — as their primary mental model.

3

Proactive Just-in-Time Guidance

Use AI nudges to surface career opportunities, financial wellbeing tips, and onboarding checklists exactly when associates need them — not buried in a menu.

4

Human-in-the-Loop AI Controls

Every AI workflow includes verification checkpoints — Submit, Back, and Confirmation screens — so associates feel in control before committing to irreversible actions.

5

Mobile-First, Desktop-Accurate

Mobile for time-sensitive approvals and quick checks. Desktop for high-precision, data-heavy tasks. No feature designed for one platform and bolted onto the other.
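Priority 1's AI-prioritized Integrated Inbox can be sketched as a merge-and-rank over normalized approval items. The item shape and urgency weights below are invented for illustration, not the production prioritization model:

```python
from dataclasses import dataclass
from datetime import date

# Toy sketch: approvals from four systems normalized into one shape,
# then ranked by a simple urgency score. Fields and weights are
# hypothetical.
@dataclass
class ApprovalItem:
    source: str           # "Workday", "ServiceNow", "Concur", "DocuSign"
    title: str
    due: date
    impacts_person: bool  # e.g. a transfer or exit affecting an associate

def urgency(item: ApprovalItem, today: date) -> float:
    days_left = max((item.due - today).days, 0)
    score = 10.0 / (1 + days_left)  # sooner due date, higher score
    if item.impacts_person:
        score += 5.0                # people-impacting items rise to the top
    return score

inbox = [
    ApprovalItem("Concur", "Expense report", date(2024, 3, 20), False),
    ApprovalItem("Workday", "Transfer for Jason", date(2024, 3, 18), True),
    ApprovalItem("DocuSign", "Vendor contract", date(2024, 3, 16), False),
]
today = date(2024, 3, 15)
ranked = sorted(inbox, key=lambda i: urgency(i, today), reverse=True)
print(ranked[0].title)  # -> Transfer for Jason
```

Whatever the real model, the interface contract is the same: the manager sees one ordered queue, not four inboxes.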

Research Methods I Ran

🗣️

1:1 Virtual Interviews (Zoom · 1 hr · 6–11 participants per round)

Mental model mapping, pain point excavation, task-based testing

📓

Diary Studies (2 weeks · contextual app usage)

Longitudinal behavior tracking — when, where, how the app was actually used

🃏

Card Sorting (IA Validation · open + closed hybrid)

Aligned information architecture with user mental models for profile, career, and financial data

🧪

Iterative Usability Testing (SUS · SEQ · task completion)

Multiple drops; SUS scored each iteration to track improvement

📊

Mixed-Methods Survey (NPS · CSAT · UX Lite · quantitative + qual)

Established digital experience baseline for each product drop

🔍

The Pivotal Research Finding

Managers don't think in tasks — they think in people. When I restructured My Team around individual associates instead of action categories, task success rates jumped and the SUS score climbed 19.5 points. One research insight, one architectural decision, measurable outcome.

07 · Problem, Opportunity & Drivers

The real problem wasn't missing features — it was fragmentation

Root Causes Uncovered by Research

Platform Burnout

Associates drowning in 200+ bookmarks and 12+ tools, creating cognitive overload and job-related stress.

The Productivity Gap

Up to 2 hours of productivity loss per associate per day from context-switching and searching costs.

Manager Friction

Leaders couldn't support their teams during vulnerable moments — onboarding, exits, transfers — without navigating Workday's complexity.

AI Trust Gap

100% of users in one test round didn't understand the chatbot's purpose — many thought it was Zoom or Teams.

Technical Trust Erosion

PIN code failures, VPN requirements, slow loading, and duplicate data entries in Drop 0 actively discouraged adoption.

How Might We… (Research POVs)

💬 How might we make AI feel like a real work assistant, not a messaging chatbot?
📥 How might we ensure managers always know which actions are waiting for them?
👤 How might we help managers understand a team member at a glance?
🧭 How might we surface the most relevant content for each associate's role and tenure?
⚙️ How might we consolidate approvals across Workday, ServiceNow, Concur, and DocuSign?
🎯 How might we make onboarding a guided journey, not a checklist in an email?
🔒 How might we eliminate PIN/VPN friction without compromising InfoSec requirements?
08 · Tradeoffs & Hard Decisions

The choices that defined the product

At Principal level, the most important design work happens in the choices you don't make. Here are five moments where I had to choose a direction, defend it with research data, and live with the organizational consequences.

1
AI Inbox: Full Automation vs. Human-Verified Approvals
❌ Considered
Fully automated AI approvals — system processes and submits directly. No confirmation step. Frictionless, fast.
VS
✅ Chosen
Human-in-the-loop: AI prepares the action, manager reviews and explicitly confirms before any submission.
Why: Usability testing showed users explicitly wanted a "Submit button to double check and prevent mistakes." At enterprise scale, a mis-approved transfer affects an associate's livelihood. The confirmation tap costs nothing; the trust cost of a wrong AI action is devastating. This "Confidence Gate" pattern directly contributed to the 92.5 SUS score — users felt in control.
2
Chatbot Design: Open-Ended Conversation vs. Guided Task Selection
❌ Initial Direction
Free-form chat — associates type anything, NLP interprets intent. Mirrors consumer AI (ChatGPT style).
VS
✅ Redesigned To
Structured task-flow with pre-defined categories + guided steps. NLP handles intent disambiguation within a scoped action set.
Why: In the first usability test, 100% of users didn't understand the chatbot before clicking it — many assumed it was Zoom or Teams. Open-ended conversation amplified this confusion. Structured guidance communicates capability upfront. This tradeoff cost us some "wow" factor but gained adoption — the 62.5% usage rate validated the approach. The AI rebranding from "chatbot" to "ActionBot / Workday Assistant" resolved the identity confusion entirely.
3
My Team Architecture: Task-Centered vs. Associate-Centered Mental Model
❌ Original Architecture
Task-centered: "Approvals," "Transfers," "Offboarding" as top-level navigation. Mirrors Workday's structure.
VS
✅ Redesigned To
Associate-centered: each team member is the entry point. Kelly opens Jason's profile — sees everything about him, all actions available, in one screen.
Why: Card sorting and interview research revealed a critical mental model mismatch: managers don't think "I need to do a transfer" — they think "I need to help Jason." Engineering initially pushed back on complexity (one associate view pulls from 4+ systems). I defended it with research data in a 4-in-the-Box session. The redesign shipped in Drop 2. SUS score: 92.5 vs. a projected 70-range with the original architecture.
4
Homepage Strategy: Curated "What's New" vs. ML-Driven Personalization
❌ Drop 0 Approach
Editorial homepage — "What's New" content, manually curated. Fast to ship, zero ML complexity.
VS
✅ Advocated For
ML-driven home screen ranked by role, tenure, and behavior. Month-1 associate sees onboarding nudges. VP sees team health insights. Same interface, personalized.
Why: Research showed users were frustrated that the homepage showed "what's new" instead of what was most relevant to them. A generic homepage actively erodes trust — if the app doesn't know you, it doesn't feel useful. I advocated for the ML approach with a phased roadmap balancing delivery velocity with long-term product quality. The "test and learn" personalization approach was approved by Product leadership.
5
My Org Feature: Stakeholder Priority vs. Research Reality
❌ Stakeholder Request
Keep "My Org" as a prominent primary nav feature in My Team — it was explicitly requested by business stakeholders and considered a differentiator.
VS
✅ Research-Backed Decision
Deprioritize standalone My Org. Redesign as contextual view within individual associate profiles. Add search bar. Surface it where managers need it, not as primary nav.
Why: 4 of 5 usability testing participants didn't understand who the people in My Org were or why it existed — they called it "a distraction." I presented direct quotes, task failure rates, and a proposed IA redesign to stakeholders. The feature shipped in its redesigned form in Drop 2. Post-launch testing on My Org tasks: 95 SUS. The hardest meeting of the project — and the most vindicated.
09 · Cross-Functional Leadership

UX as a strategic function, not a service team

At Principal level, design influence extends far beyond pixels. I operated as a strategic partner across Business, Product, Engineering, HR, Legal, and AI/ML — advocating for users when priorities conflicted, aligning teams around shared principles, and leading decisions that required organizational courage.

🏗 Engineering · 4itB

Defending the Associate-Centered Data Model

Engineering proposed a task-based API (actions as top-level endpoints) matching Workday's architecture. I pushed back with research showing managers think in people, not tasks. Led a joint design-engineering workshop to redesign the data contracts around associate profiles as the aggregating entity.

✓ Re-architected data layer shipped Drop 2 → enabled 92.5 SUS
📋 Product · 4-in-the-Box

Blocking a Confusing Feature from Shipping

Product stakeholders wanted My Org as a high-visibility Drop 1 feature. Research showed 4/5 users didn't understand it. I prepared a findings presentation with usability quotes, task failure rates, and a redesign proposal. Successfully deferred the original scope to Drop 2 with a research-validated replacement.

✓ Avoided shipping a confusing feature · Drop 2 My Org: 95 SUS
🤝 HR / People Tech / Legal

Translating HR Policy Into Interaction Design

The "Exiting an Associate" workflow required deep collaboration with HR and Legal to map policy rules (voluntary vs. involuntary exit, remote vs. in-office, final paycheck timing) to UX decision trees. Designed compliant, humane guided journeys during a vulnerable moment for both manager and associate.

✓ Compliant guided exit workflow · zero escalations post-launch
🔬 Research Operations

Building the Research Program from Zero

No standing UX research cadence existed when I joined. I established the full program: recruitment criteria, usability testing protocols, diary study design, card sorts, longitudinal surveys, and a findings repository that Engineering and Product actively referenced in sprint planning.

✓ SUS improved 73 → 92.5 across drops via research-driven iteration
⚡ AI/ML · Dialogflow · NLP

Co-Designing AI Exception Handling

When NLP intent parsing failed in early Dialogflow logs, I worked directly with the AI/ML team to identify model vs. UX failures. Redesigned confirmation and disambiguation flows that reduced user-facing errors while giving the model cleaner training signals from structured user responses.

✓ Chat abandonment reduced · 62.5% AI task completion rate
🔐 InfoSec Alignment

Governing AI for Production-Grade Security

Ensured the AI's data access strategy was reviewed and approved by Walmart InfoSec before shipping. Led tone governance work with the Living Design (LD) playbook team to define communication boundaries for high-stakes AI tasks like employee exits and demotions.

✓ AI shipped with full InfoSec approval · tone framework adopted org-wide

⚡ The Hardest Organizational Moment

Engineering estimated the associate-centered data model would add 6 weeks to the Drop 2 timeline. Product wanted to revert to task-centered to ship on schedule. I facilitated a three-way session with Engineering, Product, and a senior HR stakeholder — presenting research data and projecting what a task-centered model would score in usability testing. The 6-week extension was approved. The 92.5 SUS score vindicated it.

How I Operated at Principal Level

→

Led the 4-in-the-Box partnership: Business · Product · Engineering · Design

→

Presented findings directly to VPs and Sr. Directors to influence roadmap

→

Directed 3 Sr. UX Designers with creative direction + quality review

→

Created Confluence design libraries adopted as standards across the SPIE program

→

Facilitated cross-team design critique sessions to maintain quality across mini-apps

→

Managed vendor integration guidelines for 4 partner enterprise platforms

Scope of Influence

4+

Partner engineering orgs aligned

6+

Research studies designed & run

3

Sr. Designers directed

5+

Product drops influenced by UX strategy

10 · Measuring & Optimizing AI-Driven Outcomes

Design-led AI that's measured, not assumed

I operationalized a rigorous measurement framework to prove AI value — not just in sentiment scores, but in behavioral engagement, task completion, NLP exception analysis, and iterative optimization loops across every product drop.

AI Design Philosophy: "Safeguarded Automation"

The Me@ Chat wasn't built to feel smart — it was built to be trusted. Using Google NLP and Dialogflow, I designed a feedback loop where every exception in user interactions became data for the next design sprint. Confidence Gates (submit screens, back navigation, in-flight checklists) transformed risky automation into a trusted digital assistant. Goal: transform 20-step Workday processes into 2-step mobile conversations.
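The Confidence Gate idea reduces to a few lines. This is a minimal sketch assuming a simplified intent payload; the function and field names are hypothetical, not the actual implementation:

```python
# Sketch of the Confidence Gate pattern: the AI prepares the
# transaction, but nothing is submitted until the user confirms.
def run_ai_transaction(intent: dict, confirm) -> str:
    # Step 1 of 2: AI assembles the action and shows a review summary.
    summary = f"{intent['action']} for {intent['associate']}"
    # Step 2 of 2: an explicit confirmation gates the irreversible submit.
    if not confirm(summary):
        return "cancelled"  # backing out costs nothing; trust is preserved
    return f"submitted: {summary}"

# In the app, the confirm callback is the Submit tap on the review screen.
print(run_ai_transaction({"action": "transfer", "associate": "Jason"},
                         confirm=lambda s: True))   # -> submitted: transfer for Jason
print(run_ai_transaction({"action": "transfer", "associate": "Jason"},
                         confirm=lambda s: False))  # -> cancelled
```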

92.5
SUS — My Team
89
SUS — Mobile SPIE
62.5%
Ask Sam usage
19
AI use cases
73→92.5
SUS progression

Measurement Framework

🎯 SUS Score Progression

Tracked SUS every drop: 73 (Drop 0 baseline) → 89 (SPIE mobile) → 92.5 (My Team). Each improvement was tied to specific design changes surfaced by research, creating a closed evidence loop.
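For readers unfamiliar with SUS, the scores cited throughout come from the standard 10-item System Usability Scale formula, sketched here:

```python
# Standard SUS scoring (10 items, 1-5 agreement scale), shown to make
# scores like 73 and 92.5 concrete. Odd-numbered items contribute
# (response - 1), even-numbered items (5 - response); the sum is
# multiplied by 2.5 to yield a 0-100 score.
def sus_score(responses):
    assert len(responses) == 10 and all(1 <= r <= 5 for r in responses)
    total = sum(
        (r - 1) if i % 2 == 0 else (5 - r)  # index 0 is item 1 (odd item)
        for i, r in enumerate(responses)
    )
    return total * 2.5

print(sus_score([3] * 10))  # -> 50.0 (all-neutral answers land mid-scale)
```

A score of 92.5 therefore requires near-uniformly strong responses on all ten items from the participant pool.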

📈 Behavioral Engagement Tracking

WAU (weekly active users) and feature-level usage rates. Ask Sam AI: 62.5% vs. Manager Approvals App: 1.3% — this contrast directly informed next-sprint priorities. Low-usage features got redesigned, not buried.

🧠 NLP Exception Analysis

Analyzed Dialogflow conversation failures with the AI/ML team. Each "exception" — where the AI misunderstood intent — became a design prompt for refining dialogue flows and task categorization in the next iteration.

💬 In-Task Feedback Capture

Designed feedback prompts at the end of each AI use case, creating real-time signal loops between user experience and engineering optimization — a continuous learning pipeline, not a one-time evaluation.

Business & User Metrics

92.5
SUS Score

My Team — rated "Excellent" (industry top 10%)

2 hrs
Productivity Target

Daily per associate via Integrated Inbox

62.5%
AI Feature Adoption

Ask Sam usage — validating the AI value prop

−100
NPS Baseline

Drop 0 → continuous improvement target per drop

67%
CSAT (Drop 0)

Initial satisfaction baseline with improvement roadmap

WAU
Weekly Active Users

Primary engagement KPI tracked per feature per drop

11 · UX Guidelines & Principles

The rules I wrote — and defended

These weren't suggestions. They were guardrails I established to ensure every design decision across mini-apps stayed consistent, scalable, and purposeful — even as the team grew and new features were added by other designers.

🎯 Action or Benchmark-Oriented

Every insight must lead to a clear, actionable next step. If a user can see a metric but can't do anything about it, it's noise — not UX.

📏 Scale Content, Not Complexity

Add value without overwhelming the UI. More features never means more interface complexity. Ruthlessly prioritize what's visible at any moment.

βš›οΈ Atomic / Object-Oriented Design

Promote reuse and consistency across the super-app ecosystem. Every component is designed once, used everywhere β€” coherence at scale.

πŸ“± Mobile-First, Desktop-Accurate

Mobile for quick checks and time-sensitive approvals. Desktop for high-precision, data-heavy tasks. Never compromise one platform for the other.

🤖 Human-in-the-Loop by Default

Every AI workflow includes a confirmation step. Associates must feel in control before the system takes irreversible action on their behalf.

🧩 Personalization Without Configuration

Use ML to surface relevant content by role, tenure, and behavior — but never make the user configure the intelligence. It should just work.
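The last guideline can be sketched with a rule-based stand-in for the ML ranking. The content cards, audience labels, and weights below are all hypothetical; the key property is that relevance comes from role and tenure signals, never from settings the associate must configure:

```python
# Hypothetical content cards and weights, for illustration only.
CARDS = [
    {"title": "Onboarding checklist", "audience": "new_hire", "boost": 3},
    {"title": "Team health insights", "audience": "people_leader", "boost": 2},
    {"title": "Open enrollment reminder", "audience": "all", "boost": 1},
]

def rank_home(role: str, tenure_months: int):
    """Order home-screen cards by inferred relevance, zero user config."""
    def score(card):
        s = card["boost"]
        if card["audience"] == "all":
            s += 1  # broadly relevant content gets a small floor
        if card["audience"] == "new_hire" and tenure_months <= 3:
            s += 5  # month-1 associates see onboarding first
        if card["audience"] == "people_leader" and role == "manager":
            s += 5  # managers see team health first
        return s
    return [c["title"] for c in sorted(CARDS, key=score, reverse=True)]

print(rank_home("ic", 1)[0])        # -> Onboarding checklist
print(rank_home("manager", 48)[0])  # -> Team health insights
```

A learned ranker would replace the hand-tuned rules, but the interface contract stays identical: same home screen, different ordering per person.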

12 · Design System Contribution

Platform-level thinking — not just feature design

One of the most overlooked Principal-level contributions on this project: I defined the design system foundation that enabled all subsequent mini-apps to ship with consistency, speed, and coherence. These weren't design decisions for one screen — they were decisions for the entire platform.

⚛️

Atomic Component Library

Cards, pills, action items, and inbox patterns designed for reuse across all mini-apps. Adopted by 3 supporting designers.

🔤

Typography Hierarchy System

Label · Body · Title · Display hierarchy applied consistently across mobile and desktop experiences.

🎨

AI Interaction Patterns

Standardized confirmation gates, back-step navigation, disambiguation flows, and feedback prompts for all AI workflows.

📱

Mobile/Desktop Breakpoint Standards

Defined which interactions belong on mobile vs. desktop, preventing the "bolted on" feel of single-platform designs.

🗺️

Navigation Shell Framework

The modular mini-app container architecture that allows new experiences to be deployed without disrupting the ecosystem.

📋

Confluence Design Guidelines

Documented standards adopted as the official Me@SPIE design specification across the entire program.

Why This Matters at FAANG Level

Principal designers at FAANG aren't measured by how many screens they designed β€” they're measured by how much leverage their work creates. A design system that enables 3 designers to work at the quality of 10 is a 3x force multiplier.

The atomic design principles I established meant that every new mini-app (Learning, Career, Health) could be designed and shipped without reinventing interaction patterns β€” each launch was faster than the last.

13 Β· Accessibility

Inclusive design at enterprise scale

Accessibility was not a checklist item on this project — it was a design quality standard. With 1.6M+ associates of varying abilities using this platform daily, every accessibility improvement was real impact at real scale.

👁️

Low-Contrast UI Audit & Remediation

Identified multiple low-contrast text and icon combinations across the platform during design critique sessions. Documented WCAG 2.1 AA violations and worked with engineering to remediate across all affected screens before launch.

WCAG 2.1 AA · Visual
⌨️

AI Chat Keyboard & Screen Reader Accessibility

Designed the conversational AI flows to be fully navigable via keyboard, with appropriate ARIA labels for dynamic content updates. Screen reader users could execute Workday transactions without mouse interaction.

WCAG 2.1 AA · Motor + Screen Reader
📱

Mobile Accessibility — Touch Targets & Dynamic Text

Enforced minimum 44×44pt touch targets across all interactive elements on mobile. Designed layouts to support Dynamic Type scaling without breaking information hierarchy, particularly in the Integrated Inbox approval cards.

Apple HIG · Android Material · Touch
🔔

Notification & Alert Accessibility

Designed notification patterns that worked across visual, auditory, and vibration modalities β€” ensuring time-sensitive approvals were perceivable by associates with hearing or visual impairments.

Multi-modal · Perceivable
14 · What I Delivered

Solving a Fortune #1 problem, end-to-end

Concrete deliverables and design outcomes β€” the kind that prove a Principal-level UX designer doesn't just execute screens, but shapes strategy, proves value with data, and builds systems that scale.

πŸ—

Built a Scalable Super-App Framework (from scratch)

Independently defined the UX architecture for Me@HomeOffice β€” a modular mini-app system consolidating Workday, ServiceNow, Concur, and DocuSign. Established design principles, pattern library, and interaction model adopted by all downstream teams.

📊

Achieved a 92.5 SUS Score — Industry "Excellent" Tier

Through research-backed iterative design, drove the My Team mobile experience from a 73 SUS (Drop 0 baseline) to 92.5 — placing it in the top 10% of enterprise software usability globally. Each improvement was traceable to a specific research finding and design decision.

🤖

Designed an AI Conversational System Across 19 Workday Use Cases

Led UX for Me@ Chat — a conversational AI that executes real Workday transactions in 2 steps vs. 20. Designed verification flows, error handling, NLP exception loops, and feedback capture to build trust and drive adoption to 62.5%.

📥

Engineered the Integrated Inbox — Cross-Platform Approval Hub

Designed the one-stop approval system for Workday, ServiceNow, Concur, and DocuSign, with AI surfacing the most pressing items first. Target: recover 2 hours of daily productivity per associate by eliminating context switching.

🧭

Invented the POI Framework — "Discover, Find, Do"

Created the strategic campus navigation model for Me@Walmart Campus. Associates don't just find a room — they get context-aware guidance, directions, and services delivered proactively, anticipating the next need before it's expressed.

♿

Established Accessibility Standards Across the Platform

Conducted accessibility audits across the platform, identified WCAG 2.1 AA violations, and drove remediation with engineering. Established accessibility as a design quality standard — not a post-launch checklist — for all subsequent drops.

15 · Design Artifacts

The work, up close

Key design artifacts from the Me@Walmart SPIE project — wireframes, journey maps, AI flow diagrams, research synthesis boards, and annotated screens. Each artifact documents a specific design decision, iteration, or research insight.

πŸ—ΊοΈAdd your artifact
Journey Map
Manager Onboarding Flow β€” End-to-End
Before vs. After Me@ β€” 8 systems β†’ 1 guided flow
πŸ“±Add your artifact
Wireframe Β· Mobile
My Team β€” Associate-Centered Redesign
IA shift from task-centered to associate-centered
πŸ€–Add your artifact
AI Flow Β· Conversation Design
Me@ Chat β€” Transfer Task Flow
20 steps β†’ 2 steps with HITL verification gate
πŸ“₯Add your artifact
Visual Design Β· Mobile
Integrated Inbox β€” AI Priority View
Cross-platform approvals: Workday, ServiceNow, Concur, DocuSign
🧭Add your artifact
Framework Diagram
POI Framework β€” Discover Β· Find Β· Do
Campus navigation strategy β€” context-aware service discovery
πŸ“ŠAdd your artifact
Research Synthesis
Usability Testing β€” Drop 1 vs. Drop 2
SUS progression Β· Key insight clusters Β· Design responses
πŸƒAdd your artifact
Research Β· Card Sort
IA Card Sort β€” Associate Profile Taxonomy
Mental model alignment for financial, career, and team data
πŸ–₯️Add your artifact
Visual Design Β· Desktop
Me@ Homepage β€” ML-Personalized Dashboard
Role-aware, tenure-sensitive content surface


16 · Retrospective — What I'd Do Differently

What the project taught me

A Principal designer who can't reflect critically on their own work isn't growing. Here are five honest lessons from Me@Walmart — the things I'd approach differently, the constraints I'd challenge earlier, and the bets I'd make again without hesitation.

πŸ” Ship Research Findings Faster

Our research-to-design cycle was thorough but slow. In hindsight, I'd establish a weekly "insight standup" with Product and Engineering to share findings in real-time, rather than batching them into reports. Speed of insight β†’ speed of iteration.

⚡ Challenge the Drop Model Earlier

The "Drop 0 → Drop 1 → Drop 2" release cadence created pressure to ship features before they were research-validated. I'd push earlier for a "validated learning" gate before any feature enters a drop, making research an explicit input to sprint planning rather than a parallel track.

✅ Involve InfoSec Earlier

Involving InfoSec in AI design earlier — at the wireframe stage, not post-prototype — would have saved significant rework on the AI Chat security review. Security is a design constraint, not a launch gate. I'd make InfoSec a standing member of the 4-in-the-Box for any AI workflow.

♿ Accessibility From Day One, Not Day Thirty

Accessibility audits caught issues in the design review phase rather than at conception. In future projects, I'd integrate accessibility constraints into the design brief itself and test with assistive-technology users in early-stage research, not only in final usability rounds.

🤖 Define AI Success Metrics Before Building the AI

We established NLP performance metrics mid-project. In hindsight, defining what "good AI" looks like (task success rate, exception-rate targets, confidence thresholds) before engineering begins gives the whole team shared success criteria — and makes tradeoff conversations much faster.

📈 I'd Make the Same Big Bet Again

The associate-centered mental model redesign was the most controversial decision on the project — it added time, caused organizational friction, and required me to push back hard against stakeholders. The 92.5 SUS score proved it was right. I'd make that bet again, every time, with the same research behind me.

The biggest design lesson from Me@Walmart: research-backed conviction, delivered with intellectual humility, is the most powerful tool a Principal designer has. Not the prettiest prototype. Not the fastest ship date. The ability to say "the data says this, and here's why it matters" — and then be right.

— Personal Retrospective · Me@Walmart SPIE Program

Rashid · Principal UX Designer · rashid5777.com

Me@Walmart (SPIE) — UX Case Study · Walmart Global Tech · Available for Principal UX roles at FAANG & Enterprise AI companies