Roga is a health-tech startup advancing neurotech and AI to combat stress and burnout through a wearable device paired with a mobile app. The hardware was innovative, but the app experience was lacking: only 20% of users stayed engaged, and 40% returned their devices due to overwhelming onboarding, an unclear value proposition, and confusing setup.
Working in a fast-paced startup environment, I designed the end-to-end mobile app experience through lean research synthesis, collaborative ideation, and building a scalable design system in Figma. We increased session engagement from 20% to 80% and reduced device returns from 40% to 15% - directly improving both user experience and business sustainability.

Company confidential information has been obscured in this case study due to a non-disclosure agreement.
A PROMISING PRODUCT
Roga had developed something genuinely innovative: a wearable neurotech device paired with a mobile app that delivers vagus nerve stimulation and mindfulness sessions to help users manage stress. The hardware worked and the science was sound, but the software experience was failing users at their most vulnerable moment: the first interaction.
The stakes were high. In healthtech, first impressions aren't just about aesthetics; they're about trust. When someone invests in a stress-management device, they're already stressed. An overwhelming app doesn't just frustrate; it reinforces the very problem the product promises to solve.
The Weight of Too Many Questions
Through previous testing data, reviews, and customer feedback, a clear pattern emerged. During stakeholder meetings, we validated these findings: the initial questions felt overwhelming, as if users were being interrogated rather than welcomed, a sentiment echoed repeatedly in feedback across channels.
"It's not really telling me what the objective of the app is. It would be better if I could tell that I can use it with or without the device. Lost the objective to really teach me what the point of the app is."
- User feedback
Beyond unclear purpose, users who did complete onboarding struggled to see their progress:
"It's what I expected. I felt like the program should be structured in a way to help you get better. How can you feel like you've had progress? At the top maybe say how many programs you've completed."
- User feedback
These weren't just UX complaints, they were business problems:
40% device return rate was unsustainable
Customer support couldn't scale with user acquisition
Only 20% engagement meant we were losing 8 out of 10 users
The disconnect between hardware quality and software experience was damaging brand perception
We didn't have the luxury of months of user research. UX resources were thin. We were a small team moving quickly, which meant no time for lengthy field studies, a limited budget for user testing, and pressure to show results fast.
How might we dramatically improve the experience using only the data and resources we already have?
Learning Fast with What We Had
Rather than starting from scratch, we turned to the goldmine we already possessed:
Previous Testing Data & Customer Feedback: Reviews and customer feedback revealed exactly where users got stuck: device pairing failures, confusion about the app's purpose, questions about progress tracking.
Previous User Interviews: Stakeholders had conducted user feedback sessions during device beta testing. We analyzed transcripts and notes to identify recurring themes and emotional pain points.
Existing App Audit: We conducted a comprehensive audit of the existing mobile app, evaluating every screen and flow to identify specific issues and noting recommendations to address them.
Stakeholder Perspectives: We met with the CEO and key team members to understand their expectations for the redesign and align on what success looked like from both user and business perspectives.
From these diverse sources, we identified three critical friction points: users couldn't understand what the app did (value proposition), felt overwhelmed by upfront questions (onboarding burden), and struggled to navigate content once inside (findability).
When we analyzed the data gathered and identified pain points, a pattern emerged: 33% were content-related issues, 30% involved value proposition and onboarding problems, and 17% centered on the device and overwhelming question patterns. This data shaped our recommendation to prioritize clarity of purpose, streamlined onboarding, and improved content organization.
From this analysis, we reframed the core problem:
How might we help users confidently understand the app's purpose and get to their first stress-relief session without feeling overwhelmed?
Unclear value communication and excessive upfront questions were causing users to abandon the app before experiencing any benefit. We needed to focus on clarity and speed: establishing trust through understanding, then accelerating time to value whilst gathering essential information in a way that didn't feel burdensome.
Given our constraints, we made deliberate choices about what not to tackle in v3:
We deferred advanced personalization features (they require data we didn't have yet)
We kept existing session types rather than creating new content
We postponed social and sharing features to focus on core individual experiences
This allowed us to ship faster and validate fundamental improvements before adding complexity.
To get the entire team invested in solving these challenges, we designed and facilitated individual ideation spaces in Figjam. This asynchronous approach accommodated teammates' availability while ensuring everyone's perspective shaped the solution. Ideas emerged around progressive onboarding, visual stress assessments, reorganized content libraries, and improved troubleshooting flows.

Asynchronous ideation workshop generated solutions across key problem areas, with the entire team contributing ideas on their schedule.
I also audited the entire pre-recorded content library, which had grown organically without clear structure. Working with the team, we studied how competitors like Calm and Headspace organized mindfulness programs. We sorted sessions into intuitive topics that matched how users actually think about stress relief: by need (sleep, emotions, focus).
With problem areas identified, solutions ideated, and content reorganized, I focused on the infrastructure that would let us build and iterate quickly.
The Foundation:
Before diving into high-fidelity designs, I built a comprehensive Figma design system: reusable, documented components covering buttons, navigation, forms, cards, and data visualizations. Given the breadth of redesign ahead, establishing this foundation would prevent inconsistencies and technical debt as we iterated rapidly. This investment paid future dividends. During the v4 redesign, the established patterns accelerated design work by 60% and reduced development time by 30%, allowing the team to focus on solving user problems rather than reinventing components.

Comprehensive Figma design system with documented components became the foundation for rapid iteration, reducing v4 project time by 30% compared to v3.
App v3:
Over 6 months, from ideation to launch, we systematically addressed each identified problem area. While not all features shipped in the initial release, we validated designs through prototype reviews with the CEO and team testing, gathering feedback from investors and colleagues to refine the experience before development.
Communicating Value Upfront
I introduced an intro screen that communicated the app's purpose and device functionality clearly, helping users understand what they could achieve before beginning setup.
Intro screen communicating app value and device functionality upfront, setting clear expectations before users began onboarding.
A Welcoming Onboarding Flow
I redesigned the onboarding to gather only essential information, giving users control over how many questions they wanted to answer. The focus was on reducing clicks and input time, making responses as snappy as possible to minimize effort.
Optional questions reduced psychological burden and eliminated the interrogation feeling, driving onboarding completion to 95%.
The Stress Assessment Reimagined
Rather than vague metrics, I refined the stress assessment flow based on established psychological principles with clear stress scores, supportive explanations, recommended usage patterns, and visual progress tracking.
Users could complete this during onboarding or revisit it anytime within the app, making it a flexible tool for measuring progress throughout their journey rather than a mandatory gate.
This gave users something the old app never did: clarity about where they were and where they were going.
New stress assessment breaks down scores into understandable components with recommendations, helping users understand their stress levels without feeling overwhelmed or judged
A Redesigned Content Library
I created a new content library housing individual sessions and series (previously called "Programs"), organized by topic and intent to match user needs in the moment rather than our internal production structure.
Reorganized content library with intuitive categories, making session discovery effortless.
Device Connection That Actually Worked
I completely redesigned the device pairing flow with visual setup guides, real-time pairing indicators, integrated troubleshooting guides, and step-by-step recovery paths.
Redesigned device pairing with clear visual feedback and integrated troubleshooting guides that anticipated common issues before users encountered them, dramatically reducing support dependency.
Increase in user engagement from 20% to 80% active users
95% onboarding completion rate
Reduction in support requests
Reduction in device returns from 40% to 15%
The app was finally beautiful, functional, and most importantly, genuinely helpful. User feedback highlighted the visual appeal, but more meaningfully, users mentioned feeling less confused and more motivated.
What We Learned & What We Missed
The redesign succeeded in simplifying core interactions and removing friction. Users who made it through onboarding now stayed engaged. The design system accelerated our ability to iterate quickly. The collaborative approach with the dev team meant we built features that were technically sound, not just theoretically ideal.
v3 successfully increased engagement by 60 percentage points - a massive win. But analytics revealed that about 45% of users were skipping the stress assessment questions. Skipping was an option by design, but the rate suggested the assessment itself could be improved.
Although app onboarding improved, the user journey revealed a gap between physically unboxing the device, setting it up, and actually starting that first session. Users still experienced uncertainty during this critical transition. We'd made things better, but we hadn't yet made them seamless.
App v4:
When Roga prepared to launch its v2 device, a new hardware product with new capabilities, we had a chance to apply what we'd learned. Research data showed users were eager to start using the device as soon as possible whilst still gaining clarity on important device-usage information upfront. I collaborated on refining the onboarding flow and the visual redesign.
We improved the intro screen to more effectively communicate the app's value and purpose, show important device usage information upfront, and set clear expectations before users began setup. This refinement gave users better context before commitment.
Enhanced intro screen in v4 (left) provides clearer context about the app's purpose and device functionality upfront, addressing the clarity issues users raised in v3 (right) feedback.
Learning from user behavior, the onboarding was redesigned to focus entirely on getting users to connect their device and into their first session as quickly as possible. Instead of front-loading information, we shifted to progressive disclosure, teaching users what they needed to know exactly when they needed to know it.
v4 onboarding focuses on device connection first, deferring non-essential questions to progressive disclosure during usage.
The new v2 device supported multiple stimulation modes and could receive new modes through firmware updates. I redesigned the device usage experience to accommodate this flexibility, ensuring users could discover and benefit from new capabilities.
Redesigned device usage experience supporting multiple stimulation modes and future firmware updates, building flexibility into the system for evolving product capabilities.
App v4 achieved what v3 hadn't fully solved: reducing the unboxing-to-first-session time by 75%, finally making the transition from physical device to digital experience seamless.
Takeaway
Progressive Disclosure Beats Comprehensive Onboarding
Users don't want to learn everything upfront. They want to experience value, then learn what they need when they need it. Teaching in context, not in advance, drives better comprehension and retention.
Constraints Force Better Solutions
Limited research resources pushed us to extract maximum insight from existing data. We couldn't afford perfection, so we shipped improvements iteratively. This scrappy approach meant we learned faster than if we'd waited for "enough" research.
Design Systems Are Strategic Infrastructure
The upfront investment in the design system felt slow initially but enabled rapid iteration when we needed it most. Having established patterns meant the team could focus on solving user problems rather than reinventing components, turning weeks of work into days. More importantly, when stakeholders requested last-minute changes, we could accommodate them without derailing timelines.
Collaboration Beats Handoffs
Working closely with the CEO and dev team - understanding technical constraints, discussing tradeoffs, iterating together - resulted in better solutions than design-then-handoff ever could. Constraints became creative opportunities rather than roadblocks.
What I'd Do Differently
If I were starting this project today, I'd push harder for lightweight usability testing during v3 development. Even 5 unmoderated tests would have revealed the 45% skip rate before launch, letting us iterate faster. I'd also advocate for quantitative baseline metrics earlier; we didn't have precise tracking until midway through v3 development, which made it harder to measure incremental improvements.
Working on Roga reinforced that great design emerges from collaboration, trust, and shared ownership. Here's what one of the developers I worked with had to say:
"I highly recommend Nate as an exceptional UI/UX designer and team member. He's an awesome guy with truly impressive skills in design and Webflow. What sets him apart is his ability to function as both a talented designer and an idea maker. He's always positive, assertive, and ready to get the work done efficiently without compromising quality. His communication skills are outstanding - he takes nothing for granted and makes sure everyone fully understands the issues at hand before working them out as a team."
Juan Rueda
Software Developer, Roga
This project taught me that solving the right problem matters more than building the perfect solution. v3 successfully increased engagement by 60 percentage points, but it took stepping back to examine the entire user journey, from unboxing to first session, and asking "what are users actually trying to accomplish?" to unlock v4's improvements.
The foundation we built - the design system, the iterative process, the collaborative approach - means Roga can continue evolving the experience as the product grows, without rebuilding from scratch each time. That's the real measure of sustainable design work.
More importantly, this reinforced what I believe about great design: it happens at the intersection of user empathy, business constraints, and technical reality. It's messy, iterative, and collaborative - exactly how I want to keep working.
Ami Lebendiker (CEO)
Alison Smith (Chief Science Officer - v3)
Olawale Morenikeji (UX Designer - v3)
Agustin Apruzzese (Industrial Designer - 3D Renders)
Shak Saleemi (Software Developer - v3)
Juan Rueda (Software Developer)
Juan Diego Diaz (Software Developer)
Isabel Carreras (UX Researcher & Designer - v4)
Nathan Osei (UX & Visual Designer)