Acuity • DESIGNPRENEURS HACKATHON • Feb 28, 2026

Helping people recognize early cardiovascular warning signs before a crisis.

Role
Product Designer
Timeline
24 hours
Date
February 28, 2026
Context
Team: Kritika Singh, Sooim Kang, Lucy Trepanier • TNS AI Startup Design Hackathon, Parsons School of Design
Tools
Figma Make, Midjourney, Sora, ChatGPT, Claude

Helping People Recognize Early Cardiovascular Warning Signs Before a Crisis

ACUITY is an AI-powered wearable health system designed to help people move from uncertainty to action. The system continuously monitors biometric signals, learns each user's personal health baseline, and translates deviations into clear risk classifications. The result is a product that removes the guesswork between feeling something is wrong and knowing what to do about it.

The concept was developed in 24 hours at the Designpreneurs Hackathon at Parsons School of Design, a competition challenging teams to design AI systems for a 2035 world. The team advanced from a human story to a system concept, functional prototype, product demo video, and final pitch presentation within a single day.

  • 50% of people feel warning signs before cardiac arrest, but only 20% act (Marijon et al., 2016)
  • 31% higher cardiovascular risk in underserved communities (CDC, 2024)
  • 9 in 10 adults lack the health literacy to know when to seek care (CDC, 2024)
  • Most cases go undetected (ACC, 2024)

The Most Dangerous Moment in Healthcare Is the Uncertainty Before the Emergency

People experiencing early cardiovascular symptoms frequently cannot determine whether what they are feeling is serious enough to warrant action. The warning signals are present, but no translation layer exists between symptom and response.

Research testimonials collected during discovery

Heart Attack Survivor: "I kept telling myself it was just stress. Turns out, it was my heart."
Family of a Victim: "It wasn't the cardiac arrest that killed him. It was his fear."
Pre-Med Student: "I still don't know if this is serious enough to trouble the doctor."

Existing wearables are highly capable at measuring health data. However, measurement without interpretation leaves users in precisely the same state of uncertainty they began with. Data collection is not the problem. Clarity is.

Uncertainty at the moment of decision
Warnings present, interpretation missing

Understanding How Uncertainty Prevents Action

The project was grounded in existing cardiovascular research and primary user testimony to understand the conditions under which people delay or avoid care even when symptoms are present.

Symptoms do not follow textbook patterns. Many early warning signs—including dizziness, chest discomfort, and sudden fatigue—are subtle and inconsistent. Users tend to rationalize these experiences as stress or tiredness rather than recognizing them as signals that warrant attention.

Health literacy gaps are systemic. Nine out of ten adults lack the health literacy to independently assess when to seek medical care. This reflects a fundamental gap in how healthcare systems communicate urgency to general populations.

High-risk populations remain underserved. Communities with 31% higher cardiovascular risk are also least likely to have access to tools capable of detecting early warning signs before escalation. The population most in need is the one most frequently missed by existing systems.

Fear delays action as much as a lack of information. Users may fear being incorrect, inconveniencing a clinician, or confronting a diagnosis. Effective design must reduce friction at the moment of decision, not only at the point of data collection.

Subtle, inconsistent symptoms that get rationalized
Health literacy + fear reduce timely action

Making the Body's Signals Legible and Actionable Before Crisis

The initial design direction targeted a broad health monitoring platform. General health tools, however, tend to produce shallow impact because their scope is too wide and their outputs too abstract to drive behavior change in high-stakes moments.

The focus was narrowed to a specific and underserved window in the patient experience: the interval between “something feels off” and “I know what to do.” The guiding design question became: how might we help people understand when something is wrong and take action before it is too late?

Design principles

  • Translate rather than measure: Surface meaning from signals rather than presenting raw data
  • Reduce friction at the moment of decision: Deliver one clear next action rather than a dashboard
  • Build for the highest-risk users: Design for the population most likely to be missed by existing systems
  • Earn trust through personalization: Learn each user's baseline before classifying risk
One clear next action per decision
Personal baseline modeling for trust

Designing a System Rather Than an Interface

ACUITY is constructed across three integrated layers: wearable sensing, AI interpretation, and app-guided action. These layers connect into a single decision system oriented around one output.

System architecture Three layers, one decision.

  • Stable: Signals remain within the user's personal baseline. No action is required.
  • Elevated: Meaningful deviation has been detected. A medical evaluation is recommended.
  • Emergent: Significant changes paired with reported symptoms require immediate attention.
Stable, Elevated, and Emergent decision tiers
Risk classification logic and state transitions
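The three-tier logic can be sketched as a simple classification function. This is an illustrative sketch only; the threshold values below are hypothetical placeholders, not ACUITY's actual model or clinical cutoffs.

```python
from dataclasses import dataclass

@dataclass
class Reading:
    """A snapshot of signals, expressed as deviation from the user's baseline."""
    deviation_sigma: float   # distance from personal baseline, in standard deviations
    symptoms_reported: bool  # user-logged symptoms (dizziness, chest discomfort, etc.)

def classify(reading: Reading) -> str:
    """Map a reading to one of the three risk tiers.

    Threshold values are illustrative assumptions, not clinical values.
    """
    if reading.deviation_sigma >= 3.0 and reading.symptoms_reported:
        return "Emergent"   # significant change + reported symptoms: immediate attention
    if reading.deviation_sigma >= 2.0:
        return "Elevated"   # meaningful deviation: medical evaluation recommended
    return "Stable"         # within personal baseline: no action required
```

Note how symptoms act as a gate on the Emergent tier: a large biometric deviation without reported symptoms stays Elevated, reflecting the system's bias toward one clear, proportionate next action.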

What ACUITY detects

  • Blood pressure deviations from the user's established personal baseline
  • Heart rate variability patterns over time
  • Sleep quality and consistency metrics
  • User-logged symptoms (because physiological changes do not always appear in biometric data)
Multi-signal biometric inputs and sensing overview
User-logged symptom layer alongside biometric data

AI engine

  • Individual baseline modeling that learns what is physiologically normal for each specific user
  • A risk classification engine that interprets deviation across multiple signal types simultaneously
  • Time-series anomaly detection that identifies patterns before they reach critical thresholds

Strategic focus: the design narrows to patients experiencing early cardiovascular symptoms that do not follow classic textbook presentations—because this group is frequently underserved and most likely to benefit from early intervention.

Personal baseline modeling and calibration period
Multi-signal interpretation with time-series anomaly detection
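The baseline-then-deviation idea at the core of the AI engine can be illustrated with a minimal rolling-window sketch. Window size and the z-score rule are assumptions for illustration, not ACUITY's production model.

```python
import statistics
from collections import deque

class BaselineMonitor:
    """Learns a personal baseline over a calibration window, then scores deviations.

    A minimal sketch of the concept: real multi-signal, time-series anomaly
    detection would be considerably more sophisticated.
    """
    def __init__(self, calibration_size: int = 14):
        # e.g. one reading per day across a 14-day calibration period
        self.history = deque(maxlen=calibration_size)

    def add(self, value: float) -> None:
        self.history.append(value)

    def deviation(self, value: float) -> float:
        """How many standard deviations `value` sits from the learned baseline."""
        if len(self.history) < 2:
            return 0.0  # still calibrating: not enough data to judge
        mean = statistics.mean(self.history)
        stdev = statistics.stdev(self.history)
        if stdev == 0:
            return 0.0
        return abs(value - mean) / stdev

# Example: 14 days of resting heart rate around 62 bpm, then a sudden jump.
monitor = BaselineMonitor()
for hr in [61, 63, 62, 60, 64, 62, 61, 63, 62, 61, 63, 62, 60, 64]:
    monitor.add(hr)
```

Because the baseline is learned per user, the same absolute reading can be routine for one person and a large deviation for another, which is what makes the risk classification personal rather than population-based.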

Designing the Core Interaction Under Constraint

Within the 24-hour hackathon timeline, rapid prototyping and AI-assisted tools were used to develop and evaluate the core interaction model. Figma Make enabled fast generation and iteration of the mobile app interface and interaction flows. Midjourney supported visual concept development for the wearable hardware form factor.

Key interaction decisions

  • Two wrist vibrations signal the need for attention. The pattern is subtle enough not to alarm and distinct enough to prompt engagement.
  • A deliberate pinch gesture activates the care card on the paired device, requiring intentional input rather than accidental contact.
  • A wave gesture dismisses an alert. Holding the pinch keeps the system in active monitoring mode.
  • No dashboards or raw metrics are surfaced to the user. The interface delivers one recommended next action per alert.

The gesture-based interaction model and three-tier risk classification performed well during prototype evaluation sessions. Users responded more effectively to being told what to do than to being shown data. Clarity consistently outperformed completeness as a guiding design value.

Wrist vibrations + pinch/wave gesture flow
Clarity beats completeness in high-stakes moments
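The gesture flow above is small enough to express as a state table. States and gesture names follow the case study; the exact transitions are an illustrative assumption, not the shipped behavior.

```python
# Minimal state-machine sketch of the alert interaction.
TRANSITIONS = {
    # (current_state, gesture) -> next_state
    ("alerted", "pinch"): "care_card",                  # deliberate pinch opens the care card
    ("alerted", "wave"): "monitoring",                  # wave dismisses the alert
    ("care_card", "pinch_hold"): "active_monitoring",   # holding the pinch keeps active monitoring
}

def next_state(state: str, gesture: str) -> str:
    """Return the next interaction state; unrecognized gestures change nothing."""
    return TRANSITIONS.get((state, gesture), state)
```

Keeping unrecognized gestures as no-ops mirrors the design intent: activation requires intentional input, so accidental contact never changes the system's state.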

A Personal Health Decision Layer: From Signal to Action

ACUITY operates less like a traditional fitness tracker and more like a personal health decision layer. The final product integrates continuous wearable sensing, AI-driven baseline modeling, and a minimal app interface into a unified experience designed around one central question: What action, if any, should be taken next?

Core product features

  • A wearable ring continuously monitors blood pressure, heart rate variability, sleep quality, and cardiac patterns
  • A 14-day baseline calibration period personalizes risk thresholds to the individual user's physiology
  • Three-state risk classification delivers Stable, Elevated, or Emergent status at any given moment
  • Gesture-based interaction eliminates the need for complex interface navigation under conditions of stress or uncertainty
  • A care card surfaces one clear recommended action per alert, removing ambiguity from the response process

Demo and pitch: a short product demo video was produced for the final presentation to illustrate how ACUITY supports a user experiencing early cardiovascular symptoms. The pitch was structured around a single core insight: people do not lack health data. They lack clarity about what that data means and when to act on it. ACUITY closes that gap.

A care card that delivers one clear recommended action
14-day baseline calibration for personalized risk thresholds

From Individual Clarity to Systemic Change

User: clarity on when to act before a crisis occurs; reduced anxiety resulting from health uncertainty; personalized risk interpretation in place of raw data.

Business (Insurers): $6M saved per 50,000 high-risk members; 2 to 5x ROI on PMPM investment; 100 or more hospital admissions prevented annually.

Healthcare System: earlier intervention at significantly lower cost; reduced emergency room burden from preventable events; scalable reach into underserved, high-risk populations.

Business model: ACUITY is structured for B2B scalability, funded by health insurers through a per-member-per-month (PMPM) model with zero cost to the end user—positioning ACUITY as an infrastructure investment for health systems targeting high-risk populations at scale.

More timely action and reduced anxiety
Projected insurer savings and prevented admissions
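Working backward from the figures stated in the pitch ($6M saved per 50,000 members, 2 to 5x ROI), the implied per-member-per-month price can be derived. The inputs come from the case study; the PMPM range below is derived arithmetic, not a quoted price.

```python
# Back-of-envelope check of the PMPM economics stated in the pitch.
members = 50_000
annual_savings = 6_000_000
member_months = members * 12  # 600,000 member-months per year

implied_pmpm = {}
for roi in (2, 5):
    annual_cost = annual_savings / roi   # investment implied by the ROI multiple
    implied_pmpm[roi] = annual_cost / member_months

print(implied_pmpm)  # a 2x ROI implies about $5 PMPM; a 5x ROI implies about $2 PMPM
```

In other words, the stated ROI range corresponds to an insurer paying roughly $2 to $5 per member per month, a plausible band for preventive digital health programs.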

From Story to System in 24 Hours

This project reinforced that the most consequential design decisions are structural rather than visual. The most effective version of ACUITY was not the version with the greatest number of features. It was the version that accomplished one objective with precision: communicating what action to take next.

Operating within a compressed 24-hour timeline with a cross-university team accelerated every decision in the process. The constraint eliminated ambiguity and produced a level of clarity in concept development, interface design, and pitch narrative that longer timelines do not consistently generate.

Artificial intelligence tools played a significant role in enabling the pace of execution throughout the project. ChatGPT and Claude supported concept development and narrative refinement. Midjourney generated wearable device concept imagery. Figma Make enabled rapid interface prototyping. Sora produced cinematic footage for the product demo video. The experience demonstrated that AI tools deliver the most value when directed with clear design intent, rather than used as a substitute for design judgment.

The central takeaway from the experience is that as production timelines compress through AI-assisted workflows, the value of design shifts decisively toward judgment, direction, and systems thinking. ACUITY originated as a single question: what if people could understand the signals their body was sending before it was too late? Within 24 hours, that question became a fully articulated product concept recognized among the standout projects of the event.

Full reflection published on Medium by Sooim Kang and Lucy Trepanier.

Read on Medium

Structural clarity over feature volume
AI-assisted pace with design judgment