Forage • Spot. Gather. Create. • 2026

Forage: Capturing Inspiration at the Speed of Perception

Role
Creative Direction and AI Strategy
Timeline
FigBuild 2026 Design Challenge
Date
2026
Context
Team: Sooim Kang, Melody Ekbatani, Kinza Ghanchi, Lucy Trepanier
Tools
Figma, Figma Make, Rhino 3D, Midjourney, Adobe Illustrator, After Effects, Premiere Pro, ChatGPT, Gemini, Google Docs

Forage is a speculative wearable and AI system designed to give designers a new sensory capability: detecting, capturing, and analyzing moments of creative inspiration directly from the physical world. The system pairs a ring-based hardware sensor with a gesture interaction model and a web-based creative archive—capturing typography, color, motion, and imagery in real time without interrupting the moment of perception.

Captured signals are processed by an AI pipeline that extracts structured design assets and organizes them automatically into a searchable personal library. Instead of relying on memory, scattered camera rolls, or fragmented bookmarking tools, designers build a living archive of references sourced from everyday environments. Built for the FigBuild 2026 Design Challenge, the project explores how sensing and manipulating creative inputs can reshape what designers can capture, enhance, and build from over time.

Spot → Gather → Create system overview
Ring capture + gesture + AI archive flow

Inspiration Is a Sensory Experience Without a Tool

Creative professionals perceive the world through heightened sensitivity to visual signals—color gradients, typography on storefronts, the rhythm of ocean waves, and textures in the physical environment. These moments register as inspiration, but the tools available to capture them are fragmented, slow, and context-destroying.

Current capture methods all require deliberate interruption: camera rolls accumulate with no organization, screenshots pile up in folders, bookmarking captures only static links, and nothing reliably preserves motion, texture, or atmosphere alongside a reference. The result is a persistent gap between finding inspiration and actually using it.

In-the-wild inspiration vs interrupted capture
Static references that lose light, motion, and context

Inspiration as a Measurable Sense

The design process began with a reframe: inspiration isn’t primarily a memory problem or an organization problem. It is a sensory phenomenon—environmental signals designers instinctively perceive and respond to.

Beyond five senses: Inspiration functions like a reflex—a subconscious response to visual and environmental patterns, not a deliberate step in any creative process.
The untracked sense: Fitness trackers measure bodies and sensors measure environments, but nothing is built to capture the sensory experience that actually drives creative work.
Tools interrupt the moment: Reaching for a phone, framing a photo, or opening an app interrupts attention and filters out the very context that created the spark.
Inspiration as a capturable sensory reflex
Where context gets lost in current capture tools

Can Inspiration Become a Capturable Sense?

If inspiration functions like a sense, what would it mean to give designers a tool to detect it, capture it, and build from it over time? The opportunity wasn’t to improve existing capture tools—it was to introduce an entirely new category: a spatial capture system that treats the physical world as a living design library.

Speed of perception: Capture happens the moment a spark is noticed—no deliberate interruption, no decision about whether to stop and document what you're experiencing.
Screenless capture: Natural hand gestures control every capture type without unlocking a screen or opening an app, keeping full attention on the environment around you.
AI-assisted organization: Every capture is automatically grouped, tagged by type, and surfaced as a structured reference—no manual sorting, labeling, or filing required.
Opportunity: a new spatial capture category
Design principles for capture, gesture, and AI

A Three-Layer System: Wearable, AI, and Digital Archive

Hardware: the Forage Ring, a snap-on module with a micro camera for environmental capture, motion sensors for gesture detection, and silent haptic feedback confirming each capture as it happens.
Screenless capture: Outlining in the air captures an image, a single tap captures color, a two-finger drag captures typography, and hold-and-drag captures motion—no screen required.
Digital archive: the web app, a structured library with four surfaces—Field for recent captures, Findings for AI-clustered patterns, Types for browsing by asset, and Boards for organizing project references.
Ring hardware concept + sensor integration
Gesture capture system + archive surfaces
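The gesture vocabulary above amounts to a small lookup from gesture to asset type and output formats. A minimal sketch in Python, where the gesture identifiers and file formats are illustrative assumptions rather than a Forage specification:

```python
# Hypothetical sketch of the Forage gesture vocabulary as a lookup table.
# Gesture names and output formats are assumptions for illustration.
from dataclasses import dataclass

@dataclass(frozen=True)
class CaptureType:
    asset: str            # what the capture records
    outputs: tuple        # file formats the archive would expose

GESTURES = {
    "air_outline":     CaptureType("image", ("jpg",)),
    "single_tap":      CaptureType("color", ("hex", "palette.json")),
    "two_finger_drag": CaptureType("typography", ("ocr.txt", "font_match.json")),
    "hold_and_drag":   CaptureType("motion", ("mp4", "gif", "timing_curve.json")),
}

def resolve(gesture: str) -> CaptureType:
    """Map a detected gesture to the asset type it captures."""
    if gesture not in GESTURES:
        raise ValueError(f"unrecognized gesture: {gesture}")
    return GESTURES[gesture]
```

Keeping the vocabulary as data rather than branching logic makes it easy to extend with future modes (e.g. texture or vector capture) without touching the detection code.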

Prototyping a System That Exists Beyond the Screen

The primary challenge was translating a subjective, invisible experience into something measurable, capturable, and usable. Testing and iteration focused on three areas: the physical hardware form, the gesture interaction model, and the digital interface.

Ring prototypes were developed in Rhino 3D to refine ergonomics, sensor placement, and jewelry-like aesthetics. The gesture vocabulary was tested for minimal complexity—achievable one-handed in public—while avoiding accidental triggers. One major design challenge was defining inspiration as data: rather than trying to capture inspiration itself, Forage focuses on the environmental signals most commonly associated with creative response—color relationships, typographic forms, motion patterns, visual composition, and texture. An interactive Figma prototype, deployed at forage.figma.site, validated the full flow from capture through AI organization into board management and confirmed that the four-surface model matches how designers mentally organize references across Field, Findings, Types, and Boards.

Rhino 3D ring iterations (ergonomics + camera placement)
Gesture model testing + Figma prototype flow

The World as a Living Design Library

The final experience mirrors the natural creative process through three phases: spotting, gathering, and creating.

Spot: the designer wears the Forage ring throughout the day. Double-tapping activates capture mode with subtle haptic confirmation, and the system passively senses the environment—no screen unlock, no app to open, and no deliberate interruption of attention.

Gather: when inspiration occurs, the designer draws a loose outline in the air to define the capture region, then performs the appropriate gesture to record the asset type. A soft haptic pulse confirms capture in under one second.

Example scenarios: a designer walking through the city spots a rhythm in a sidewalk advertisement and captures a structured visual reference; a typographer traces a distinctive letterform on a storefront awning with a two-finger drag to capture typography (OCR + font match); a motion designer holds and drags to record kinetic sculpture rhythm as an MP4/GIF preview plus a timing curve JSON for motion workflows.
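The motion scenario exports a "timing curve JSON" alongside the MP4/GIF preview. The source doesn't define that format, so the following is a hedged sketch of what such an export might contain—field names (`fps`, `keyframes`, `duration_s`) are assumptions:

```python
# Illustrative sketch of a motion capture's timing-curve JSON export.
# The schema is hypothetical; the source only says motion captures export
# as an MP4/GIF preview plus a timing curve JSON for motion workflows.
import json

def timing_curve(samples, fps=30):
    """Normalize raw motion magnitudes into 0-1 keyframes keyed by time."""
    peak = max(samples) or 1.0
    return {
        "fps": fps,
        "duration_s": len(samples) / fps,
        "keyframes": [
            {"t": round(i / fps, 3), "value": round(v / peak, 3)}
            for i, v in enumerate(samples)
        ],
    }

curve = timing_curve([0.0, 2.0, 4.0, 3.0, 1.0], fps=5)
print(json.dumps(curve, indent=2))
```

Normalizing to a 0-1 range is what would let the same curve drive easing in After Effects or CSS regardless of the raw sensor scale.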

Create: by the time the designer returns to the studio, Field is already organized. AI Findings surface pattern clusters across the day's captures, and assets are available as downloadable files (HEX codes, MP4 clips, timing curves, and font matches) that can be dragged into design files or added to project Boards.
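The Findings surface described above groups a day's captures into pattern clusters. The real system would be AI-driven; as a stand-in, a minimal sketch that clusters captures by shared tags, using invented example data:

```python
# Minimal stand-in for AI clustering on the Findings surface: group
# capture ids under any tag shared by at least two captures. The capture
# records and tags here are illustrative, not Forage data.
from collections import defaultdict

captures = [
    {"id": "c1", "type": "color",      "tags": {"warm", "sunset"}},
    {"id": "c2", "type": "typography", "tags": {"serif", "signage"}},
    {"id": "c3", "type": "color",      "tags": {"warm", "brick"}},
    {"id": "c4", "type": "motion",     "tags": {"waves", "rhythm"}},
]

def findings(captures):
    """Return {tag: [capture ids]} for tags appearing in 2+ captures."""
    by_tag = defaultdict(list)
    for cap in captures:
        for tag in cap["tags"]:
            by_tag[tag].append(cap["id"])
    return {tag: ids for tag, ids in by_tag.items() if len(ids) >= 2}

print(findings(captures))   # {'warm': ['c1', 'c3']}
```

A production version would cluster on learned embeddings rather than literal tags, but the output shape—captures surfaced together because they share a pattern—is the same idea.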

Safeguards: the system automatically detects and avoids capturing identifiable faces, disables capture in sensitive locations, and keeps all captured signals fully visible and editable in the archive.

Spot → Gather → Create interaction flow
Safeguards + transparent archive management

Transforming Perception Into a Traceable Creative Process

Designer: Inspiration captured at the moment of perception, reduced cognitive load from managing scattered references, and a structured, searchable personal archive built automatically.
Creative Workflow: Design assets extracted and ready to use, AI-organized clusters that surface patterns across a creative archive, and direct export to design files or project Boards.
Creative Wellbeing: Preserved context that would otherwise be lost, reduced friction between observation and creation, and a living record of a designer's creative sensibility over time.

Forage captures color, typography, motion, and imagery from the physical world with a single gesture—reducing the manual organization steps that typically happen after the fact.

From scattered references to a living inspiration archive
Impact across workflow + creative wellbeing

Designing a New Sense

This project reframed a problem designers experience daily but rarely articulate clearly: inspiration isn’t a documentation problem—it is a sensory phenomenon that existing tools weren’t designed to address. Committing to that reframe early changed everything about the system architecture, the interaction model, and the product vocabulary.

Building a speculative product that operates beyond the constraints of a screen required rethinking what an interface is. The gesture vocabulary, the haptic feedback system, and the wearable form factor became interface decisions that had to function as one coherent experience.

The most consequential learning: creative tools are most powerful when they support observation rather than interrupt it. Every feature was evaluated against that principle—and when AI categorization surfaced patterns a designer hadn’t consciously recognized, it created genuine value.

What's next for Forage: vector shape capture; surface texture and material pattern modes; voice interaction for hands-free annotation; AI-generated design variations based on recurring archive patterns; collaborative inspiration sharing and environmental mapping across cities; and deeper AI modeling of personal creative preferences over time.

Reframing inspiration as a capturable sense
Observation-first design decisions