Forage • Spot. Gather. Create. • 2026
Forage: Capturing Inspiration at the Speed of Perception
Overview
Forage is a speculative wearable and AI system designed to give designers a new sensory capability: the ability to detect, capture, and analyze moments of creative inspiration directly from the physical world. The system pairs a ring-based hardware sensor with a gesture interaction model and a web-based creative archive, capturing typography, color, motion, and imagery in real time without interrupting the moment of perception.
Captured signals are processed by an AI pipeline that extracts structured design assets and organizes them automatically into a searchable personal library. Instead of relying on memory, scattered camera rolls, or fragmented bookmarking tools, designers build a living archive of references sourced from everyday environments. Built for the FigBuild 2026 Design Challenge, the project explores how sensing and manipulating creative inputs can reshape what designers can capture, enhance, and build from over time.
The Problem
Inspiration Is a Sensory Experience Without a Tool
Creative professionals perceive the world through heightened sensitivity to visual signals—color gradients, typography on storefronts, the rhythm of ocean waves, and textures in the physical environment. These moments register as inspiration, but the tools available to capture them are fragmented, slow, and context-destroying.
Current capture methods all require deliberate interruption: camera rolls accumulate with no organization, screenshots pile up in folders, bookmarking captures only static links, and nothing reliably preserves motion, texture, or atmosphere alongside a reference. The result is a persistent gap between finding inspiration and actually using it.
Research Insights
Inspiration as a Measurable Sense
The design process began with a reframe: inspiration isn’t primarily a memory problem or an organization problem. It is a sensory phenomenon—environmental signals designers instinctively perceive and respond to.
Defining the Opportunity
Can Inspiration Become a Capturable Sense?
If inspiration functions like a sense, what would it mean to give designers a tool to detect it, capture it, and build from it over time? The opportunity wasn’t to improve existing capture tools—it was to introduce an entirely new category: a spatial capture system that treats the physical world as a living design library.
Design Exploration
A Three-Layer System: Wearable, AI, and Digital Archive
Test & Iterate
Prototyping a System That Exists Beyond the Screen
The primary challenge was translating a subjective, invisible experience into something measurable, capturable, and usable. Testing and iteration focused on three areas: the physical hardware form, the gesture interaction model, and the digital interface.
Ring prototypes were developed in Rhino 3D to refine ergonomics, sensor placement, and jewelry-like aesthetics. The gesture vocabulary was tested for minimal complexity, achievable one-handed in public, while avoiding accidental triggers.

One major design challenge was defining inspiration as data. Rather than capturing inspiration itself, Forage focuses on the environmental signals most commonly associated with creative response: color relationships, typographic forms, motion patterns, visual composition, and texture.

An interactive Figma prototype was deployed at forage.figma.site to validate the full flow from capture through AI organization into board management, confirming how designers mentally organize references across Field, Findings, Types, and Boards.
Final Product
The World as a Living Design Library
The final experience mirrors the natural creative process through three phases: spotting, gathering, and creating.
Spot: the designer wears the Forage ring throughout the day. Double-tapping activates capture mode with subtle haptic confirmation, and the system passively senses the environment—no screen unlock, no app to open, and no deliberate interruption of attention.
Gather: when inspiration occurs, the designer draws a loose outline in the air to define the capture region, then performs the appropriate gesture to record the asset type. A soft haptic pulse confirms capture in under one second.
Example scenarios: a designer walking through the city spots a rhythm in a sidewalk advertisement and captures a structured visual reference; a typographer traces a distinctive letterform on a storefront awning with a two-finger drag to capture typography (OCR + font match); a motion designer holds and drags to record kinetic sculpture rhythm as an MP4/GIF preview plus a timing curve JSON for motion workflows.
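The motion scenario mentions exporting a "timing curve JSON" alongside the video preview. The project does not define that format, so the following is a hypothetical sketch of what such an asset might look like: sampled (time, progress) pairs serialized into a small JSON document a motion tool could import.

```python
import json


def export_timing_curve(samples: list[tuple[float, float]], easing: str = "custom") -> str:
    """Serialize sampled (time, progress) pairs from a motion capture into a
    JSON asset. The schema (keys, units) is an assumption for illustration --
    Forage's actual export format is not specified in the case study."""
    asset = {
        "type": "timing_curve",
        "easing": easing,
        "duration_s": samples[-1][0] - samples[0][0] if samples else 0.0,
        "keyframes": [{"t": t, "value": v} for t, v in samples],
    }
    return json.dumps(asset, indent=2)
```

A capture of a kinetic sculpture might sample three points, `[(0.0, 0.0), (0.5, 0.8), (1.0, 1.0)]`, and export a one-second curve that eases out; keeping the raw keyframes rather than a named easing preserves the rhythm that made the motion worth capturing.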
Create: by the time the designer returns to the studio, Field is already organized. AI Findings surface pattern clusters across the day's captures, and assets are available as downloadable files (HEX codes, MP4 clips, timing curves, and font matches) that can be dragged into design files or added to project Boards.
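How AI Findings might surface a pattern cluster can be illustrated with a deliberately simple stand-in: grouping a day's captured HEX colors into coarse hue buckets and reporting the largest group (the kind of signal behind a card like "you kept capturing teals today"). This is a minimal sketch, not the system's actual clustering method, and the function names are assumptions.

```python
import colorsys


def hue_bucket(hex_code: str, buckets: int = 6) -> int:
    """Map a captured HEX color to a coarse hue bucket in [0, buckets)."""
    r, g, b = (int(hex_code.lstrip("#")[i:i + 2], 16) / 255 for i in (0, 2, 4))
    h, _, _ = colorsys.rgb_to_hsv(r, g, b)
    return int(h * buckets) % buckets


def find_dominant_cluster(hex_codes: list[str]) -> tuple[int, list[str]]:
    """Group color captures by hue bucket and return the largest cluster --
    a toy proxy for the pattern detection a Findings card could surface."""
    clusters: dict[int, list[str]] = {}
    for code in hex_codes:
        clusters.setdefault(hue_bucket(code), []).append(code)
    bucket, members = max(clusters.items(), key=lambda kv: len(kv[1]))
    return bucket, members
```

A real pipeline would cluster across modalities (type, motion, composition), but the core idea is the same: reduce each capture to comparable features, then surface the densest groups back to the designer.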
Safeguards: the system automatically detects and avoids capturing identifiable faces, disables capture in sensitive locations, and keeps all captured signals fully visible and editable in the archive.
Impact
Transforming Perception Into a Traceable Creative Process
Forage captures color, typography, motion, and imagery from the physical world with a single gesture—reducing the manual organization steps that typically happen after the fact.
Reflection
Designing a New Sense
This project reframed a problem designers experience daily but rarely articulate clearly: inspiration isn’t a documentation problem—it is a sensory phenomenon that existing tools weren’t designed to address. Committing to that reframe early changed everything about the system architecture, the interaction model, and the product vocabulary.
Building a speculative product that operates beyond the constraints of a screen required rethinking what an interface is. The gesture vocabulary, the haptic feedback system, and the wearable form factor became interface decisions that had to function as one coherent experience.
The most consequential learning: creative tools are most powerful when they support observation rather than interrupt it. Every feature was evaluated against that principle—and when AI categorization surfaced patterns a designer hadn’t consciously recognized, it created genuine value.
What is next for Forage: vector shape capture, surface texture and material pattern modes, voice interaction for hands-free annotation, AI-generated design variations based on recurring archive patterns, collaborative inspiration sharing and environmental mapping across cities, and deeper AI modeling of personal creative preferences over time.