Forage phone
Forage slides

Forage • Spot. Gather. Create. • 2026

Forage: Capturing Inspiration at the Speed of Perception

Role
Creative Direction + AI Strategy
Timeline
Hackathon sprint
Date
2026
Context
FigBuild 2026 team project
Tools
Figma, Figma Make, Rhino 3D, Midjourney, Adobe Suite, ChatGPT, Gemini

Forage is a speculative wearable and AI system designed to give designers a new sensory capability: detecting, capturing, and analyzing moments of creative inspiration directly in the physical world. The system pairs a ring-based hardware sensor with a gesture interaction model and a web-based creative archive, capturing typography, color, motion, and imagery in real time without interrupting the moment of perception.

Captured signals are processed by an AI pipeline that extracts structured design assets and organizes them automatically into a searchable personal library. Instead of relying on memory, scattered camera rolls, or fragmented bookmarking tools, designers build a living archive of references sourced from everyday environments. Built for the FigBuild 2026 Design Challenge, the project explores how sensing and manipulating creative inputs can reshape what designers can capture, enhance, and build from over time.

Forage overview

Inspiration Is a Sensory Experience Without a Tool

Creative professionals perceive the world through heightened sensitivity to visual signals: color gradients, typography on storefronts, the rhythm of ocean waves, and textures in the physical environment. These moments register as inspiration, but the tools available to capture them are fragmented, slow, and context-destroying.

Current capture methods all require deliberate interruption: camera rolls accumulate with no organization, screenshots pile up in folders, bookmarking captures only static links, and nothing reliably preserves motion, texture, or atmosphere alongside a reference. The result is a persistent gap between finding inspiration and actually using it.

Designer sensing the world
Inspiration capture types
Designer frustrations with inspiration tools

Inspiration as a Measurable Sense

The design process began with a reframe: inspiration isn’t primarily a memory problem or an organization problem. It is a sensory phenomenon: environmental signals designers instinctively perceive and respond to.

Beyond five senses: Inspiration functions like a reflex, a subconscious response to visual and environmental patterns, not a deliberate step in any creative process.
The untracked sense: Fitness trackers measure bodies and sensors measure environments, but nothing is built to capture the sensory experience that actually drives creative work.
Tools interrupt the moment: Reaching for a phone, framing a photo, or opening an app interrupts attention and filters out the very context that created the spark.
Research inspiration 1
Research inspiration 2
Research inspiration 3
Research inspiration 4

Can Inspiration Become a Capturable Sense?

If inspiration functions like a sense, what would it mean to give designers a tool to detect it, capture it, and build from it over time? The opportunity wasn’t to improve existing capture tools; it was to introduce an entirely new category: a spatial capture system that treats the physical world as a living design library.

Speed of perception: Capture happens the moment a spark is noticed, with no deliberate interruption and no decision about whether to stop and document what you're experiencing.
Screenless capture: Natural hand gestures control every capture type without unlocking a screen or opening an app, keeping full attention on the environment around you.
AI-assisted organization: Every capture is automatically grouped, tagged by type, and surfaced as a structured reference with no manual sorting, labeling, or filing required.
Team brainstorming at FigBuild
Writing on whiteboard at FigBuild
Gesture research hand positions

A Three-Layer System: Wearable, Gesture Interaction, and Digital Archive

Hardware (the Forage Ring): A snap-on module with a micro camera for environmental capture, motion sensors for gesture detection, and silent haptic feedback confirming each capture as it happens.
Interaction (screenless gestures): Gestures define what gets captured: outline in the air for an image, single tap for color, two-finger drag for typography, hold and drag for motion. No screen required.
Digital archive (the web app): A structured library across four surfaces: Field for recent captures, Findings for AI-clustered patterns, Types for browsing by asset, and Boards for organizing project references.
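The gesture vocabulary above could be represented as a simple lookup table. This is an illustrative sketch, not a confirmed spec: the gesture names, capture types, and output formats are assumptions drawn loosely from the descriptions in this case study.

```python
# Hypothetical sketch of the Forage gesture vocabulary.
# All names and output formats here are illustrative assumptions.

GESTURE_VOCABULARY = {
    "air_outline":     {"captures": "image",      "outputs": ["png"]},
    "single_tap":      {"captures": "color",      "outputs": ["hex"]},
    "two_finger_drag": {"captures": "typography", "outputs": ["ocr_text", "font_match"]},
    "hold_and_drag":   {"captures": "motion",     "outputs": ["mp4", "timing_curve_json"]},
}

def resolve_gesture(gesture: str) -> str:
    """Map a detected gesture to the asset type it captures."""
    entry = GESTURE_VOCABULARY.get(gesture)
    if entry is None:
        raise ValueError(f"Unrecognized gesture: {gesture}")
    return entry["captures"]
```

Keeping the vocabulary in one table like this would let the ring firmware, the haptic feedback logic, and the web archive all agree on what each gesture means.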

The World as a Living Design Library

The final experience mirrors the natural creative process through three phases: spotting, gathering, and creating.

Spot: the designer wears the Forage ring throughout the day. Double-tapping activates capture mode with subtle haptic confirmation, and the system passively senses the environment, with no screen unlock, no app to open, and no deliberate interruption of attention.

Gather: when inspiration occurs, the designer draws a loose outline in the air to define the capture region, then performs the appropriate gesture to record the asset type. A soft haptic pulse confirms capture in under one second.

Example scenarios: a designer walking through the city spots a rhythm in a sidewalk advertisement and captures a structured visual reference; a typographer traces a distinctive letterform on a storefront awning with a two-finger drag to capture typography (OCR + font match); a motion designer holds and drags to record kinetic sculpture rhythm as an MP4/GIF preview plus a timing curve JSON for motion workflows.
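The motion scenario mentions exporting a timing curve as JSON alongside the MP4/GIF preview. A minimal sketch of what that export might look like, assuming the raw capture is a series of normalized motion samples (the record shape and field names are invented for illustration):

```python
import json

def timing_curve_json(samples: list[float], duration_s: float) -> str:
    """Normalize raw motion samples into a keyframe timing curve.

    `samples` is assumed to be evenly spaced intensity values over the
    capture; the JSON shape is an illustrative assumption, not Forage's
    actual export format.
    """
    n = len(samples)
    lo, hi = min(samples), max(samples)
    span = (hi - lo) or 1.0            # avoid divide-by-zero on flat input
    step = duration_s / max(n - 1, 1)  # time between evenly spaced samples
    keyframes = [
        {"t": round(i * step, 3), "value": round((v - lo) / span, 3)}
        for i, v in enumerate(samples)
    ]
    return json.dumps({"duration_s": duration_s, "keyframes": keyframes})
```

A motion tool could then map each `value` onto an animation property, reusing the captured rhythm directly.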

Create: by the time the designer returns to the studio, Field is already organized. AI Findings surface pattern clusters across the day's captures, and assets are available as downloadable files (HEX codes, MP4 clips, timing curves, and font matches) that can be dragged into design files or added to project Boards.
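The Findings surface described above clusters a day's captures by recurring patterns. A minimal sketch of that grouping step, assuming each capture already carries AI-generated tags (the record shape is an assumption for illustration):

```python
from collections import defaultdict

def cluster_findings(captures: list[dict]) -> dict[str, list[str]]:
    """Group captures into Findings-style clusters by shared tag.

    Tagging itself would come from the AI pipeline; here tags are
    assumed to be present on each capture record.
    """
    clusters: dict[str, list[str]] = defaultdict(list)
    for cap in captures:
        for tag in cap["tags"]:
            clusters[tag].append(cap["id"])
    # Surface only tags that recur across multiple captures as patterns
    return {tag: ids for tag, ids in clusters.items() if len(ids) > 1}
```

Filtering to recurring tags is what turns a flat capture log into the "pattern clusters" a designer sees when they return to the studio.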

Safeguards: the system automatically detects and avoids capturing identifiable faces, disables capture in sensitive locations, and keeps all captured signals fully visible and editable in the archive.

Forage Web App

The Forage web app is live. Explore the archive interface, browse captured assets by type, and see how the system organizes real-world inspiration into a structured design library. Try it yourself below.

Final Product Video

Transforming Perception Into a Traceable Creative Process

Designer: Inspiration captured at the moment of perception, reduced cognitive load from managing scattered references, and a structured, searchable personal archive built automatically.
Creative Workflow: Design assets extracted and ready to use, AI-organized clusters that surface patterns across a creative archive, and direct export to design files or project Boards.
Creative Wellbeing: Preserved context that would otherwise be lost, reduced friction between observation and creation, and a living record of a designer's creative sensibility over time.

Forage captures color, typography, motion, and imagery from the physical world with a single gesture, reducing the manual organization steps that typically happen after the fact.

Designing a New Sense

This project reframed a problem designers experience daily but rarely articulate clearly: inspiration isn’t a documentation problem; it is a sensory phenomenon that existing tools weren’t designed to address. Committing to that reframe early changed everything about the system architecture, the interaction model, and the product vocabulary.

Building a speculative product that operates beyond the constraints of a screen required rethinking what an interface is. The gesture vocabulary, the haptic feedback system, and the wearable form factor became interface decisions that had to function as one coherent experience.

The most consequential learning: creative tools are most powerful when they support observation rather than interrupt it. Every feature was evaluated against that principle, and when AI categorization surfaced patterns a designer hadn’t consciously recognized, it created genuine value.

What is next for Forage: vector shape capture, surface texture and material pattern modes, voice interaction for hands-free annotation, AI-generated design variations based on recurring archive patterns, collaborative inspiration sharing and environmental mapping across cities, and deeper AI modeling of personal creative preferences over time.

Team at FigBuild
FigBuild 2026 swag
Figma for Edu polaroid
FigBuild stamp activity
Team whiteboard session

Full Slide Deck