MIDI For Trees
Forest scenes become musical data
A real-time audio-visual system that detects individual trees in video feeds using multi-scale computer vision, examines their movement through optical flow analysis, and maps those movement characteristics to musical scales and MIDI events. Wind-swayed branches become melodies; forests become orchestras.
How it works
Multi-scale tree detection
Identifies individual trees in the scene regardless of distance or density, using multi-scale computer vision that adapts to different forest compositions and camera angles.
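To make "multi-scale" concrete, here is a toy sketch of the idea: score one fixed window size at every level of an image pyramid, so distant (small) and nearby (large) trees both fit the window at some scale. Everything specific here is an assumption for illustration, not the product's detector: the "tree score" is just mean patch brightness, and `win`, `threshold`, and `levels` are made-up parameters standing in for a trained model.

```python
# Toy multi-scale detection: one fixed-size window scored at every
# pyramid level.  The score (mean greyscale brightness) is a placeholder
# assumption; a real detector would use a learned model.

def downscale(img):
    """Halve a greyscale image (list of rows) by 2x2 average pooling."""
    h, w = len(img) // 2 * 2, len(img[0]) // 2 * 2
    return [
        [(img[y][x] + img[y][x + 1] + img[y + 1][x] + img[y + 1][x + 1]) / 4
         for x in range(0, w, 2)]
        for y in range(0, h, 2)
    ]

def detect_trees(img, win=4, threshold=0.5, levels=3):
    """Return (x, y, size) boxes in full-resolution coordinates."""
    boxes = []
    scale = 1
    for _ in range(levels):
        h, w = len(img), len(img[0])
        # Non-overlapping windows for simplicity; a real detector
        # would slide with overlap and suppress duplicates.
        for y in range(0, h - win + 1, win):
            for x in range(0, w - win + 1, win):
                patch = [img[y + dy][x + dx]
                         for dy in range(win) for dx in range(win)]
                if sum(patch) / len(patch) > threshold:
                    boxes.append((x * scale, y * scale, win * scale))
        if h < 2 * win or w < 2 * win:
            break
        img = downscale(img)
        scale *= 2
    return boxes
```

On a frame with one large bright region, the same region is reported both as several small boxes at full resolution and as one large box at a coarser level, which is exactly the behaviour that lets one window size cover trees at any distance.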
Optical flow movement analysis
Examines the movement patterns of each detected tree using optical flow — capturing the direction, speed, and character of wind-driven motion in real time.
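As an illustration of what optical flow recovers, here is a toy block-matching estimator in pure Python. Production systems typically use a dense method (for example OpenCV's Farneback implementation); this sketch only estimates one patch's displacement between two frames, and the patch and search sizes are illustrative defaults.

```python
def patch_sad(a, b, ax, ay, bx, by, size):
    """Sum of absolute differences between two same-size patches."""
    return sum(
        abs(a[ay + dy][ax + dx] - b[by + dy][bx + dx])
        for dy in range(size) for dx in range(size)
    )

def flow_vector(prev, curr, x, y, size=4, search=3):
    """Estimate the (dx, dy) motion of the patch at (x, y) in `prev`
    by finding the best-matching patch in `curr` within a small
    search window."""
    best = (0, 0)
    best_cost = patch_sad(prev, curr, x, y, x, y, size)
    h, w = len(curr), len(curr[0])
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            nx, ny = x + dx, y + dy
            if 0 <= nx <= w - size and 0 <= ny <= h - size:
                cost = patch_sad(prev, curr, x, y, nx, ny, size)
                if cost < best_cost:
                    best, best_cost = (dx, dy), cost
    return best
```

Run per detected tree region, per frame, this yields the direction and speed values that feed the musical mapping.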
Algorithmic composition
Maps movement characteristics to musical scales and MIDI events. The natural rhythm of swaying trees drives the composition, creating music that genuinely reflects what's happening in the forest.
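One plausible mapping, sketched below as an illustration rather than the project's actual algorithm: motion direction selects a degree of a pentatonic scale, speed sets the note velocity, and the result is packed into a raw MIDI note-on message. The scale choice, root note, and `max_speed` normalisation are all assumptions.

```python
import math

# C major pentatonic degrees (semitones above the root); illustrative choice.
PENTATONIC = [0, 2, 4, 7, 9]

def motion_to_midi(dx, dy, root=60, max_speed=10.0):
    """Map a flow vector to a (note, velocity) pair: direction picks
    the scale degree, speed picks the velocity."""
    speed = math.hypot(dx, dy)
    angle = math.atan2(dy, dx) % (2 * math.pi)              # 0..2*pi
    degree = int(angle / (2 * math.pi) * len(PENTATONIC)) % len(PENTATONIC)
    note = root + PENTATONIC[degree]
    velocity = min(127, int(speed / max_speed * 127))
    return note, velocity

def note_on(note, velocity, channel=0):
    """Raw MIDI note-on message bytes (status 0x90 | channel)."""
    return bytes([0x90 | channel, note & 0x7F, velocity & 0x7F])
```

Constraining output to a pentatonic scale is a common trick in generative systems: any combination of simultaneous notes stays consonant, so several trees can "play" at once without clashing.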
Visual feedback overlays
Configurable display overlays show the analysis in action — tree detection regions, movement vectors, and musical mappings — so you can see exactly how the visual data becomes sound.
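Schematically, an overlay pass turns each detection and its flow vector into drawing primitives before handing them to a rendering API such as OpenCV's. The `gain` factor and the returned layout below are illustrative assumptions, not the product's format.

```python
def overlay_primitives(box, flow, gain=8):
    """Turn one (x, y, size) detection box and its (dx, dy) flow vector
    into drawing primitives: a rectangle for the detection region and
    an arrow from the box centre, scaled by `gain` so that slow,
    subtle motion remains visible on screen."""
    x, y, size = box
    dx, dy = flow
    cx, cy = x + size // 2, y + size // 2
    return {
        "rect": (x, y, x + size, y + size),
        "arrow": ((cx, cy), (cx + dx * gain, cy + dy * gain)),
    }
```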
Use cases
- Generative music installations in forests, parks, and botanical gardens
- Studio compositions using recorded forest footage as a MIDI source
- Environmental art projects that give voice to trees and natural spaces
- Meditative or therapeutic sound experiences driven by nature
- Educational tools demonstrating the connection between visual movement and sound
Interested in MIDI For Trees?
We'd love to walk you through the system, answer questions, or discuss how it fits your workflow.