Trees that sing
During the development of our flagship vision-to-MIDI product, Ljómi, I reached out to many musicians and performers, inviting them to collaborate on the project. One of those artists was the ambient and soundscape musician Ishq.
Over the course of several weeks, we struck up an email conversation, and one of those emails particularly piqued my interest. Ishq mentioned how, when watching trees swaying in the wind, he thought how lovely it would be if there were a system that created generative music from the movement of trees. The idea for MIDI For Trees was born! Thank you, Ishq!
That was a few weeks ago. But in that short time, MIDI For Trees has taken shape. Unlike VisionMIDI Live, which uses an AI computer vision model at its core, MIDI For Trees uses a more traditional vision model to analyse the colours present in trees, combined with optical flow calculations to determine their movement.
And it’s that movement that is the basis for triggering MIDI notes and MIDI CC messages in the product.
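The movement-to-MIDI idea can be sketched in a few lines. This is a hypothetical illustration, not MIDI For Trees' actual code: a crude frame-difference stands in for real optical flow, and the helper names (`motion_magnitude`, `to_midi_cc`) are my own inventions for the example.

```python
# Hypothetical sketch: map inter-frame motion to a MIDI CC value.
# Real optical flow (e.g. dense flow fields) would replace the
# simple frame difference used here.

def motion_magnitude(prev_frame, frame):
    """Mean absolute pixel difference between two greyscale frames (0-255)."""
    total = sum(abs(a - b) for a, b in zip(prev_frame, frame))
    return total / len(frame)

def to_midi_cc(magnitude, max_magnitude=255):
    """Scale a motion magnitude to the 0-127 range of a MIDI CC message."""
    value = int(round(127 * magnitude / max_magnitude))
    return max(0, min(127, value))

# Two tiny 2x2 "frames", flattened to lists of greyscale pixels.
prev = [10, 10, 10, 10]
curr = [10, 60, 10, 110]

mag = motion_magnitude(prev, curr)  # (0 + 50 + 0 + 100) / 4 = 37.5
cc = to_midi_cc(mag)                # round(127 * 37.5 / 255) = 19
```

In a real pipeline, a value like `cc` would be sent as a MIDI CC message each frame, so stronger swaying produces larger controller sweeps, while note triggers could fire when the magnitude crosses a threshold.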
MIDI For Trees was built quickly largely due to my experience on Ljómi and VisionMIDI Live; I knew what was needed and how to deliver it.
Assuming the rain stays away, I’ll be taking my laptop, camera and headphones out into the woods to let nature guide my compositions.