Andy Deighton

Ljómi: 10 Years in the Making

How a single night under the aurora borealis in Iceland sparked a decade-long vision — and how modern tools finally made it real.


Over a decade ago I witnessed the Aurora Borealis in Iceland. It was one of those experiences that rewires something in you — shifting horizon-to-horizon arcs of glowing green ribbons, moving and pulsing overhead in a way that felt genuinely alive.

Standing under it, I started thinking about how ancient peoples might have perceived the Northern Lights. Without any framework to explain what they were seeing, the aurora must have felt godlike — a presence, not a phenomenon. And I wondered: what would it sound like if it could speak?

The idea

Years later, discussing that experience with a friend, the concept crystallised. What if we could convert the visual phenomenon of the aurora into audio? Not a simulation or a soundtrack laid on top, but a real-time translation — the actual colours, movements, and patterns of the aurora driving the music directly.

I imagined something atmospheric and otherworldly: the Lights producing a sound that felt as vast and shifting as they looked.

The problem was that I didn't have the technical expertise to build it. The idea went on the shelf.

What changed

What made Ljómi possible was a set of breakthroughs in AI-assisted development tools. Platforms like Cursor, Windsurf, Claude, and Deepseek fundamentally changed what a single person could build. They didn't write the software for me — but they made it possible to move from concept to functional code at a pace that would have been unthinkable a few years ago.

After several months of development, specification work, and iterative refinement, I had a working system. Ljómi — named after the Icelandic word for light or gleam — could take a live aurora feed and produce MIDI output in real time.

Where we are now

Ljómi analyses aurora footage for colour (green, pink, purple), intensity, movement direction and speed, wave patterns, and area coverage. All of that data is translated into notes, chords, and MIDI CC values that you can route to any instrument or synth.
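To make the visual-to-MIDI translation concrete, here is a minimal sketch of the kind of per-frame mapping described above. Ljómi's actual algorithm isn't published, so every name, scale choice, and parameter here is an assumption for illustration only:

```python
# Hypothetical sketch of a frame-to-MIDI mapping; not Ljómi's real code.
# Assumes each video frame has already been analysed into normalised values.

# A pentatonic scale keeps the output consonant no matter what the input does.
A_MINOR_PENTATONIC = [57, 60, 62, 64, 67]  # MIDI note numbers

def frame_to_midi(dominant_hue, intensity, speed):
    """Map one analysed aurora frame to a note and MIDI CC values.

    dominant_hue: 0.0-1.0 (e.g. green around 0.33, purple around 0.78)
    intensity:    0.0-1.0 overall brightness of the aurora
    speed:        0.0-1.0 normalised movement speed
    """
    # Hue selects a scale degree; brighter frames jump up an octave.
    degree = int(dominant_hue * len(A_MINOR_PENTATONIC)) % len(A_MINOR_PENTATONIC)
    octave_shift = 12 if intensity > 0.7 else 0

    return {
        "note": A_MINOR_PENTATONIC[degree] + octave_shift,
        "velocity": round(intensity * 127),   # brightness -> loudness
        "cc1_modwheel": round(speed * 127),   # movement speed -> modulation depth
    }

# A bright green frame with slow movement:
event = frame_to_midi(dominant_hue=0.33, intensity=0.8, speed=0.25)
```

In a real pipeline the returned dictionary would be serialised into MIDI messages and routed to a synth; the point of the sketch is only that each visual feature drives exactly one musical parameter.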

The ten-year gap between the idea and the product taught me something: timing matters as much as vision. The idea was always sound. The tools just needed to catch up.