# How the System Works — From City Signals to Music

## Concept
Cities are alive. They breathe, pulse, evolve — just like music.
This project captures that energy by collecting live sensor readings across Amsterdam and using those data streams to generate globally synchronized modular-synthesizer performances.
The result: a constantly evolving, data-driven sound composition in which the city itself becomes the performer.
I designed a real-time sensing network that continuously captures movement, sound, and light patterns from different neighborhoods across Amsterdam. Each sensor cluster detects the intensity and rhythm of urban activity and converts it into JSON data.
The pipeline:

1. Movement / sound / light sensors
2. Local processing unit → JSON data
3. Database + event filtering
4. Translation module → MIDI + Eurorack CV
5. Musical output in the correct scales and octaves
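As a sketch of the first pipeline stages, a local processing unit might emit readings like the following. The payload fields (`neighborhood`, `sensor`, `intensity`) and the noise threshold are illustrative assumptions, not the project's actual schema:

```python
import json

# Hypothetical payload from one sensor cluster (field names are assumptions).
raw = json.dumps({
    "neighborhood": "Jordaan",
    "sensor": "movement",
    "timestamp": 1700000000,
    "intensity": 0.62,  # normalized activity level, 0..1
})

def filter_event(payload, threshold=0.1):
    """Parse one JSON reading and drop events below a noise threshold."""
    event = json.loads(payload)
    return event if event["intensity"] >= threshold else None

event = filter_event(raw)
```

Filtering at this stage keeps quiet background readings from triggering notes downstream, so only genuine bursts of urban activity reach the translation module.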

The translation module analyzes the incoming data:
- Determines which events trigger notes
- Maps intensity to velocity / filter depth / amplitude
- Converts rate-of-change into LFO and modulation signals
- Detects patterns for sequencing behavior
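Two of those mappings can be sketched as below, assuming intensity arrives normalized to 0..1; the value ranges and the 5 Hz LFO ceiling are my assumptions, not measured from the system:

```python
def intensity_to_velocity(intensity):
    """Map normalized intensity (0..1) to a MIDI velocity (1..127)."""
    return max(1, min(127, round(intensity * 127)))

def rate_of_change_to_lfo_hz(prev, curr, dt, max_hz=5.0):
    """Map how fast a signal is changing to an LFO frequency in Hz,
    capped at max_hz so sudden spikes stay musically usable."""
    slope = abs(curr - prev) / dt
    return min(max_hz, slope * max_hz)
```

The cap on the LFO rate is a design choice: raw city data is spiky, and clamping keeps modulation within a range a Eurorack patch can respond to gracefully.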
To preserve musical coherence, each neighborhood in Amsterdam is assigned:
- A unique musical scale (e.g., Major, Minor, Pentatonic)
- A defined octave range
- Parameter behavior that reflects the area’s personality

For example:
| Neighborhood | Assigned Scale | Behavior |
|---|---|---|
| Jordaan | D Minor | Expressive modulations + slow LFO |
| De Pijp | C Major | Bright chords + upbeat tempo |
| Noord | Microtonal / Experimental | More chaotic responses |
| Centrum | Chromatic | Dense note clusters + high activity |
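A table like this can be encoded as a small config plus a quantizer that snaps raw notes onto each district's scale. The MIDI root notes and interval sets below are illustrative guesses at how such a mapping might be implemented, not the project's actual values:

```python
# Hypothetical per-neighborhood scale table: root MIDI note + intervals in semitones.
SCALES = {
    "Jordaan": {"root": 62, "intervals": (0, 2, 3, 5, 7, 8, 10)},  # D natural minor
    "De Pijp": {"root": 60, "intervals": (0, 2, 4, 5, 7, 9, 11)},  # C major
}

def quantize(note, neighborhood):
    """Snap a raw MIDI note (0..127) to the nearest in-scale pitch."""
    cfg = SCALES[neighborhood]
    allowed = {(cfg["root"] + i) % 12 for i in cfg["intervals"]}
    for offset in range(12):  # search outward from the raw note
        for cand in (note - offset, note + offset):
            if 0 <= cand <= 127 and cand % 12 in allowed:
                return cand
    return note  # unreachable for any non-empty scale
```

For example, a raw C# fed through De Pijp's C-major mapping snaps down a semitone to C, while an in-scale note passes through unchanged, which is what keeps each district's output musically coherent.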
As the city moves, each district contributes its own musical identity, forming a polyrhythmic, multi-scale orchestration that is always evolving — because the city itself is alive.