





Building My 40×40 Sensor Matrix for Touch and Weight Sensing → JSON → WebSocket → MIDI → Chromatic Synthesis
After finishing my speaker cubes, I wanted to build something interactive — a surface that could both sense touch and pressure and transform those gestures directly into sound.
That idea turned into a custom 40×40 sensor matrix (1,600 sensors total), capable of reading both touch and weight at high resolution, sending live data through WebSockets, and translating it into MIDI notes for real-time synthesis.
Hardware Design
The matrix consists of 40 columns and 40 rows, forming a 1,600-point sensing grid.
Each intersection measures both touch contact (presence) and pressure (force). I designed the PCB with multiplexed sensing lines to keep the wiring manageable and used an analog front end capable of detecting fine variations in resistance from each cell.
The system continuously scans the matrix, completing a full pass over all 1,600 sensors in a few milliseconds. It can capture complex gestures — multiple fingers, pressure gradients, and dynamic movement patterns.
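As a rough illustration of the scan loop (written in JavaScript here rather than the actual firmware language), with `readCell` standing in for a hypothetical multiplexer/ADC driver that selects a row line and samples the column's analog front end:

```javascript
// Illustrative port of the multiplexed scan loop.
// `readCell(row, col)` is a stand-in for the hardware driver (hypothetical):
// on the real firmware it would drive the row multiplexer and sample the
// column's analog front end.
const ROWS = 40;
const COLS = 40;

function scanMatrix(readCell) {
  const frame = new Array(ROWS);
  for (let r = 0; r < ROWS; r++) {
    const row = new Array(COLS);
    for (let c = 0; c < COLS; c++) {
      row[c] = readCell(r, c); // 10-bit reading, 0–1023
    }
    frame[r] = row;
  }
  return frame;
}
```

Driving every cell through one shared read function keeps the scan order explicit, which matters when row settling time limits how fast the matrix can be swept.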
Firmware & Data Output
The firmware runs on an embedded microcontroller. It reads all 1,600 sensors in sequence, filters out noise, and packs the readings into a structured JSON object.
Each JSON frame includes the full pressure map — a 40×40 array of 10-bit values (0–1023) — along with a timestamp for synchronization.
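A minimal sketch of how such a frame could be assembled; the field names (`ts`, `rows`, `cols`, `values`) are illustrative, not the exact keys my firmware emits:

```javascript
// Sketch of the JSON frame format, assuming the pressure map is sent as a
// flat 1,600-element array plus a millisecond timestamp. Field names are
// illustrative.
function buildFrame(values) {
  if (values.length !== 1600) throw new Error("expected 1600 readings");
  return JSON.stringify({
    ts: Date.now(), // milliseconds, used for synchronization
    rows: 40,
    cols: 40,
    values, // 0–1023 per cell
  });
}
```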
To send the data in real time, I implemented a WebSocket server on the device.
This allows a Node.js client to connect over the local network and receive continuous JSON streams with low latency.
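On the receiving side, each WebSocket message carries one JSON frame. A minimal handler might validate the frame and measure its age against the local clock; the `ts` and `values` field names here are assumptions, not the firmware's actual keys:

```javascript
// Sketch of a client-side frame handler, assuming each message is a JSON
// object with a millisecond timestamp `ts` and a flat 1,600-element
// `values` array (illustrative names).
function parseFrame(raw) {
  const frame = JSON.parse(raw);
  if (!Array.isArray(frame.values) || frame.values.length !== 1600) {
    throw new Error("malformed frame");
  }
  // Age of the frame relative to the receiver's clock, useful for
  // spotting latency spikes on the local network.
  frame.ageMs = Date.now() - frame.ts;
  return frame;
}

// In the real client this would run once per WebSocket message, e.g.:
// socket.on("message", (data) => handle(parseFrame(data)));
```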

Node.js Translation Layer
On the Node.js side, I built a small application that:
- Receives JSON data from the matrix through the WebSocket connection.
- Interprets the touch patterns and identifies active pressure zones.
- Translates these zones into MIDI note events using standard Note On/Off messages.
Each touch corresponds to a note, and the pressure level determines velocity (note intensity). The more pressure applied, the stronger the note — just like expressive keyboard dynamics.
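The translation step can be sketched as a diff between consecutive frames: cells crossing a noise threshold trigger Note On, cells falling below it trigger Note Off, and the 10-bit pressure reading is scaled into MIDI's 7-bit velocity range. The threshold, base note, and cell-to-note folding below are assumed values for illustration, not my actual mapping:

```javascript
// Sketch of the touch-to-MIDI translation. THRESHOLD, BASE_NOTE, and the
// note mapping are illustrative assumptions.
const THRESHOLD = 100; // assumed noise floor on the 0–1023 readings
const BASE_NOTE = 36;  // assumed lowest note (C2)

function pressureToVelocity(p) {
  // Scale the 10-bit reading into MIDI velocity (1–127).
  return Math.max(1, Math.min(127, Math.round((p / 1023) * 127)));
}

function framesToMidiEvents(prev, curr) {
  const events = [];
  for (let i = 0; i < curr.length; i++) {
    const wasOn = prev[i] > THRESHOLD;
    const isOn = curr[i] > THRESHOLD;
    // Fold the 1,600 cells into a playable range (illustrative).
    const note = BASE_NOTE + (i % 64);
    if (!wasOn && isOn) {
      events.push({ type: "noteOn", note, velocity: pressureToVelocity(curr[i]) });
    } else if (wasOn && !isOn) {
      events.push({ type: "noteOff", note, velocity: 0 });
    }
  }
  return events;
}
```

Diffing frames rather than re-emitting every active cell keeps the MIDI stream sparse: a sustained press produces one Note On, not a message per scan.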
Scale Filtering & Chromatic Synthesis
After generating raw MIDI data, the Node app filters the notes through musical scales — major, minor, pentatonic, and chromatic modes — depending on the selected profile.
This means the user can “play” the matrix freely while still staying in key.
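One way to sketch this filter is to snap each raw MIDI note down to the nearest pitch in the selected scale (chromatic mode passes everything through unchanged); the scale tables below are standard interval sets, but the profile names and root handling are assumptions about my implementation:

```javascript
// Sketch of the scale filter: snap a raw MIDI note down to the nearest
// lower pitch in the selected scale. Profile names are illustrative.
const SCALES = {
  major:      [0, 2, 4, 5, 7, 9, 11],
  minor:      [0, 2, 3, 5, 7, 8, 10],
  pentatonic: [0, 2, 4, 7, 9],
  chromatic:  [0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11],
};

function quantizeToScale(note, scaleName = "major", root = 0) {
  const scale = SCALES[scaleName];
  const degree = (((note - root) % 12) + 12) % 12;
  // Walk down semitone by semitone until we land on a scale degree.
  let offset = 0;
  while (!scale.includes((degree - offset + 12) % 12)) offset++;
  return note - offset;
}
```

For example, C♯ (MIDI 61) played against C major would be pulled down to C (60), so free playing across the surface still lands in key.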
The filtered notes are then sent to a software synthesizer, which converts the MIDI signals into sound. The result is an expressive, fully touch-driven instrument — part controller, part synthesizer, capable of chromatic and dynamic expression.
Result & Future Ideas
The matrix performs beautifully. It reacts instantly to touch and weight, turning simple gestures into real-time sound synthesis.
It’s not just a controller — it’s a musical surface, sensitive enough to express emotion through pressure and movement.
Future updates will include:
- Polyphonic gesture tracking
- Dynamic mapping of zones to instruments
- Custom scale and tuning options
- Visualization of pressure maps in real time