Live Set Viewer has a simple two-part architecture: a Max for Live device running inside Ableton Live, and a web application running in your browser. Understanding how they communicate helps explain why the viewer behaves the way it does and what its limitations are.
The Live Object Model
Everything in an Ableton Live session — tracks, devices, clips, parameters, routing — is represented as a tree of objects called the Live Object Model (LOM). The LOM is the same API that Max for Live devices use to interact with Live programmatically.
The Live Set Viewer device walks this tree to build a structural snapshot of your session. It reads properties like track names, colors, mute/solo/arm states, device names, device types, and chain structures. The result is a SetModel — a JSON-serializable tree of tracks, devices, chains, and sends that captures the shape of your session at a point in time.
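To make the idea concrete, here is a sketch of what such a JSON-serializable tree might look like for a tiny session. The field names (tracks, devices, chains, sends, and so on) are illustrative assumptions, not the device's actual schema:

```javascript
// Hypothetical SetModel for a two-track session. Field names are
// illustrative; the real schema may differ.
const setModel = {
  name: "My Session",
  tracks: [
    {
      id: 1,
      name: "Drums",
      color: "#FF5500",
      mute: false, solo: false, arm: true,
      devices: [
        { id: 10, name: "Drum Rack", type: "instrument",
          chains: [{ id: 100, name: "Kick", devices: [] }] },
      ],
      sends: [{ target: "Reverb Return", amount: 0.25 }],
    },
    { id: 2, name: "Bass", color: "#0055FF",
      mute: false, solo: false, arm: false, devices: [], sends: [] },
  ],
};

// The whole model round-trips through JSON, which is what lets the
// device ship it to the browser as a single payload.
const copy = JSON.parse(JSON.stringify(setModel));
console.log(copy.tracks.length); // 2
```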
Two-phase communication
The device communicates with the browser using a two-phase approach: an initial snapshot delivered over HTTP, followed by incremental updates streamed over WebSocket.
Phase 1: Snapshot
When you first open http://localhost:19741 in your browser, the page loads a small React application. That application immediately opens a WebSocket connection back to the device’s built-in server. On connection, the server sends a snapshot message containing the full SetModel — every track, every device, every chain, all at once.
This gives the viewer everything it needs to render the complete graph immediately, without waiting for individual pieces to trickle in.
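The browser side of this phase can be sketched as follows. The message shape (a type field distinguishing the snapshot from later updates) is an assumption for illustration, not the viewer's actual wire protocol:

```javascript
// Minimal client sketch: take the full model from the first snapshot
// message, then hand every later message to an update handler.
// Message field names here are illustrative assumptions.
function createClient(socket, onModel, onUpdate) {
  let model = null;
  socket.onmessage = (event) => {
    const msg = JSON.parse(event.data);
    if (msg.type === "snapshot") {
      model = msg.setModel;   // full tree: render everything at once
      onModel(model);
    } else {
      onUpdate(model, msg);   // incremental change: patch in place
    }
  };
}
```

In a real client, `socket` would be `new WebSocket("ws://localhost:19741")` (or whatever port the device serves on), and `onModel` would trigger the initial React render.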
Phase 2: Streaming updates
After the initial snapshot, the device watches for changes in the Live Object Model and sends targeted messages over the same WebSocket connection:
- update messages carry property changes for a specific object (for example, a track being renamed or muted). The viewer applies these changes to its in-memory model without replacing the whole tree.
- refresh messages carry a replacement subtree when structural changes occur (for example, a device being added or removed from a track).
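The difference between the two message kinds can be sketched as a small reducer. The field names here (trackId, props, subtree) are assumptions for illustration, not the real protocol:

```javascript
// Sketch: apply a streamed message to an in-memory model.
// "update" patches properties on one object; "refresh" swaps a subtree.
function applyMessage(model, msg) {
  const track = model.tracks.find((t) => t.id === msg.trackId);
  if (!track) return model;
  if (msg.type === "update") {
    Object.assign(track, msg.props);   // e.g. { mute: true } or { name: "Bass 2" }
  } else if (msg.type === "refresh") {
    track.devices = msg.subtree;       // structural change: replace the device list
  }
  return model;
}

const model = { tracks: [{ id: 1, name: "Bass", mute: false, devices: [] }] };
applyMessage(model, { type: "update", trackId: 1, props: { mute: true } });
console.log(model.tracks[0].mute); // true
```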
This two-phase design keeps the viewer responsive. The snapshot ensures you see something meaningful the moment the page loads. The streaming updates keep the view accurate without polling or manual refreshing.
Why snapshot plus stream?
An alternative approach would be to stream every piece of data individually from the start, building the graph incrementally as messages arrive. The problem is that the Live Object Model can be large — a session with dozens of tracks and hundreds of devices would take noticeable time to stream piece by piece, and the viewer would render in a disorienting, piecemeal way.
Another alternative would be to poll the entire session on a timer, replacing the graph every few seconds. This would be simple but wasteful — most of the time, nothing changes, and when something does change, you would not see it until the next poll interval.
The snapshot-plus-stream approach gives you the best of both: instant full rendering on load, and immediate updates as changes happen.
The device internals
Inside the Max for Live device, two scripts work together:
- lom-handler runs in Max’s scripting environment and has direct access to the LiveAPI (the JavaScript interface to the Live Object Model). It handles low-level operations: reading properties, listing children, and setting up observation callbacks.
- The Node.js server (running via Max's node.script object) builds the session model by sending commands to the lom-handler, serves the web application over HTTP, and manages WebSocket connections to browser clients.
The two scripts communicate through Max’s message-passing system. The Node.js server sends requests (like “get the name of track 3”) and the lom-handler responds with results. This separation exists because LiveAPI can only be accessed from Max’s scripting environment, not directly from Node.js.
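Because every LiveAPI call has to round-trip through Max's one-way message passing, the Node.js side needs some way to match each reply to the request that caused it. A common pattern is to tag requests with an id and keep a map of pending callbacks. This is a sketch under assumed names, not the device's actual code:

```javascript
// Request/response correlation over a one-way message channel.
// `send` pushes a message toward the lom-handler; call `handleReply`
// when the handler's answer comes back. All names are illustrative.
function createRequester(send) {
  let nextId = 0;
  const pending = new Map();

  return {
    // e.g. request("get_property", { path: "live_set tracks 3", property: "name" }, cb)
    request(command, args, callback) {
      const id = nextId++;
      pending.set(id, callback);
      send({ id, command, args });
    },
    // Match a reply to its original request by id, then forget it.
    handleReply({ id, result }) {
      const callback = pending.get(id);
      if (callback) {
        pending.delete(id);
        callback(result);
      }
    },
  };
}
```

The id also lets many requests be in flight at once, which matters when walking a large session: the server can fire off a batch of property reads without waiting for each reply before sending the next.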
Further reading
- Live Object Model reference — Cycling ‘74’s documentation for the object model that Live Set Viewer reads
- Visualization reference — what each node type and visual encoding means in the graph
- Why Live Set Viewer — the problem this tool solves and why a visual graph helps