Real-Time Canvas — 3D Texture Mode
Real-Time Canvas (3D Texture Mode) turns collaborative drawing into a live texture ready for 3D materials. Each room generates a continuous stream you can apply directly onto a UV-mapped model in TouchDesigner for lookdev, shows, and rapid prototyping of "skinnable" assets.
How it works (3 steps)
Open the app — your room appears in the UI (copy/share the link).
Draw on a 512×512 canvas — while you draw, frames stream live; on release, a clean texture output (PNG) is generated for stable mapping and compositing.
Connect TouchDesigner (3D material) — use a WebSocket DAT to
wss://…/ws?room=YOUR_CODE
Receive the canvas and route it into your material's basecolor/emission/opacity chain (PBR or stylized), so the model updates instantly.
Why use it (business mode ON)
Instant 3D lookdev: iterate textures live without re-exporting anything.
Audience-driven skins: perfect for participatory installations (people "paint" the object live).
Fast pipeline to mapping/LED: the 3D object can go straight into your render/output pipeline.
Clean delivery: live frames for responsiveness + a final PNG for consistency (no "dirty" edges).
TouchDesigner: ready to go (3D edition)
We include a preconfigured .toe focused on 3D texturing:
websocket1 already targets the WSS endpoint — only change the room.
Python callbacks installed (live frames + final PNG).
A TOP output ready to feed a PBR Material / GLSL / Custom MAT network.
Optional: quick toggles for basecolor / emission / alpha so you can art-direct the shader without touching the app.
Best use cases
Live "wrap" textures on products, props, masks, characters (UV-mapped).
Workshops: teach UV/material basics with instant feedback.
Shows: VJ-style texture improvisation on 3D objects.
More info & discussion: https://mappingon.es/el-futuro-de-la-creatividad-arte-y-tecnologia/
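The basecolor / emission / alpha toggles can be sketched as a small mapping from toggle names to material parameters. This is a hedged illustration, not the shipped .toe logic: the toggle dict is a hypothetical stand-in for the component's custom parameters, and the parameter names assume TouchDesigner's built-in PBR MAT (check your MAT's parameter dialog before relying on them).

```python
# Hedged sketch: map the channel toggles (basecolor / emission / alpha)
# to texture-map parameter names. Names assume TouchDesigner's PBR MAT
# and may need adjusting for your material network.
PBR_PARAMS = {
    "basecolor": "basecolormap",
    "emission": "emitmap",
    "alpha": "alphamap",
}

def active_targets(toggles):
    """Return the material parameter names enabled by the toggles."""
    return [PBR_PARAMS[name] for name, on in toggles.items()
            if on and name in PBR_PARAMS]

# Inside TouchDesigner you would then point each returned parameter at
# the canvas TOP, e.g. (hypothetical operator names):
#   for p in active_targets(toggles):
#       op('pbr1').par[p] = 'canvas_in'
```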
Real-Time Canvas is a live drawing app built for shows, workshops, and rapid creative prototyping. Every user gets a unique room (shareable URL) to collaborate with a team or integrate seamlessly with TouchDesigner.
How it works (3 steps)
Open the app — your room appears in the UI (copy/share the link).
Draw on a 512×512 canvas — while you draw, live updates stream in real time; on release, a transparent PNG is generated for clean compositing.
Connect TouchDesigner — point a WebSocket DAT at
wss://…/ws?room=YOUR_CODE
and add a Movie File In TOP (canvas_in) to see the canvas instantly.
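The connection URL from step 3 can be built programmatically if you manage rooms in code. A minimal sketch, assuming only the /ws path and room query parameter shown above; the host is whatever your deployment uses (elided as "…" in this doc), so it is left as a parameter rather than guessed:

```python
# Hedged sketch: build the WebSocket DAT address for a given room.
# Only the /ws path and ?room= query come from the docs above; the
# host must be supplied by you.
from urllib.parse import urlencode

def room_ws_url(host, room_code):
    """Return wss://<host>/ws?room=<room_code>, URL-encoding the code."""
    return f"wss://{host}/ws?" + urlencode({"room": room_code})
```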
Why use it
Low latency with isolated rooms.
Stylus-friendly and great on touch devices.
Direct pipeline to TouchDesigner and media servers.
Perfect for participatory performances, installations, and generative visuals.
TouchDesigner: ready to go
We include a preconfigured .toe file:
websocket1 already points to the WSS endpoint — only change the room.
Python callbacks are installed (receive live JPEG frames, final PNG, and the prompt).
A Movie File In TOP (canvas_in) and a Text component to display the prompt.
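The callbacks above handle three kinds of incoming data: live JPEG frames, the final PNG, and the prompt text. A hedged sketch of that dispatch follows; the wire format here (a JSON envelope with "type" and base64 "data" fields) is an assumption for illustration, and the real .toe callbacks may use a different schema:

```python
# Hedged sketch: route one incoming WebSocket message to a handler.
# The JSON/base64 envelope is a hypothetical schema, not the app's
# documented protocol.
import base64
import json

def handle_message(raw, save_frame, save_final, show_prompt):
    """Dispatch a message to the matching handler; return its type."""
    msg = json.loads(raw)
    kind = msg.get("type")
    if kind == "frame":      # live JPEG frame while drawing
        save_frame(base64.b64decode(msg["data"]))
    elif kind == "final":    # transparent PNG generated on release
        save_final(base64.b64decode(msg["data"]))
    elif kind == "prompt":   # prompt text for the Text component
        show_prompt(msg["text"])
    return kind
```

In the .toe, `save_frame`/`save_final` would write into `canvas_in`'s source file and `show_prompt` would update the Text component.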
Automatic saving: when an image arrives, the script creates/updates a folder next to your .toe (by default ./inbox/) and writes the canvas image there. If that path isn’t writable, it falls back to Documents/Real-Time-Canvas or the system temp folder.
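The fallback order described above can be sketched as a small helper. The three candidate locations come from the text (./inbox next to the .toe, then Documents/Real-Time-Canvas, then the system temp folder); the writability probe is my own addition for illustration:

```python
# Hedged sketch of the save-folder fallback: try each candidate in
# order, return the first one we can create and write into.
import tempfile
from pathlib import Path

def pick_inbox(toe_dir):
    """Return the first writable save folder, creating it if needed."""
    candidates = [
        Path(toe_dir) / "inbox",                          # next to the .toe
        Path.home() / "Documents" / "Real-Time-Canvas",   # first fallback
        Path(tempfile.gettempdir()) / "Real-Time-Canvas", # last resort
    ]
    for d in candidates:
        try:
            d.mkdir(parents=True, exist_ok=True)
            probe = d / ".write_test"
            probe.touch()
            probe.unlink()
            return d
        except OSError:
            continue
    return Path(tempfile.gettempdir())
```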