ACID\\BOARD


This project was developed during the 3.0 Interactive AI Video Program by Daydream.live. It's at the earliest stage; the following is an outline of where I landed by the end of the cohort.

ACID\\BOARD is an interactive, realtime AI-driven art screen that behaves like a living surface: it listens, interprets, and continuously re-renders itself in response to the room. Gestures, bodies, and ambient signals become inputs for a feedback loop of shifting textures, images, and synthetic “moods,” blurring the line between interface and atmosphere.

mockup image


For this stage, I vibe-coded a modified version of the WebGL fluid simulation written by @paveldogreat, driving motion in the sim with hand tracking and microphone input. The browser output is captured with OBS and then fed into Scope via Spout.

https://github.com/PavelDoGreat/WebGL-Fluid-Simulation.
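As a sketch of the mapping involved, a hand landmark's frame-to-frame movement can be converted into a position plus a velocity-like force, which is the shape of input a splat-style fluid sim consumes. The function and constant below are hypothetical illustrations (names and scaling are mine, not taken from the sim's code):

```python
# Hypothetical sketch: convert normalized hand positions (e.g. from a
# hand-tracking library) into splat-style parameters for a fluid sim.
# FORCE_SCALE is an assumed tuning constant, not a value from the project.

FORCE_SCALE = 600.0  # amplifies small hand movements into visible force

def hand_to_splat(prev, curr):
    """Map two consecutive normalized (x, y) hand positions to a splat.

    Returns (x, y, dx, dy): a position in [0, 1] plus a force vector
    derived from the movement between frames.
    """
    x, y = curr
    dx = (curr[0] - prev[0]) * FORCE_SCALE
    dy = (curr[1] - prev[1]) * FORCE_SCALE
    return x, y, dx, dy

# a small rightward-and-upward hand movement between two frames
x, y, dx, dy = hand_to_splat((0.50, 0.50), (0.52, 0.47))
```

Per-frame, the loop would feed each tracked fingertip through a mapping like this and inject the result into the sim.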

Following this, I've started to implement input via TouchDesigner, using setups I learned during workshops by Andrew Sun in the cohort.

hand tracking and mic input drive the liquid sim in the browser, which is then captured by OBS
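The microphone side of the loop can be sketched in the same spirit: loudness (RMS over a short sample window) maps to a splat radius or force strength. This is an illustrative assumption about the mapping, not the project's actual code; `rms_to_radius` and its ranges are hypothetical.

```python
import math

# Hypothetical sketch: map microphone loudness to a splat radius.
# Samples are assumed to be floats in [-1.0, 1.0].

def rms(samples):
    """Root-mean-square amplitude of one audio window."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def rms_to_radius(level, min_r=0.05, max_r=0.6):
    """Linearly map an RMS level in [0, 1] to a splat radius, clamped."""
    level = max(0.0, min(1.0, level))
    return min_r + (max_r - min_r) * level

quiet = rms_to_radius(rms([0.0, 0.0, 0.0, 0.0]))   # silence -> min radius
loud = rms_to_radius(rms([1.0, -1.0, 1.0, -1.0]))  # full scale -> max radius
```

A nonlinear curve (e.g. squaring the level) is a common tweak so quiet room noise doesn't constantly agitate the sim.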

I set up a seven-image dataset to train a LoRA and guide the visual style, using the AI Toolkit from Ostris: https://github.com/ostris/ai-toolkit
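For a dataset this small, each training image is typically paired with a same-named `.txt` caption file, the convention AI Toolkit and similar trainers use. A minimal sketch of verifying that pairing (the helper name `check_dataset` and the demo folder are mine):

```python
import tempfile
from pathlib import Path

IMAGE_EXTS = {".png", ".jpg", ".jpeg", ".webp"}

def check_dataset(folder):
    """Return (paired, missing): image files with and without a caption.

    Assumes the common trainer convention of `img001.png` + `img001.txt`;
    adjust if your trainer config expects a different layout.
    """
    folder = Path(folder)
    paired, missing = [], []
    for img in sorted(folder.iterdir()):
        if img.suffix.lower() not in IMAGE_EXTS:
            continue
        caption = img.with_suffix(".txt")
        (paired if caption.exists() else missing).append(img.name)
    return paired, missing

# demo: a throwaway folder with one captioned and one uncaptioned image
demo = Path(tempfile.mkdtemp())
(demo / "a.png").write_bytes(b"")
(demo / "a.txt").write_text("fluid texture, saturated gradients")
(demo / "b.png").write_bytes(b"")
paired, missing = check_dataset(demo)
```

Running a check like this before training catches silently uncaptioned images, which matter much more in a seven-image set than in a large one.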

  

LoRA training samples


The result runs locally on an RTX 4090 at 15–21 fps.

Here is a link to the workflow: https://app.daydream.live/workflows/mattkeff/supersquish-lora

And here is the LoRA: https://huggingface.co/mattkeff/SUPERSQUISH