Cellular Automata, Custom LoRAs, and Real-Time AI Video



Scope Track Project | Daydream Interactive AI Video Program - Custom LoRAs and Cellular Automata Plug-In

https://github.com/diegochavez-io/cellular-automata_plug-in

https://huggingface.co/diegochavez/daydream-scope-loras

First, I trained four custom LoRAs for Krea Realtime on an RTX PRO 6000. Small datasets (14-17 clips each), with rich captions describing the motion and texture I wanted the model to learn. About $10-12 and 5-6 hours per LoRA.

Trained with Ostris's AI Toolkit | https://github.com/ostris/ai-toolkit
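Most video LoRA trainers expect each clip to be paired with a caption file. A minimal sketch of validating that kind of dataset, assuming a simple same-name `.mp4`/`.txt` layout (the folder structure here is illustrative, not ai-toolkit's required format):

```python
from pathlib import Path

def check_video_dataset(root):
    """Verify every training clip has a matching caption .txt file.
    Returns (clip count, list of clips missing captions)."""
    root = Path(root)
    clips = sorted(root.glob("*.mp4"))
    missing = [c.name for c in clips if not c.with_suffix(".txt").exists()]
    return len(clips), missing

# Example: build a toy 14-clip dataset where every clip is captioned
root = Path("demo_dataset")
root.mkdir(exist_ok=True)
for i in range(14):
    (root / f"clip_{i:02d}.mp4").touch()  # stand-in for a real clip
    (root / f"clip_{i:02d}.txt").write_text(
        "liquid chrome waves, teal and orange, constantly morphing"
    )
n, missing = check_video_dataset(root)
```

A check like this catches the silent failure mode where an uncaptioned clip trains against an empty prompt.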

AnimateDiff - Original dataset footage

This LoRA has been evolving with me since Stable Diffusion 1.5. I later trained it on AnimateDiff to generate this synthesized motion, and now it's been retrained on Wan 2.1 T2V-14B for real-time video generation.

Step 1000

Step 1000 output is where the LoRA locked in. The motion was fast and hard to capture, but this checkpoint nailed the liquid chrome flow I was after. Teal and orange waves are constantly morphing, never settling.

AnimateDiff LoRA in the Daydream Scope + Krea Realtime 14B pipeline

TouchDesigner → CRT → analog glitch → Blackmagic 6K recapture

I created visuals in TouchDesigner, output them through a real CRT television, ran the signal through analog glitch hardware, then recaptured it all with a Blackmagic Cinema 6K camera, capturing real phosphor glow, scan lines, and signal degradation. 17 clips of pure analog texture fed into the model.

Step 1000

The CRT aesthetic started coming through around step 700. Chromatic aberration, phosphor bleed, and the scan line texture. It was already usable here, but I pushed it to step 1000 for the final checkpoint. A little more baked in, a little more committed to the look.

CRT LoRA in the Daydream Scope + Krea Realtime 14B pipeline

Demo of the LoRA kicking in with text as input.
Input video as source with the LoRA applied.

After building my LoRA datasets, I wanted them to be driven by cellular automata. A living, constantly evolving organism that would never repeat or go still. I prototyped the system in Python with Claude Code, starting from Lenia and SmoothLife simulations and layering in flow fields, multi-zone color, and radial containment until each organism felt alive.
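The core of a Lenia-style simulation is a convolution with a ring-shaped kernel followed by a growth function that pushes each cell toward or away from life. A minimal NumPy sketch of one such update step (kernel shape and growth parameters here are illustrative, not the exact values used in the plugin):

```python
import numpy as np

def ring_kernel(size, radius=13):
    """Smooth ring-shaped neighborhood kernel, normalized to sum to 1."""
    y, x = np.ogrid[-size // 2:size // 2, -size // 2:size // 2]
    d = np.sqrt(x * x + y * y) / radius          # distance in kernel radii
    mask = (d > 0) & (d < 1)
    k = np.zeros_like(d, dtype=np.float64)
    k[mask] = np.exp(4 - 1.0 / (4 * d[mask] * (1 - d[mask])))
    return k / k.sum()

def lenia_step(world, kernel_fft, dt=0.1, mu=0.15, sigma=0.015):
    """One Lenia update: circular convolution via FFT, then a
    Gaussian growth function peaking where the potential equals mu."""
    u = np.real(np.fft.ifft2(np.fft.fft2(world) * kernel_fft))
    growth = 2.0 * np.exp(-((u - mu) ** 2) / (2 * sigma ** 2)) - 1.0
    return np.clip(world + dt * growth, 0.0, 1.0)

size = 128
world = np.random.rand(size, size)
kernel_fft = np.fft.fft2(np.fft.fftshift(ring_kernel(size)))
for _ in range(10):
    world = lenia_step(world, kernel_fft)
```

Because the convolution runs in Fourier space, the world wraps toroidally, which is part of why these organisms never hit a wall and go still.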

While it was enjoyable to get everything up and running, the main objective was to implement this system within Daydream Scope as a real preprocessor, allowing it to feed frames directly into Krea Realtime 14B alongside my custom LoRAs.

Turning it into a plugin meant stripping out pygame, outputting tensors instead of pixel buffers, and figuring out Scope's pipeline contract.
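The conversion itself is mostly a layout change: instead of blitting uint8 pixel buffers to a window, each frame becomes a normalized float tensor in the (batch, channel, height, width) layout diffusion pipelines expect. A rough sketch, assuming a PyTorch-based pipeline; the class and method names are hypothetical and Scope's actual preprocessor contract may differ:

```python
import numpy as np
import torch

class CAFrameSource:
    """Headless CA frame source sketch (hypothetical interface)."""

    def __init__(self, size=64, device="cpu"):
        # float32 H x W x 3 world in [0, 1]
        self.world = np.random.rand(size, size, 3).astype(np.float32)
        self.device = device

    def step(self):
        # Placeholder for a real CA update (Lenia / SmoothLife / etc.)
        noise = np.random.randn(*self.world.shape).astype(np.float32) * 0.01
        self.world = np.clip(self.world + noise, 0.0, 1.0)

    def next_tensor(self):
        """Return a (1, 3, H, W) float tensor in [0, 1] instead of a
        pygame surface / uint8 pixel buffer."""
        self.step()
        frame = torch.from_numpy(self.world)          # H, W, C
        return frame.permute(2, 0, 1).unsqueeze(0).to(self.device)

src = CAFrameSource(size=64)
t = src.next_tensor()
```

Keeping the simulation in float from the start avoids a uint8 round-trip and the quantization banding it would bake into every frame.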

Deployed it to a RunPod RTX 5090. The cellular automata drives the motion, the LoRA drives the aesthetic. Math feeds a neural network in real time!

First frames from the CA plugin running live in Daydream Scope.

Three days from a pygame prototype to a working preprocessor feeding Krea Realtime 14B.

The plugin is open source, with four CA engines, all controllable live from the Scope UI. Install it directly from GitHub: https://github.com/diegochavez-io/cellular-automata_plug-in
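One simple way to make engines swappable from a live UI is a name-keyed registry, so a dropdown change just reroutes which update function runs on the next frame. A toy sketch of that pattern (the plugin ships four engines; two placeholder entries are shown here, and the real plugin's API may differ):

```python
ENGINES = {}

def register(name):
    """Decorator registering a CA engine under a UI-selectable name."""
    def deco(fn):
        ENGINES[name] = fn
        return fn
    return deco

@register("lenia")
def lenia_step(world):
    return world  # placeholder for the real Lenia update

@register("smoothlife")
def smoothlife_step(world):
    return world  # placeholder for the real SmoothLife update

def run_step(engine_name, world):
    # The UI selection only changes engine_name between frames,
    # so switching engines never interrupts the frame loop.
    return ENGINES[engine_name](world)

out = run_step("lenia", 1.0)
```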

I got the plugin working, generating cellular automata as video input, stylized by my LoRAs!