Frost Bytes



Daydream AI Video Program - Scope Track Project 

"Frost Bytes" is an immersive Unreal Engine 5 virtual production in which X4NTHA performs a live-coded Tidal Cycles electronic music set, orchestrating a soundscape that is visually mirrored by a real-time generative AI video stream of an abstract Aurora Borealis. The project explores the interplay between rigid algorithmic code and fluid, dreamlike AI imagery, creating a cozy atmosphere that turns technical limitations (such as video artifacts) into an immersive artistic statement. The pipeline synchronizes OSC audio data and prompt text, generated in real time from the Tidal Cycles code, with a remote WebRTC AI video stream to drive reactive lighting and skybox textures, resulting in a cohesive, latency-managed audiovisual broadcast.

Conceptual Mock-up

Technical Architecture

  • Core Engine: Unreal Engine 5.7.
  • Audio/Control: Tidal Cycles → SuperCollider → OSC (LAN) → UE5 OSC Plugin.
  • Code Display: Secondary Machine IDE (Neovim) → HDMI Capture Card → UE5 Media Player Framework.
  • AI Visuals: UE5 OSC/prompt processing → RunPod (Daydream Scope API) → WebRTC Stream → UE5 C++ YUV decode/HLSL node mapping logic.
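To make the OSC leg of this pipeline concrete, here is a minimal sketch of how a control message could be hand-encoded and sent over the LAN to the engine. This is illustrative only: the `/fb/density` address and port 8000 are assumptions, not the project's actual OSC schema.

```python
import socket
import struct

def osc_string(s: str) -> bytes:
    """OSC strings are NUL-terminated and padded to a 4-byte boundary."""
    b = s.encode("ascii") + b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def osc_message(address: str, *args: float) -> bytes:
    """Encode a minimal OSC message carrying float32 arguments."""
    msg = osc_string(address)
    msg += osc_string("," + "f" * len(args))   # type-tag string, e.g. ",f"
    for a in args:
        msg += struct.pack(">f", a)            # big-endian float32
    return msg

# Example: a note-density value headed for the UE5 OSC Plugin
# (the address and port are hypothetical)
pkt = osc_message("/fb/density", 0.75)
with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
    s.sendto(pkt, ("127.0.0.1", 8000))
```

In practice SuperCollider emits these packets for you; the point of the sketch is just the wire format the UE5 plugin receives.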

I am working toward an MVP implementation where UE5 ingests OSC data to actively drive a localhost Scope instance, resulting in correctly rendered video within the engine.

Frost Bytes – Technical Progress Update

Development has focused heavily on building Muxurana, a custom, soon-to-be open-sourced middleware designed to bridge the gap between a remote WebRTC video stream and real-time virtual production. Rather than relying on browser sources or resource-intensive WebRTC conversion inside UE5, Muxurana is a headless Python application that uses aiortc for native WebRTC ingestion and ndi-python for local broadcasting over NDI. It features a threaded frame-processing pipeline that offloads CPU-heavy color-space conversion (YUV to BGRX), allowing the main event loop to maintain responsive, low-latency communication with the RunPod Scope API while delivering fluid video to Unreal Engine 5. I have also implemented a local TCP command server within the bridge, allowing UE5 to drive the AI generation parameters directly via Blueprint logic and closing the loop between the game engine and the cloud model.
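The threading split can be sketched in a few lines: a worker thread does the arithmetic-heavy color conversion while the main loop (in Muxurana, the aiortc event loop) stays free. The per-pixel BT.601 math and the tiny two-pixel "frame" below are illustrative assumptions, not Muxurana's actual code.

```python
import queue
import threading

def yuv_to_bgrx(y: int, u: int, v: int) -> tuple:
    """Convert one pixel from YUV to BGRX using BT.601 coefficients."""
    clamp = lambda x: max(0, min(255, int(round(x))))
    r = clamp(y + 1.402 * (v - 128))
    g = clamp(y - 0.344136 * (u - 128) - 0.714136 * (v - 128))
    b = clamp(y + 1.772 * (u - 128))
    return (b, g, r, 255)                       # X byte kept opaque

def conversion_worker(in_q: queue.Queue, out_q: queue.Queue) -> None:
    """Runs on its own thread so the main event loop never blocks."""
    while True:
        frame = in_q.get()
        if frame is None:                       # sentinel: shut down
            break
        out_q.put([yuv_to_bgrx(*px) for px in frame])

in_q, out_q = queue.Queue(maxsize=4), queue.Queue()  # maxsize = backpressure
threading.Thread(target=conversion_worker, args=(in_q, out_q),
                 daemon=True).start()

in_q.put([(128, 128, 128), (255, 128, 128)])    # two-pixel "frame": grey, white
converted = out_q.get()
in_q.put(None)                                   # stop the worker
print(converted)                                 # [(128,128,128,255), (255,255,255,255)]
```

A bounded input queue is the key design choice: if the converter falls behind, the ingest side blocks (or drops frames) instead of growing memory without limit.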

Muxurana middleware

The project architecture has evolved significantly from the initial brief to a cleaner, single-workstation topology. I have replaced the hardware-based HDMI capture workflow with a custom UDP networking solution: the Pulsar code editor now broadcasts raw text directly to a dedicated C++ Subsystem in UE5 for the "Code Wall" visuals, while Tidal Cycles sends semantic OSC data in parallel to drive the AI prompts. Visually, the scene has pivoted to a reactive "White Room" concept: a monochromatic, frosted-glass environment where Global Illumination and Subsurface Scattering are driven entirely by the incoming audio data and the colors of the AI-generated Aurora stream. This creates a deeply integrated audiovisual experience where the room itself acts as a visualizer for the code being written.
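The editor-to-engine leg is plain UDP text. A loopback sketch of the idea follows, with the receiving socket standing in for the UE5 C++ Subsystem; the real port and any packet framing are not shown here and the Tidal one-liner is just sample payload.

```python
import socket

# Receiver stands in for the UE5 Code Wall subsystem (port 0 = any free port)
rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
rx.bind(("127.0.0.1", 0))
code_wall_addr = rx.getsockname()

def broadcast_buffer(text: str) -> None:
    """Fire-and-forget: send the editor buffer as UTF-8 over UDP."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as tx:
        tx.sendto(text.encode("utf-8"), code_wall_addr)

broadcast_buffer('d1 $ sound "bd*4"')            # a Tidal one-liner
received = rx.recvfrom(65507)[0].decode("utf-8")
rx.close()
print(received)
```

UDP suits this use case because a dropped keystroke update is harmless: the next broadcast of the buffer supersedes it, so no retransmission logic is needed.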

FrostBytes System Architecture Overview

I recently finished the Unreal Engine 5 MuxuranaBridge and CodeDisplay subsystems in C++, which manage data ingress between the game instance and the Pulsar/Python layers. I have also trained an Aurora Borealis style-transfer LoRA for 1.3B video models that drastically improves the appearance and motion of the generative AI window visuals. With the technical "plumbing" around 90% complete, I can now shift my focus primarily to refinement on the visual and creative side of the project. I am confident in this optimized pipeline and look forward to demoing a fully functional "Minimum Lovable Demo" (live-coded music driving real-time AI video and reactive scene lighting) at the mid-program check-in.

Frost Bytes - Final Update

Final Frost Bytes Scene

"Frost Bytes" began as a synesthetic experiment to harmonize the rigid precision of live coding with the ethereal fluidity of generative AI, and it has evolved into a robust, nearly production-ready virtual performance platform. I successfully built a fully immersive Unreal Engine 5 environment where X4NTHA orchestrates a reactive world and generative AI video through live music code. By integrating TidalCycles and SuperCollider directly into the engine's nervous system, I achieved a level of audio-visual synchronicity that goes beyond simple beat detection. The room itself breathes with the bass, the walls pulse with high-frequency energy, and the weather outside shifts dynamically with the density of the notes, creating a cohesive atmosphere where the code is as visible as it is audible.

The technical brain of the project, the Daydream Scope generative AI video pipeline, represents a significant leap forward in real-time storytelling. I successfully implemented a system where the music's mood automatically constructs complex prompts, driving a remote RunPod instance running Daydream Scope. By leveraging the "LongLive" pipeline with a custom-trained Wan2.1 LoRA, I achieved a seamless, infinite video stream of a cyber-aurora that morphs in real-time. The system intelligently handles subject injection and audio analysis, allowing me to seamlessly transition the aurora into different shapes and colors purely through OSC commands sent from the live-coding environment.
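As an illustration of how audio mood might be folded into a prompt, a toy mapping could look like the following; the thresholds, feature names, and wording are assumptions for the sketch, not the project's actual prompt logic.

```python
def build_prompt(density: float, brightness: float,
                 subject: str = "aurora borealis") -> str:
    """Map normalized [0, 1] audio features to a generation prompt."""
    if density < 0.3:
        motion = "calm, slowly drifting"
    elif density < 0.7:
        motion = "flowing, rippling"
    else:
        motion = "turbulent, surging"
    palette = ("deep teal and violet" if brightness < 0.5
               else "electric green and pink")
    return f"{subject}, {motion}, {palette}, seen through frosted glass, seamless loop"

# A dense, bright pattern yields an energetic prompt
prompt = build_prompt(density=0.8, brightness=0.6)
print(prompt)
```

Because the features arrive as OSC floats, any such mapping can be re-evaluated per bar, which is what lets the aurora change shape and color purely from the live-coding environment.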

A glance at a few of Frost Bytes' many UE5 Blueprints.

However, the path to this final architecture was defined by significant evolution and technical pivots. My original middleware, Muxurana, was initially conceived as a Python-based WebRTC/NDI bridge. While promising, it hit severe friction with RunPod's proxy firewall blocking UDP traffic, which made standard WebRTC negotiation impossible. This forced me to scrap the Python implementation and re-engineer the entire networking stack. I ultimately developed a custom Node.js signaling bridge paired with a native C++ subsystem built on Unreal's Pixel Streaming 2 plugin. The RunPod setup works, but I wouldn't choose it over local GPUs for live production or streaming: several times no GPUs were available for my Pod, and migration wasn't permitted. The pivot away from Muxurana, while difficult, resulted in a far superior architecture that supports Trickle ICE, robust connection recovery, and zero-copy texture rendering directly on the GPU.

The result is a system that feels alive and resilient. I moved away from fragile, hard-coded logic to a state-driven C++ Director that smooths erratic data into organic curves. I implemented hotkey-based panic buttons (for actions like cache/pipeline resets), auto-healing connections, and a beautiful screen-like Code Wall that visualizes the performance in real time. What started as a visualizer has become a real instrument: the environment is no longer just a backdrop for X4NTHA, but an active participant in the performance, translating raw data into cozy light, textures, and real-time video with negligible latency. I'm also working on improvements: an initial implementation of linear frame interpolation that smooths the AI stream's framerate to match the Frost Bytes environment itself, plus extra light/material functions that help the room come even more alive.
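The smoothing idea behind a Director of this kind is a standard one-pole low-pass filter (an exponential moving average). A Python sketch of the concept, with an arbitrarily chosen `alpha` (the real Director is C++ and its tuning is not shown here):

```python
class SmoothedParam:
    """One-pole low-pass filter: turns spiky control data into organic curves."""

    def __init__(self, alpha: float = 0.15, start: float = 0.0):
        self.alpha = alpha   # 0 < alpha <= 1; smaller = smoother but laggier
        self.value = start

    def update(self, target: float) -> float:
        # Move a fraction of the remaining distance toward the target each tick
        self.value += self.alpha * (target - self.value)
        return self.value

bass = SmoothedParam(alpha=0.25)
curve = [bass.update(raw) for raw in (0.0, 1.0, 1.0, 0.0, 0.0)]
print([round(v, 3) for v in curve])  # a square pulse becomes a gentle rise/decay
```

Driving light intensity or material parameters from the smoothed value rather than the raw OSC spikes is what makes the room feel like it breathes instead of strobing.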

I want to extend a massive thank you to the Daydream team for hosting this program. The access to extra Scope sessions and office hours was very thoughtful. The exciting challenge of integrating bleeding-edge generative video into a real-time engine pushed this project far beyond its initial scope, but I was happy to put in some long nights to see it all come together. A huge thank you as well to the rest of the cohort: seeing your diverse approaches to AI art, music, and storytelling was a constant source of inspiration throughout these past two weeks. "Frost Bytes" is just the beginning, and I can't wait to see where we all take this technology next, especially as it continues to advance so fast that we can barely keep up!

GitHub Repo

Alpha source code, scripts, and workflow/UE5 tutorials for the project can be found on GitHub (WIP, lots of assembly required). Don't be afraid to reach out with any questions or feedback: this has been a true passion project and I don't expect it to stop here!

https://github.com/X4NTHA/frostbytes