
Check out the latest model drops and powerful integrations.
Following up on yesterday's workshop, I wanted to put together a proper walkthrough of what we showed, because this one's a bit different from our usual real-time video generation.
Two big things in this episode:
1. Scope now supports plugins
This is still in preview (APIs will change, things might break), but it works. The plugin architecture opens Scope up for extension: developers can bring new models, new capabilities, and new interactions into the platform. This is a natural step in Scope's evolution, and we're excited to see what people build with it.
2. World models are here
If you've been using Scope with Krea Realtime or LongLive, you know the flow — give it an image or prompt, get real-time video output. World models flip that around. Instead of generating video, you're generating a navigable 3D environment. WASD to move, mouse to look around. It's like being inside an AI-generated video game.
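To make the interaction model concrete: each frame, your keyboard and mouse state gets turned into an action that drives the next generated frame. The sketch below is purely illustrative — none of these names come from Scope or Waypoint, and the real plugin handles this internally — but it shows the shape of a WASD-plus-mouse control mapping:

```python
def keys_to_action(pressed, mouse_dx=0.0, mouse_dy=0.0):
    """Map pressed keys and mouse deltas to a per-frame action.

    Hypothetical helper (not part of Scope's API): returns
    (forward, strafe, yaw, pitch), where forward/strafe are in [-1, 1]
    and yaw/pitch are look deltas from mouse movement.
    """
    # W/S move forward/back; D/A strafe right/left.
    forward = (1.0 if "w" in pressed else 0.0) - (1.0 if "s" in pressed else 0.0)
    strafe = (1.0 if "d" in pressed else 0.0) - (1.0 if "a" in pressed else 0.0)
    # Mouse deltas become camera look adjustments.
    return (forward, strafe, mouse_dx, mouse_dy)
```

In a world model, an action like this conditions the next frame of video, which is what makes the output feel like a game you're walking through rather than a clip you're watching.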
The scope-overworld plugin brings Overworld's Waypoint-1-Small model into Scope. In the video I walk through the full installation and then explore a few different seed images — including walking on Mars and wandering through a cozy origami Christmas scene I generated with LongLive in a previous episode.
What you can do with this:
The model was largely trained on first-person shooter environments, so seed images that match that aesthetic tend to work best. And yes, there are artifacts - this is the smallest model in the family, and it's early days. But that's the point of building in public. Things will get better.
Here are some links and resources to get you started:
If you try this out and create something cool, drop it in the hub - I want to see what you make.