Creator demos show Soul Cast generating cast candidates inside Higgsfield Cinema Studio, then placing those characters into scenes through Nano Banana references. Watch them if you want casting and shot planning folded into a more structured preproduction workflow.

The launch demo centers on a Cast tab inside Higgsfield Cinema Studio, where creators can build a character by selecting broad production attributes instead of prompting from scratch. The video walks through a setup of female, age 30, 2020s, $250M, and lands on a named character page for Mariana Cruz, where the interface says exclusive rights can be secured.
ProperPrompter's follow-up post adds the practical next step: use the generated actor and an existing avatar as dual references in Nano Banana 2 to place both in the same restaurant scene. The thread also claims lighter constraint-setting produced better results than over-specifying every field, with the creator relying on the randomize option first, then making small edits.
ARQ's tool breakdown is useful because it treats character generation as one step in a larger pipeline. ShotDeck is used for frame study before prompting; Qwen 3 VL for shot-by-shot video analysis; Gemini 3.1 Pro and Claude Opus 4.6 for script and prompt development; Nano Banana Pro for locked references across 300-plus shots; Kling 3 Pro for motion from stills; and Reve for environment comps.
That pipeline mindset matters because ARQ's music video breakdown says one project generated 884 shots across three runs, with only 90 making the cut and each finalist getting a custom motion prompt. Soul Cast looks most relevant at that front end, where consistent characters and rights-managed casting matter before a team starts iterating across hundreds of downstream images and video shots.
This is getting awkward for Hollywood 😬 You can just generate actors with exclusive rights to cast in your AI films. Here's how with Soul Cast from Higgsfield: #higgsfieldpartner
To get my new character in a scene with my avatar, I used them both as references in Nano Banana 2. Raw output:
All the tools we use to make films for Top Tier Brands like: @tether @rumblevideo ShotDeck - 1M+ frames from real films. Search any shot. Study it before you prompt it. Qwen 3 VL - Feed it any MP4. It watches the full video, breaks down every shot. Your personal video …
We generated 884 shots across 3 pipeline runs for one music video. Only 90 made the final cut. Each final shot got a custom dynamic motion prompt. Camera movement, subject motion, ambient layers. All hand-directed. Tomorrow I'm giving away every single shot, prompt and entire …
“Tethered Together Forever” - The Humans We made the first official song + music video for @tether Watch it.