Xiaoyunque opens Short Drama Agent with Seedance 2
A Turkish roundup says Xiaoyunque integrated Seedance 2 into a Short Drama Agent, while access outside China still depends on third-party services or workarounds. Creators are already working through that fragmented access to test train-fight choreography, --sref style remixes, and animations of older images.

TL;DR
- A Turkish roundup thread says Xiaoyunque launched a Short Drama Agent on March 19 with Seedance 2.0 inside it, turning a script into AI characters, voice, storyboards, and scene-by-scene video.
- Outside China, the same roundup thread says access is still fragmented: no fully open global API, with many users relying on third-party services or China-facing workarounds.
- Creator tests already show why people care: one train-fight demo pushes fast camera motion and choreography on a moving train, while a pair of image-animation tests turns an older Midjourney still into short video.
- Seedance is also getting folded into style-driven workflows, with creators pairing it with Midjourney-style --sref looks such as the chrome-wasteland setup shown in an sref remix.
What opened in China
According to the roundup thread, Xiaoyunque's Short Drama Agent is the clearest productization of Seedance 2.0 so far: upload a script, then let the system generate character designs, voice, storyboard beats, and finished scene clips. For creative teams, that matters less as a model release and more as a compressed short-form pipeline aimed at serialized drama production.
The same post says this rollout is uneven. Seedance appears reachable inside Chinese platforms such as Xiaoyunque, but not as a broadly available official global API, which helps explain why many creator demos are appearing through intermediated tools rather than a single public destination.
What creators are already making
One early proof point is action blocking. In the train-fight demo, two characters fight through a moving train interior and exterior with continuous motion, impacts, and acrobatic beats that usually expose temporal errors fast. The result is still short, but it suggests Seedance is strong at keeping momentum readable across a complex shot.
Creators are also using it as an image-to-video finisher rather than a from-scratch generator. DavidmComfort's first test and follow-up clip start from an older Midjourney image and push it through Seedance via Yapper, showing a practical reuse path for dormant image libraries. Another workflow note comes from an Anima Labs thread, which describes a node-based setup combining character import, Nano Banana conversion, reference shots, and Seedance animation tests inside one centralized graph.
Where the workflow still breaks
The most interesting technique trend is style transfer by handoff. In the sref remix, a creator pairs Seedance 2.0 with Midjourney's --sref 2156543800, using a look documented on the style page to steer a glossy post-apocalyptic wasteland aesthetic into motion. That is a useful pattern for artists who already have a stable still-image style profile and want video to inherit it.
The constraint is access and moderation friction. The same roundup says unofficial routes often hit copyright or disallowed-content refusals, which means the tool may be easiest to demo right now in stylized original worlds, old personal image assets, and short proof-of-concept sequences rather than recognizable IP-heavy scenes.