OpenAI said it is winding down the Sora app and will share timelines for the app and API, plus guidance on preserving work, later. HN discussion says developer access and ChatGPT video features may also wind down, so creators should plan how to archive Sora projects.
DualShot Recorder hit the top paid App Store slot by promising simultaneous portrait and landscape capture in one take. Early chatter says the idea is obvious and commercially potent, even as users complain about basic camera controls and broken 4K.
Tutorials show Calico turning listing photos and a Zillow link into 20-to-60-second narrated walkthroughs, then pairing them with AI virtual twilight exteriors. Use the workflow to bundle scripts, music, captions, and upsell stills in minutes for low credit spend.
New Multi-Shot demos showed Runway turning short prompts into 15-second dirt-bike chases, forest ambushes, and dialogue-led sequences. The examples position the web app more clearly as a prompt-to-scene tool, though the evidence is still mostly creator-side tests.
Runway's new web app turns a prompt or starter image into a cut scene with dialogue, sound effects and shot pacing. Creators can now block whole sequences instead of stitching isolated clips.
Seedance 2.0 is now showing up across CapCut Video Studio, Dreamina and Pippit with multi-scene timelines and shot templates. Creators can use it to move from single clips to editable long-form production.
Topaz says Starlight Precise 2.5 improves realism, cuts plastic-looking artifacts and upscales AI video to 4K in Astra, partner apps and API. Use it as a finishing pass when generated footage needs cleanup.
CapCut is expanding the Seedance 2.0 rollout in Dreamina, Topview restored access within 24 hours, and creators are stress-testing the model for vertical repurposing, long prompts and stylized start frames. Try it for fast video conversions, but budget cleanup passes for continuity and transitions.
Riverside's Co-Creator reads transcripts automatically and turns chat-style requests into cuts, captions, thumbnails and social copy from one workspace. Use it when you need fast repurposing without timeline scrubbing, then polish the output by hand.
OpenAI said it is shutting down the Sora app and will share timelines for the app and API, plus instructions for preserving work. Creators should export assets and test replacement tools now if they built remix-heavy video workflows on Sora.
Freepik launched Relight in Pikaso, letting creators transfer lighting from a reference and adjust direction, intensity and color for images and video. Browser-based relighting is moving from rough correction into controllable look development, so test it for production lighting passes.
SentrySearch uses Gemini's native video embeddings to index footage without transcription, find matching scenes fast, and trim clips automatically. Editors can move from natural-language search to selects, rough cuts and future EDL exports with less manual logging.
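The retrieval pattern behind transcription-free search can be sketched as embedding each clip once, embedding the query the same way, and ranking by cosine similarity. This is a minimal sketch, not SentrySearch's implementation: the `embed` function here is a hypothetical keyword-counting stand-in for a real multimodal embedding call.

```python
import math

def cosine(a, b):
    # Standard cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def embed(text_or_clip):
    # Hypothetical stand-in: a real pipeline would send the raw clip or
    # query to a video-capable embedding model instead of counting words.
    vocab = ["beach", "sunset", "interview", "car", "chase"]
    return [float(text_or_clip.count(w)) + 0.1 for w in vocab]

# Index maps clip id -> embedding. Note there is no transcription step;
# the embedding is computed from the footage (here, a toy description).
index = {
    "clip_001": embed("beach sunset drone"),
    "clip_002": embed("studio interview"),
    "clip_003": embed("car chase night"),
}

def search(query, k=1):
    # Rank indexed clips by similarity to the natural-language query.
    q = embed(query)
    ranked = sorted(index, key=lambda cid: cosine(index[cid], q),
                    reverse=True)
    return ranked[:k]
```

The same ranked list is what a tool would turn into selects or a rough cut; swapping the toy `embed` for a real model is the only structural change needed.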
A shared workflow converts GTA-style stills into photoreal images with Nano Banana 2, then animates them in LTX-2.3 Pro 4K using detailed material, skin, vehicle, and camera prompts. Try it for trailer-style previsualization if you want more control at lower cost.
Topview added Seedance 2.0 to Agent V2, pairing multi-scene generation with a storyboard timeline and Business Annual access billed as 365 days of unlimited generations. That moves longform video workflows toward editable sequences instead of stitched clips.
Seedance 2.0 is rolling out through Dreamina on CapCut desktop and web, starting in Southeast Asia plus Brazil and Mexico. Watch region-gated access if you need it now, since U.S. availability is still delayed.
WAR FOREVER released a four-minute D-Day sneak peek, set a June 6 release date, and opened distribution inquiries through NAKID Pictures. Watch it as a benchmark for longer-form AI war scenes where sound and art direction do the heavy lifting.
Users showed Calico turning listing photos plus a property URL into scripted voiceovers, music, image-to-video clips, and captions for about $12 in credits. Try it if you sell marketing deliverables and want a faster way to package real-estate promos.
New Light Studio demos show drag-and-drop browser relighting with dual lights, color-temperature control, presets, and fast mood shifts on plain footage. If you shoot simple plates, this is becoming a practical way to turn one clip into multiple looks without a full VFX relight.
Freepik added Magnific Precision controls to Video Upscaler, including 4K output, a 12-frame preview, and sliders for sharpness, grain, strength, and FPS. Preview first, then push settings only after you know the texture treatment survives motion.
SparkVSR lets you super-resolve a few keyframes and propagate that look across the whole clip, with a reported 24.6% CLIP-IQA lift over baselines. That gives restorers and AI video editors more control than one-click blind upscaling when texture fidelity matters.
SAMA is a new 14B open model for instruction-guided video editing that separates semantic anchoring from motion alignment and claims state-of-the-art open results. Track it if you need edits that change objects or style without wrecking motion.
Adobe now runs Kling 2.5 Turbo inside Firefly and Firefly Boards, and creators quickly posted first tests from the integrated workflow. It keeps image, video, and audio work in one Adobe stack instead of hopping between apps.
Dustin Hollywood published War Forever Part One on Escape and followed it with featurettes teasing Part Two for June 6. The rollout is becoming a live case study in how AI filmmakers can serialize longer work instead of stopping at trailers.
BeatBandit added a full NLE editor so scripts, shot lists, character setup, video generation, and editing can stay in one app. MultiShotMaster also arrived in-browser with 1-to-5-shot generation and node-graph chaining, so test both if you want faster narrative iteration.
Hailuo launched a unified Workspace that keeps image, video, and audio assets in one project flow, and it also teased a 3.0 model for March. Try it if your current pipeline is split across too many tabs, exports, and handoffs.
Stages AI previewed a patent-pending bridge editing system that links shots by motion, color, subject continuity, and screen direction instead of standard transitions. Watch it if you care about AI-native editing tools, not just generation.
InVideo released Dynamic Captions with animated word-by-word styles, custom fonts and colors, and Safe Zone presets for TikTok and Instagram. Apply them early in the edit if captions are carrying retention and platform framing.
Medeo Video Skill released an open-source OpenClaw setup that lets users generate video by chat, add assets, and run jobs asynchronously after a quick API-key install. Try it if you want text-in, video-out workflows without switching across dashboards.
Stages AI updated the VIDX editor with tracking, rotoscope, grading, keyframing, and speed-ramping tools, while users also showed Connect handoffs into OpenClaw and Blender. Use it if you are building an AI-assisted post pipeline instead of relying on one-off generations.
A shared workflow showed how to build a character with Nano Banana 2, generate extra shots, and feed Suno song segments into LTX-2.3 for synced clips. Try it to turn one track into a finished teaser without manual keyframing.
DreamLabLA posted a finished crash shot alongside a behind-the-scenes breakdown covering planning, setup, and final compositing with Luma Agents. Use the paired clips as a template for where generative agents fit inside a practical VFX pipeline.