Adobe Firefly rolled out its Precision Flow beta and AI Markup inside Edit, letting users prompt specific regions and parameters instead of regenerating full images. Access is staged and generations still consume credits, so test the workflow before moving production edits into it.

Adobe's official framing is in its March image-editing announcement, and the Firefly prompt-edit docs cover the mechanics. The weirdly useful detail in the screenshot is that Edit already surfaces partner-model controls and a visible per-generation credit cost.
Adobe's pitch is simple: describe a change, point to the part of the image you mean, and keep the rest intact. For Firefly users, that is the closest thing yet to local prompting instead of full-canvas roulette.
In Adobe's March announcement, the company said it was adding AI editing tools to Firefly Image Editor and described AI Markup in Photoshop web as a way to draw on the image and control exactly where changes happen. A follow-up reaction post boiled that down to "Photoshop + prompting," which is basically why this rollout stands out.
The strongest demo is not the lens flare clip. It is the label-based edit flow.
The workflow in that demo is three steps: mark the region you want changed, attach a prompt to that markup, and generate.
That lines up with Adobe's AI Markup help coverage for Photoshop web, which lists both AI Assistant and AI Markup in the current web workflow, and with Adobe's AI Markup announcement, which says users can mark an area and attach a prompt to it.
The important shift is control granularity. That commentary post says "no more hoping the AI gets the whole image right," and the demo earns the line.
Adobe did not hide the interface logic. The Edit screen is already exposing the tool stack in plain sight.
From that screenshot, the current Edit view includes prompt-based editing alongside the familiar Generative Fill, Remove, Expand, and Upscale tools, plus a partner-model control and a per-generation credit readout.
Adobe's Firefly help page for editing with text prompts confirms that prompt-based image editing sits alongside Generative Fill, Remove, Expand, and Upscale in the web app. Adobe's partner-model documentation also says Firefly now includes non-Adobe models across image workflows, which helps explain the partner-model control visible in the screenshot.
This rollout is live, but not uniform.
According to the primary launch post, Precision Flow beta and AI Markup were rolling out on April 7 and should land globally over the next few days. The access screenshot points users to firefly.adobe.com/generate/image?view=edit, with the Edit tab highlighted, which is the cleanest evidence of where the feature is surfacing.
The other concrete detail is cost. That same screenshot shows Generate consuming 5 credits per edit, and Adobe's generative-credits documentation says Firefly and other Creative Cloud generative features are metered through Adobe's credit system. That makes this more of an editing-control story than a freeform-toy story.
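For teams budgeting production edits, the per-generation cost makes the math trivial. A minimal sketch, assuming only the 5-credits-per-edit figure from the screenshot; the monthly allotment here is a hypothetical placeholder you would swap for your own plan's number:

```python
# Rough credit-budget math for Firefly's Edit workflow.
# Grounded detail: the screenshot shows 5 credits per generation.
# Assumption: MONTHLY_CREDITS is a made-up placeholder, not a real plan tier.

CREDITS_PER_EDIT = 5      # per-generation cost shown in the Edit UI
MONTHLY_CREDITS = 1000    # hypothetical plan allotment

def edits_available(monthly_credits: int, cost_per_edit: int = CREDITS_PER_EDIT) -> int:
    """How many prompt edits a credit balance covers."""
    return monthly_credits // cost_per_edit

def credits_needed(num_edits: int, cost_per_edit: int = CREDITS_PER_EDIT) -> int:
    """Credits consumed by a planned batch of edits."""
    return num_edits * cost_per_edit

print(edits_available(MONTHLY_CREDITS))  # 200 edits on the hypothetical plan
print(credits_needed(30))                # a 30-edit batch costs 150 credits
```

The point of the arithmetic: because each regional edit is metered the same as a full generation, iterating on one region ten times costs the same as ten full-canvas regenerations, so the win is control, not credits.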