Launched this week

Luma Agents
Agents that plan, iterate, + refine w/ full creative context
471 followers
AI agents that plan, generate, and iterate across video, image, and audio in one pipeline. Brand campaigns, product visuals, social ads, and video localization, with shared context end-to-end. For creative teams and agencies.

Luma Agents is a creative agent platform that plans, generates, and iterates across video, image, and audio within a single shared workflow.
I'm hunting this because the gap it's addressing is structural, not just a feature gap: most AI creative tools are isolated generators, not pipelines.
The problem: Creative teams at agencies and studios are stuck stitching outputs together across tools. Every handoff is a restart. Context gets lost. Scale means adding headcount.
The solution: Luma Agents embeds context across every stage of a project, from concept to delivery. Agents see what you see, carry that context through video, image, and audio generation, and iterate without you re-explaining from scratch.
What you can do with it:
🎬 Run a full brand campaign with cohesive visuals and variants across formats
📦 Generate e-commerce product shots (lifestyle, hero, on-model) in one workflow
📱 Produce short-form video ads with hooks, captions, and platform-specific framing
🌍 Localize videos with natural voiceovers and synced visuals across languages
Who it's for: Creative directors, brand teams, and agency producers who are already using AI tools but spending too much time wrangling outputs between them. Also relevant for solo creators who want to produce at team-level volume.
Time will tell whether teams will use this to replace specific tools in the stack or layer it on top of what they already have.
P.S. I hunt the latest and greatest launches in tech, SaaS and AI, follow to be notified → @rohanrecommends
@rohanrecommends Great hunt, Rohan. I've been using many tools to kind of stitch a solution like this one together. Great stuff
@bondig1 Appreciate you sharing that! You’re exactly who this is for, so it’s great to hear it resonates with how you’ve been stitching tools together. Thanks for taking the time to drop by and comment.
The "shared context end-to-end" angle is what makes this genuinely interesting to me.
Most AI creative tools treat each output as a fresh start — image, then video, then copy, all disconnected. The result is creative that looks like a ransom note: technically competent, visually inconsistent.
Building ad-vertly.ai, we obsessed over this same problem from the advertising side. A campaign should have a through-line: same brand voice, same visual DNA, same audience understanding — whether you're running a static banner or a 15-second video. The moment you break context, the creative stops feeling like a brand and starts feeling like a vendor.
The market that'll love this first: performance marketing teams at agencies where speed-to-creative is the bottleneck. The question is whether the iteration loop is tight enough to replace the current "generate, export, feedback, regenerate" cycle that kills time.
Really excited to see where this goes. Congrats on the launch!
@gaurav_singh91 Keeping a single through-line for voice and visuals across every asset is exactly what the Luma team is aiming for with Agents. Thanks for stopping by!
Works best for product teams with repeatable assets, consistent briefs, and clear outputs. I'm not sure this works for service-based businesses. The pipeline thinking is solid either way.
@rajanbuilds Repeatable assets with tight briefs are where this really sings, and I'm equally keen on how far it can stretch for more bespoke service workflows. Thanks for sharing your take here!
The "shared context" piece is what kills every creative workflow I've seen. You finish the video, send it to copy, they produce something totally different in tone, then back to square one. Curious whether the context travels across handoffs between team members or just within a single session?
@dklymentiev You nailed the pain. The vision here is that context should travel with the project, not stay stuck in one step or one person’s head. Thanks!
mostly layering first - teams don't pull out existing tools until the new one handles edge cases reliably. curious what a full replacement workflow looks like.
@mykola_kondratiuk Totally agree, most teams will layer this in until it proves itself on the edge cases. Thanks for sharing your comment. :)
right, it's basically an audition period - every team needs a few edge cases before they stop babysitting it.
@mykola_kondratiuk Haha yes, that “audition period” is real, edge cases are where you really find out if something can be trusted without hand-holding.
@adeolatona That's good feedback. I hope the Luma team is taking notes. :)
Is this release from this company? https://luma.com/
@lenaavramenko Great question, like @sebastian_munteanu mentioned, this one is from the Luma team at lumalabs(dot)ai focused on these creative agents specifically.