The part of post-production I think AI actually helps
Posted by David Bertolami on April 30, 2026 at 4:57 pm
Hey all, long-time CC reader, first time posting here. Editor/producer, 20+ years in production (LA → FL).
Lately I’ve been thinking a lot about the review side of post-production.
Not editing itself, or even AI video generation, but the actual review process.
It feels like there are more edits now than ever. More AI video means more versions, which means more notes, more cooks in the kitchen, and more revision cycles.
And honestly, I don’t think AI really helps much with taste (or instinct, or storytelling overall).
But I do think there’s something useful in the idea of a cut getting a fresh second set of eyes, even if just to sanity-check it, before the next human review.
So I’ve been building an AI tool for editors, producers, and directors that is, quite literally, that second set of eyes. It’s called RoughCut AI.
The basic idea is pretty simple:
- upload a cut
- RCAI watches/listens
- RCAI returns timestamped feedback on pacing, clarity, continuity, etc. (sketched below)
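To make that concrete, here’s roughly the shape of a single note. The field names here are simplified for illustration, not the exact output format:

```json
{
  "timestamp": "00:01:23",
  "category": "pacing",
  "note": "Interview holds on the same framing for ~24s from here; consider a cutaway or B-roll insert to keep momentum.",
  "severity": "suggestion"
}
```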
What I’m paying attention to now isn’t whether people think it’s interesting.
It’s whether editors actually run it more than once. Is it as valuable a tool as I think it is?
If anyone’s working on short-form stuff and wants to try it out, I’m happy to share access. Not selling anything here, no cost involved, just genuinely curious where RoughCut AI fits (or doesn’t) in real workflows.
8 Replies
Eric Santiago
May 1, 2026 at 3:42 pm
I would be interested in trying it on a five-page script we shot last fall.
We haven’t locked down which app to edit in yet; we’re waiting on our friends’ (writer/director) availability.
David Bertolami
May 2, 2026 at 12:15 am
That actually sounds like a great fit for it.
Here’s the signup link if you want to get in early and play with it once you guys settle into the edit:
studio.roughcutai.app (currently optimized for desktop use; we’re working on the mobile version, which isn’t ready yet!)
Once you’re in, DM me and I’ll send over a quick walkthrough/demo video too — helps orient people to the workflow pretty fast.
Devrim Akteke
May 2, 2026 at 3:56 pm
Honestly, as someone who’s always open to technological innovation, I’m not entirely sure what to say about this.

Recently, I made a video for someone I’d worked with before, someone incompetent who only directs for pleasure because he’s rich. Because he couldn’t comment on it himself, he uploaded the video to ChatGPT and sent its comments directly to me, asking for revisions based on them. The revisions were so absurd that I don’t want to write too much about them right now.

But I can also see that the day is approaching when technology will support us in this area as well, provided it matures enough to make accurate comments and demonstrate its value. I hope you have a bright future and continue to develop this technology in the right direction.
David Bertolami
May 5, 2026 at 11:42 am
Appreciate you sharing that. And yeah, that kind of use of general-purpose AI is exactly why we built RoughCut AI in the first place. You hit the nail on the head, Devrim.
I’ve been using ChatGPT for a while now as a second set of eyes for logos, photography, and other non-moving imagery, and it worked surprisingly well there; the feedback was pretty consistent. But when I tried to do the same thing with a short video piece, ChatGPT told me point-blank that it can’t watch videos. So naturally I thought, “why doesn’t this exist?” Since then, ChatGPT has evolved to analyze video, but it’s still general-purpose, unreliable, and not professional.
We need a tool for us, made for us, targeted at us, built by us. That’s RoughCut AI; that’s the whole point.
It’s such an obvious tool, at a time when our industry is changing rapidly, that if I didn’t build it, someone else would. I mean, it just needs to exist. How many editors out there could use a second set of eyes after 14 hours staring at the same timeline?
What we’ve built and tested so far works. Really well, actually. My CTO did an amazing job, and my standards for this level of feedback and notes are high. Is RoughCut AI perfect? Not yet, but we’re well on our way to becoming a reliable AI post-production supervisor/producer/director sitting next to you, collaborating on the edit.
If you ever feel like putting something through it, I’d be curious how it compares to what you’ve seen so far. Feel free to try it out: studio.roughcutai.app (currently, video uploads are capped at 5 minutes; in the interest of transparency, yes, it’s free to try: 50 free credits at signup, good for a few rounds, definitely enough to feel it out. I’ve also put together a short tutorial/demo video I can link you to).
Mads Nybo jørgensen
May 5, 2026 at 8:06 pm
Hey David,
Your idea sounds interesting, and as you suggest from the outset, the A.I. will be able to speed up putting the clips in order (rough-cut). But it still requires the editor to make it flow.
But if what you are looking at is the rough-cut itself, and changing that, then I am not sure what I would achieve by sharing it, unless you have all the footage uploaded so the platform can select better cuts or angles for the client and me to review?
There is no doubt that over the next 5 years we will experience a revolution in video editing and finishing combined with A.I. tools.
We are already seeing Vibe-Coding coming into play inside the NLE, and I suspect that, if Adobe is smart, the “script” feature known as After Effects Expressions will soon be handled by Vibe-Coding too.
Avid has already announced at NAB that, together with Google, they are bringing Generative and Agentic AI into Media Composer. Forget Generative and focus on Agentic inside the editor, and there is something very big happening right there:
https://www.avid.com/press-room/2026/04/avid-and-google-bring-agentic-ai-to-media-production
My money is on local A.I. processing, where the “Agents” are installed and the processing is done locally on my machine, rather than in the cloud using Adobe-style “pay-as-you-go” or similar A.I. tokens from other vendors. That would also reduce the need for A.I. data centers to be built.
With Apple currently running behind the super-fast A.I. train, trying to catch up (they have paid Google $1Bn to help them get going), that market has been cornered by Nvidia (Windows & Linux).
If you can create a plug-in running inside Premiere Pro, offering up different versions of a cut, then that might be interesting, if not fun to watch. If possible, maybe find out if you can “crack open” Premiere Pro’s media-analysis .prmi files and use that info to improve the edit?
This is something that social media and other short-form projects could benefit from right now.
Just an opinion.
All the Best
Mads
David Bertolami
May 5, 2026 at 8:21 pm
Hey Mads, I really appreciate the thoughtful reply.
Just to clarify one key thing, because I may not have explained it clearly:
RoughCut AI is not a “vibe editor”. It doesn’t generate video or edit for you.
It’s “studio-level notes and feedback from AI that watches and listens to your edit.”
The idea is: once you already have a cut, rough or otherwise, it watches and listens to that version and returns timestamped notes on things like pacing, clarity, continuity, etc.
So it’s closer to a structured review pass than to anything happening inside the timeline, the same way a producer or a second set of eyes would work.
I agree with you on where things are heading inside the NLE with agentic tools; that’s a different (and very big) direction. What I’m focused on right now is the part that happens between versions, where the feedback and notes live. Think of it like an AI-powered Frame.io.
I appreciate you taking the time to think through it, and if you ever feel like running a short piece through it, I’d be curious how it compares to your usual review process.
Devrim Akteke
May 6, 2026 at 8:04 am
We should try to shape the future of technology together, as people working in the industry.
I will give it a shot and share my comments. Thanks.
Mads Nybo jørgensen
May 6, 2026 at 12:31 pm
Hey David,
OK, I hear you:
“The idea is: once you already have a cut, rough or otherwise, it watches and listens to that version and returns timestamped notes on things like pacing, clarity, continuity, etc.”
On paper, that sounds great.
In reality, on bigger projects, my clients get a transcript with timecodes from me in Word .doc format, where they can follow along and make notes.
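For anyone who hasn’t worked this way: a typical row pairs a timecode with speaker and dialogue, so notes can reference exact moments. Something like this (timecodes and dialogue invented for illustration):

```
01:00:12:08  INTERVIEWER  So tell me how the project started.
01:00:15:20  J. SMITH     We began shooting last October with a crew of three.
01:00:42:03  J. SMITH     The hardest part, honestly, was the weather.
```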
I’ve just come off a project with a running time of 110 minutes, where most of the edit, from rushes to final master, was a “paper edit”.
The pacing and clarity often become apparent during the edit and are changed on the fly as we go along.
On continuity: there is one former Director General, and a former head of news, at the BBC who might have wished for that.
https://www.bbc.co.uk/news/articles/c874nw4g2zzo
If anyone asks, you heard it here first: if there were a visual tool that could show you where the clips in a timeline came from in the rushes, and flag when two similar-looking and/or similar-sounding cuts might create a potential conflict, because of their content or their placement relative to other content, then you might have a winner.
Although, rather than in the cloud, you’d have to work at local media-server level, and the audio/video data inside an international broadcaster is astronomically large.
But there is the argument for having a “forensic” trace to the origin of any given clip, which production, the compliance producer (before publishing), music licensing, and the legal department would all be happy to have, plus any future DG in the “line of fire”.
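To sketch how that trace could work today: most NLEs can already export a CMX3600 EDL, which records, for every event in a timeline, the source reel and source timecode it was pulled from. A minimal sketch of reading that provenance back out, assuming a standard cuts-only CMX3600 file (the filename is just a placeholder):

```python
import re

# A CMX3600 EDL event line looks like:
# 001  TAPE01  V  C  01:00:10:00 01:00:20:00 00:00:00:00 00:00:10:00
# fields: event no., source reel, track, transition ("C" = cut),
#         source in/out, record (timeline) in/out
EVENT = re.compile(
    r"^(\d{3,})\s+(\S+)\s+(\S+)\s+C\s+"
    r"(\d{2}:\d{2}:\d{2}[:;]\d{2})\s+(\d{2}:\d{2}:\d{2}[:;]\d{2})\s+"
    r"(\d{2}:\d{2}:\d{2}[:;]\d{2})\s+(\d{2}:\d{2}:\d{2}[:;]\d{2})"
)

def trace_edl(path):
    """Print, for every cut in the timeline, where it came from in the rushes."""
    with open(path) as f:
        for line in f:
            m = EVENT.match(line.strip())
            if not m:
                continue  # skips titles, FCM lines, comments, and dissolves
            event, reel, track, src_in, src_out, rec_in, rec_out = m.groups()
            print(f"event {event}: timeline {rec_in}-{rec_out} ({track}) "
                  f"<- reel {reel}, source {src_in}-{src_out}")

trace_edl("sequence.edl")  # placeholder filename
```

A conflict check like the one described above could then flag two events that pull from overlapping or adjacent source ranges on the same reel.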
Hope this helps you find new riches with your tool; please don’t forget me! 😁
Atb
Mads
