Can You Algorithmically Engineer Authenticity? On AI, Human Storytelling, and Knowing When to Put the Tools Down
- Osobarra Films

- Feb 17
Updated: Mar 7

Let me tell you what AI did for me last week.
I had four hours of documentary interview footage — raw, rambling, occasionally brilliant, occasionally someone asking where the bathroom was. I needed specific testimony about a single night in 1968. In the old days, that meant a full day in an edit bay, earbuds in, coffee going cold, scrubbing. This time, I ran the transcript through an AI tool and had every relevant quote surfaced, time-coded, and cross-referenced in under twenty minutes.
It was, I'll admit, kind of miraculous.
And yet. The moment I started to let that same tool shape what those quotes meant — to let it decide what the story was — I felt something go wrong. Not catastrophically. Quietly. Like a scene that's technically perfect and emotionally inert. Like a frame in focus that has nothing to say.
That's the tension I want to talk about. Not AI versus filmmaking. I'm past that argument, and frankly, so is the industry. The real question — the one nobody seems to want to sit with very long — is this: can you algorithmically engineer authenticity?
My gut says no. And thirty years in edit bays, on sets, and inside the specific silence that falls over a room when a story lands — that gut has earned a hearing.

The Promise (Which Is Real)
Let's not do the thing where we pretend AI is just a fad or a threat. It isn't. It's a revolution in how creative work gets done, and the people who refuse to engage with it are going to find themselves left behind by the people who engage with it wisely.
For production work, the efficiencies are genuinely staggering. AI tools are streamlining editing workflows, supercharging research and analytics, generating visual pre-production concepts that used to require a full art department, and making it possible for a filmmaker working from a laptop to visualize a scene that once would have demanded a six-figure budget. Alfonso Cuarón — not exactly a slouch in the filmmaking department — has described AI as something that can "free up mental space" for storytelling by taking the repetitive technical work off your plate. That's not a promotional tagline. That's a working filmmaker describing a real shift in how stories get built.
I use AI for research. I use it to break down interview transcripts and find specific quotes fast. I use it to stress-test story structure, brainstorm visual approaches, and push an idea until it breaks so I can see where it's actually weak. These are legitimate, powerful applications that make me a better filmmaker, not a lazier one.
John Lasseter put it cleanly: "The art challenges the technology, and the technology inspires the art." That's the partnership at its best. Two things feeding each other.

The Problem (Which Is Also Real)
Here's where it gets complicated. AI is, at its core, a pattern-recognition engine. It's extraordinarily good at identifying what has worked before — what has resonated, what has trended, what has earned engagement — and then producing more of it. Optimized. Refined. Averaged.
And therein lies the problem with using AI to engineer storytelling. Because the stories that last — the ones that matter, the ones people watch six times and quote to their kids and cry about in dorm rooms at 2 a.m. — those stories almost never look like what worked last time. They're specific. They're strange. They're the product of one person's irreducible, untransferable point of view. They contain the mess.
Researchers have raised a serious flag about what they're calling "aesthetic homogenization" — the very real risk that overreliance on AI-generated content produces a kind of flattening effect, where everything starts to look and feel like everything else because it's all drawing from the same optimized well. An algorithm trained on what's already worked will tend to reproduce what's already worked. This is not a bug. It's a feature. And for storytelling, it's a disaster.
Werner Herzog put something on the table years ago that I keep returning to: "I think the worst that can happen in filmmaking is if you're working with a storyboard. That kills all intuition, all fantasy, all creativity." He wasn't talking about AI. He was talking about over-planning. But the principle holds. When you remove uncertainty from the creative process — when you replace human instinct with optimized recommendation — you don't just change the process. You change what you're capable of making.
The Coca-Cola AI Christmas commercial from 2025 is the cautionary tale nobody should need to have explained twice. A brand with a century of earned emotional resonance handed its most important seasonal content over to a generative AI tool. The result was technically proficient and emotionally hollow. It looked like a Christmas commercial. It felt like someone who had watched a lot of Christmas commercials and decided to make one. As Georgia State filmmaker Strickler put it afterward, it seemed like the brand "might not have fully considered why they chose AI or its implications" — and that "like all art, using AI effectively requires intention, authenticity and alignment with the essence of the work."
Intention. There's the word. AI has no intentions. It has outputs.

The Messiness Problem
I've been working on a documentary built around a single night in 1968. A night that cracked American history open and left a wound that never fully closed. The eyewitness at the center of the film is someone close to me — someone who was in that room, who saw what happened, who gave testimony, and who has been carrying the weight of it for more than half a century.
Only now are they ready to talk.
No AI can reconstruct what it costs to tell that story. No algorithm can identify the exact pause before a certain name gets said, or the particular way someone looks out a window when they get to a certain part, or what it means that fifty-six years have passed before the words were ready to come. These are not data points. They are the story. The mess, the weight, the survival — that's what the audience feels. That's what makes a documentary more than a very expensive Wikipedia entry.
Ryan Coogler said it plainly: "A filmmaker's most important tool is humanity. You want to be able to capture humanity in your stories and bring out humanity in your characters."
Humanity is not something you can prompt-engineer. It accumulates from lived experience, editorial instinct, and the willingness to sit in a room with something uncomfortable long enough to understand it.
Documentary filmmaking specifically runs on audience trust. When Netflix's 2024 film What Jennifer Did used AI-generated imagery to reconstruct events, critics argued it "blurred the line between fact and fiction" in a genre where that line is everything. The audience doesn't just watch a documentary. They make an agreement with it. AI imagery, when undisclosed, breaks that agreement. And once broken, it's very hard to repair.

The Distinction That Matters
Director Joe Russo said recently that AI has the potential to "engineer storytelling" — to allow filmmakers to rapidly iterate narrative ideas or create entire stories from prompts. He called it both exciting and unnerving. I think unnerving is doing a lot of work in that sentence and deserves more than a passing mention.
Because there's a distinction that rarely gets made clearly enough in these conversations, and it's the one that matters: AI as instrument versus AI as author. The former makes you a more efficient, better-resourced storyteller. The latter makes you a curator of someone else's output. Both can look identical from the outside. Only one is actually filmmaking.
The test I keep coming back to is a simple one: where is the authorial judgment? When I use AI to surface quotes from four hours of footage, the judgment — what those quotes mean, how they fit together, what the film is trying to say — that's still mine. If I were to use AI to generate a rough cut based on engagement metrics and predicted audience response, the judgment belongs to the machine. And the machine, however sophisticated, has never been in a room where something real was happening.
James Cameron has warned about an "arms race" in AI adoption — studios competing to leverage new capabilities faster than their rivals, with creativity sacrificed for efficiency. He's right to flag it. The pressure to move fast, produce more, and reduce cost per content hour is real, and it's intensifying. The filmmakers who resist that pressure — not by rejecting AI, but by being ruthlessly intentional about where they use it — are the ones who will still be making things worth watching ten years from now.

What the Partnership Actually Looks Like
I want to be clear, because I've watched this conversation collapse into false binaries too many times: this is not an argument against using AI. It's an argument for knowing exactly what you're using it for.
The strongest framework I've found is this: AI handles what you can measure. Humans handle what you can't.
You can measure how long a viewer watches a scene before reaching for their phone. You cannot measure what it means when they don't reach for it. You can measure which interview clips test well with a target audience. You cannot measure the cost of using the clip where someone looks away instead of the one where they hold the camera's gaze. You can measure keyword density and engagement rates and optimal video length. You cannot measure why a story stays with someone for thirty years.
AI content strategists will tell you that the highest-performing content in 2025 combines AI efficiency with human insight — that when AI sets the foundation and humans supply the truth, content stops feeling manufactured and starts feeling trusted. Google's own algorithms have shifted toward rewarding lived experience, original perspective, and what they call E-E-A-T: Experience, Expertise, Authoritativeness, Trustworthiness. In other words: the search engine is trying to identify human fingerprints on content. The machine is looking for evidence of a person. That's both ironic and instructive.
The filmmakers who will thrive are the ones who use AI as a spectacular research assistant, a tireless first-draft machine, an analytical partner — and then know precisely when to put it down and pick up the thing that no algorithm can replicate: thirty years of instinct, a specific human story, and the willingness to be present in the room when something true is happening.

The Thing AI Can't Do
Nia DaCosta said she wants to "tell good stories in ways that will shine a light on lives rarely seen on screen, because stories can push humanity forward." That's a mission statement, not a prompt. It requires vision, values, and the kind of specific human commitment that doesn't emerge from training data.
AI can optimize a story. It cannot have one.
It can surface the moments. It cannot feel the weight of them.
It can identify that a pause before a name is statistically unusual in documentary interviews. It cannot know what that pause costs someone who has been waiting years to say what comes next.
That's not a limitation to work around. That's the line. Know where it is. Respect it. And then, on the other side of it — use every tool at your disposal to tell the story as well as it can possibly be told.
