Howdy, partner!
Welcome to the wildest ride in the wilderness. No, not Big Thunder Mountain Railroad, and not The Oregon Trail (more on that soon). I’m talking about the current creative reality, one fueled by Adobe Firefly, now fully integrated into everything Adobe has to offer.
The Oregon Trail (the classic computer game, not the real trek) is not unlike some technology conferences: difficult to master, full of unknown terrain, requiring preparation with food (and coffee and water and more coffee) and smart packing, and, of course, ending with a great story.

It feels light-years away from the days of air-conditioned portables full of computers, where an afternoon meant wagon travels on the trail through software. Back in elementary school, I could never have predicted that we would be in a world of AI agents, where I can essentially (once I get off the waitlist) create a transition with a prompt in the new Firefly Video Editor.
Adobe MAX keynotes take place in the Peacock Theater next to the Los Angeles Convention Center, home to the Emmys and the American Music Awards, as David Wadhwani, President of Digital Media Business at Adobe, reminded me. It is also the place where I learned about “conversational experiences powered by AI agents,” now implemented across the Adobe world, a world away from the Oregon Trail.
During the Day 1 morning keynote at Adobe MAX, I glanced at my watch. Twenty-four minutes in, I realized I wasn’t sure if I had seen a human in the footage I was watching. I couldn’t tell what was AI-generated, what was purely human-made, and what was somewhere in between. And what popped into my head was a true millennial moment influenced by good old car radio songs: “Where have all the humans gone,” sung to the tune of Paula Cole’s “Where Have All the Cowboys Gone?”

It is Adobe’s hope that AI, once described as the wild west, is now a catalyst to help you create the conditions of your own personal and creative “wild west.” We harness our creative energy (get on our horses), create something from nothing (a desert oasis), tap into our pioneering skills (exploring), make a mark so people know who we are years from now (document our journey), get the lay of the land (a map), and mitigate our threats (famously, dysentery). With AI as a “partner,” the hope is that we can move along our own trail faster, with the tools we already have. But it does require a full embrace of the tool.
The spaghetti western, that iconic run of 1960s and 1970s films set in the Old West and led by Italian directors, has a distinct look, poster style, and sound. And I was curious: what might the soundtrack of the west at Adobe MAX sound like?
Lucky for me, Generate Soundtrack in Adobe Firefly could help me create a fitting score. With a few photo uploads and a prompt, I was off and running with my own commercially safe Western score. I created a quick slideshow of my photos from MAX and prompted Generate Soundtrack with “A Spaghetti Western song with Cowboy music style for a conference.” Here’s how it did:
Yee haw!
And of course, I also needed Firefly to help create a spaghetti western poster, which is the featured image of this article. When you visit Firefly to create an image, Google Gemini Nano Banana is the default image model. In this case, it was by far the best model for the job, as it successfully handled the text I requested.

But is this town big enough for the both of us, humans and AI? Picture it: the classic stand-off in a Western film between two adversaries. Will they ever be able to coexist in the same place? Or is there a world where we can become partners?
AI is certainly here to help tame the wild wild west of our creative lives. Where AI excels the most, and where creators get the most excited (as determined by the very scientific metric of applause during a curiously mild response to the keynote), is when it functions as a tool to help streamline our processes. Efficiency is lassoed.
Take the wild wild west of thousands of images. It’s a chore that wedding photographers know all too well: the time-consuming process of identifying the best shots from way too many photos. But now, in Lightroom, AI-assisted Culling can quickly help find photos that are in focus or in which your subject’s eyes are open. A demo of the AI Assistant in Photoshop, used to rename layers, received the most applause of all the keynote announcements. And if you happened to open your camera in the dusty west and got some specks on your sensor, Dust is now available as a Distraction Removal option in Lightroom. That demonstration also received applause, prompting a “Keep in mind you just clapped for AI. Love that” from Terry White, Principal Director, Creative Cloud Evangelist, and Community Advocate at Adobe. The layer-renaming moment was the largest burst of applause in a rather subdued opening session (one presenter even called out the “one clap” for Premiere for iPhone).

With the implementation of Topaz Labs technology, Generative Upscale allows users to quickly adjust and improve the quality of an image. Paul Trani, Principal Director Evangelist at Adobe, showcased the new feature with a task that any of us might be asked to do: fix and upscale a (hilarious) family portrait. Alongside the laughter was a deep understanding of what this can mean for you and your family: a deeply human moment, a symbol of your bonds. Your gang, if you will.
At the heart of all of the AI experimentation and implementation has to be a human. We can tame and ride our wild wild west from the human side, solve problems alongside AI, and combine that with real, authentic, genuine moments. And with a little perception, we can still find the humans and where they have gone. They are not hiding; they are along the trail at MAX.

Emotions continue to find their way into the braided threads of Adobe MAX. Sometimes, walking into a loud arena playing awesome music makes me emotional (it happens at MAX; it also happened at Lambeau). This year, my primary lens was not a modern autofocus one, but a vintage Carl Zeiss Flektogon from the 1970s. I love being a little bit different. I love the individual character that vintage lenses bring; uniqueness is a quirk of humanity.
Walking through the Creative Park, your ears would have perked up at the laughter from the ball pits, or the delight from the puppies… or at least I assume delight at the puppies: they were at nap time. Damn! Also, I can take that cuss out now in Premiere with a new feature, and can even swap it out for a duck quack. As Dacia Sainz, Sr. Quality Engineer and Animation Explorer at Adobe, said during the first Premiere reveals: “buckle up, boos!”

Problem-solving is at the core of the AI exploration, but it is also at the core of the companies on the floor at the Creative Park. LucidLink, which solved the “what can I do besides downloading all this footage, opening it up, and relinking it all” problem, has now also solved the “what if I want to use Frame.io and go straight from there to my timeline without downloading” problem. Coming in 2026 is a Frame.io integration with LucidLink. As explained by Steven Niedzielski, Solutions Engineer and Creative Workflow-oligist, “it’s not just camera to cloud, it’s camera to timeline now.” And with snapshots, even if you misplace an item, you can go back and find that file in their system. A snapshot creates that little marker in time, the kind we alluded to earlier: “I think of it as like a parallel universe. Like it’s going to show me all these different moments in time that I can go back into.” And with LucidLink, yes, this town is not big enough for the both of us: if I’m currently editing a project, it will lock you out.
Frame.io’s camera-to-cloud integration can now also be found in the new C50, displayed at the Canon booth at MAX. With a 7K open-gate sensor that can simultaneously capture a cropped vertical file, editable right away in Adobe Premiere for iPhone, the impressively miniature unit is ready for our human hands.

And the humans, of course, are found here too at MAX. They can be found warmed by the Los Angeles sun, beating down on me as I snuck a view of the predictably slow traffic on the 110 freeway (giddy up!). They can be found in the deeply intentional 2025 Cannes Palme d’Or winner “It Was Just an Accident,” cut on Adobe Premiere and shared through a special screening for the press. The viewing, held just across the street at the Regal, still felt like a subdued and secretive world away from MAX, complete with a fitting soundtrack in the Regal lobby marketplace: the Tears for Fears version of “Mad World.”
Yeah, it might be a Mad World, but it’s our world. And we embrace it with a lasso. This is where the journey from thought to thing has never been faster and has never been more ours. So is the town big enough for the two of us? Or can we partner up, join the gang together, and conquer the wild wild west?
Can AI, in the theme of the wild west, actually be our partner?
As I listened to “The Human-Centric Future of AI-Powered Brands” on the Creative Park Discovery Stage, Brian Yamada of VML reminded us that companies creating Agentic AI are “racing to become your partner.” Deepa Subramaniam, Vice President, Product Marketing for Creative Professionals at Adobe, reminded us on stage that AI is here to “complement your skills, not replace them.” Just 24 hours later in the same space, during MAX’s second day on the keynote stage, Lara Balazs, Chief Marketing Officer and EVP of Global Marketing at Adobe, shared that “AI is here to be your partner” in the creative community.
So, where have all the cowboys gone? Maybe we’ve gone to MAX.
Howdy, partner. Let’s go conquer our inevitable wild.
