News Stories

ETC’s Synthetic Media Summit Charts AI Playbook

ETC@USC hosted its second Synthetic Media Summit on March 6, convening media and technology leaders to identify AI applications and chart a detailed playbook for an AI-augmented future for human creativity. ETC partnered with Carnegie Mellon University, United Talent Agency, and Machine Cinema to deliver this timely, compelling event. While the Summit was closed to the press and presented off the record, we had an opportunity to speak with Yves Bergquist, director of AI in Media at ETC and the event’s organizer, about what’s next for artificial intelligence.

Speakers represented a wide range of companies including Adobe, BBC, Carnegie Mellon, Coca-Cola, Disney, Dolby, FOX, Google, LEGO, Mattel, Meta, Moonvalley, OpenAI, Unilever, Universal, UTA, and WPP.

ETC: What can you tell us about the need for ETC’s second Synthetic Media Summit? What were your goals for the event?

Yves: We’re three years after our initial SMS, which happened at the very beginning of the GenAI “irruption” into media, so our goal was to have a very pragmatic and in-depth conversation – what’s working, what’s not working, use cases, case studies, examples, etc. We need to move the conversation beyond opinions and predictions into figuring out an actual hands-on “playbook” for the tech in media. In the past few years, many industries, including our own, have been very reactive to the high velocity of model releases and capabilities. This event is about taking our future back. It’s about laying out a vision for what we want the future to be, and laying out a very concrete playbook to get there.

ETC: The Summit featured participation from a wide range of companies and organizations. What were your criteria for speaker selection? Who was invited to attend?

Yves: We look for intellectual, cultural, and organizational diversity for sure. But I like to put together two categories of speakers on the same panel: on one side, the kind of very senior industry leaders we work with on a daily basis at ETC, and on the other, people – often creatives – who may not have the same seniority, but are doing extremely aggressive and forward-looking experimentation. I find that bringing these two worlds together (the 40,000-foot view and the 1-foot view) makes for the very best conversation. Senior executives need to be guarded in their views, which we understand. But putting them in a direct conversation with very hands-on practitioners, which they actually love to do, brings them out of their shell and makes for a fascinating exchange. We’re lucky at ETC to be at the crossroads of virtually every meaningful conversation or experiment with AI in the industry, and we had great partners such as Machine Cinema and UTA to connect us with even more people, so we had the greatest lineup we ever had for any such event.

ETC: Did you identify any key takeaways from this year’s event that may be worth sharing with a larger audience?

Yves: We’re going to put out a white paper that summarizes the conversations into a concrete “AI playbook” for the media industry, but I can lift the veil on two things that struck me across the board. One, GenAI, especially in animation, unleashes unprecedented velocity, which – combined with AI focused on “sensing” audiences and market opportunities in near-real time – would allow the industry to truly “move at the speed of culture,” as Secret Level CEO Eric Shamlin says. The second is that, far from job displacement, there’s considerable growth in job opportunities for traditional creatives with a deep practice of AI, which virtually anybody can get. Together, these two takeaways gave the whole day a positive energy. It’s clear there’s growing excitement in our community.

ETC: Did the speakers and attendees identify any potential applications for an AI-augmented future for media, entertainment, storytelling, or other creative endeavors?

Yves: I can’t get into much detail here, given that the conversations were off the record, but overall I think we’re about to see processes that take weeks take minutes, and that compression in time is going to unleash a creative energy that we’ve never seen in the industry. The future is going to be insanely creative, so if you love the art of what we do, the next 50 years are going to be very, very exciting.

ETC: Did the Summit identify any areas of concern when it comes to AI?

Yves: Surprisingly, we didn’t hear much about areas of concern, which of course doesn’t mean they don’t exist. But with such a hard focus on implementation, what we heard instead were many examples of bottlenecks, such as the dearth of professionals with both deep VFX skills and deep AI skills, organizations being too slow to change, etc. We didn’t instruct anybody to “stay positive,” but we curated the event around the “here and now,” and there’s already a ton of very useful research and commentary about AI ethics. We didn’t feel like we wanted to be redundant.

ETC: Did the Summit identify any emerging trends that are most relevant to M&E businesses?

Yves: So again, here it’s hard to be specific, considering the confidential nature of conversations, but media veteran Doug Shapiro opened the conference with a very sobering view of the media market, and how much linear media has declined as a share of people’s day. And this was the day’s focus, really: How do we win the war for attention? And I have to say, after hearing about what is going on right now in many corners of the industry, I’m definitely bullish about our future. I think if we can all be humble in realizing that we need to learn the new rules of the industry, and slay certain sacred cows, we can create massive economic and job opportunities, including and especially in the LA area, which everyone feels very strongly about.

ETC: As ETC’s director of AI in Media, what can you tell us about the organization’s ongoing analysis of artificial intelligence?

Yves: AI has gone from just one project to essentially seeping into every area of our work. Erik Weaver’s industry-leading work with ETC on virtual and generative production, for example, is now 99 percent focused on AI methods and tools. This wasn’t the case two years ago. And we have many new initiatives up our sleeve, including an “agent studio” where the industry can experiment with AI agents, which are emerging as the new “application layer” for AI.

ETC: Does ETC have plans for another Synthetic Media Summit? What’s next for your AI interest groups?

Yves: Organizing the Summit has taken immense effort over the past five months, so we’re focused on scraping ourselves off the floor before thinking about the next one. However, the event was such a success – with higher attendance than the last one – and we have received so much positive feedback, that it’s clear we’re going to have another one. We just don’t know when. In the meantime, ETC plans to ramp up the velocity of its virtual, member-only “AI Roundtables,” which are “mini-SMS” events (one hour, off the record).

This Week in Artificial Intelligence: 4/2/26

Filmmaker Coerte Voorhees and the estate of Val Kilmer are bringing the popular actor back to cinema with generative AI. ‘South Park’ creators reveal plans for their boutique firm Deep Voodoo and why they believe AI could help launch a new era of creativity in Hollywood. And as AI continues to ramp up across most industries, Johns Hopkins University historians Angus Burgin and Louis Hyman reflect on what we’ve learned from earlier tech revolutions. These are a few of the notable stories that recently caught our attention.

Val Kilmer Resurrected by AI to Star in ‘As Deep as the Grave’ Movie: First Look
Variety

Is Trey Parker and Matt Stone’s Deep Voodoo the Rare Company Doing AI Right?
The Hollywood Reporter

Q&A: What History Can Teach Us About AI
Johns Hopkins University

How Disney Imagineers Are Using AI and Robotics to Reshape the Company’s Theme Parks
Fast Company

Hollywood Reframes AI as Infrastructure, Not Replacement
Axios

Kathleen Kennedy Just Told an AI Conference She’s Not So Sure About AI
The Hollywood Reporter

This Is What Honest AI Conversations Sound Like in Hollywood
IndieWire

CNN Veteran Laurie Segall Wants to Take a ‘Mostly Human’ Approach to Covering AI and Big Tech
The Hollywood Reporter

Artificial Intelligencer: OpenAI’s $852 Billion Problem: Finding Focus
Reuters

I Wrote a Novel Using AI. Writers Must Accept Artificial Intelligence – But We Are as Valuable as Ever
The Guardian

AI Could Change the World. But First It Is Changing Silicon Valley.
The New York Times

AI Companies Shatter Fund-Raising Records, as Boom Accelerates
The New York Times

AI Models Lie, Cheat, and Steal to Protect Other Models from Being Deleted
Wired (subscription required)


Specification for Naming VFX Image Sequences Released

ETC’s VFX Working Group has published a specification of best practices for naming image sequences such as plates and comps. File naming is an essential tool for organizing the multitude of frames that are inputs and outputs of the VFX process. Prior to the publication of this specification, each organization had its own naming scheme, requiring custom processes for each partner, which often resulted in confusion and miscommunication.

The new ETC@USC specification focuses primarily on sequences of individual images. The initial use case was VFX plates, typically delivered as OpenEXR or DPX files. However, the team soon realized that the same naming conventions can apply to virtually any image sequence. Consequently, the specification was written to handle a wide array of assets and use cases.
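The specification’s exact naming rules aren’t reproduced here, but the general idea of structured sequence naming can be sketched with a hypothetical pattern such as `<shot>_<element>_v<version>.<frame>.<ext>` (the field names and format below are illustrative assumptions, not the published convention):

```python
import re

# Hypothetical pattern illustrating structured VFX sequence names:
#   <shot>_<element>_v<version>.<frame>.<ext>
# The actual ETC@USC specification defines its own fields and rules.
SEQUENCE_NAME = re.compile(
    r"^(?P<shot>[A-Za-z0-9]+)"      # e.g. sh010
    r"_(?P<element>[A-Za-z0-9]+)"   # e.g. plate, comp
    r"_v(?P<version>\d{3})"         # zero-padded version, e.g. v001
    r"\.(?P<frame>\d{4,})"          # frame number, e.g. 1001
    r"\.(?P<ext>exr|dpx)$"          # common VFX image formats
)

def parse_sequence_name(filename: str):
    """Return the parsed fields of a sequence filename, or None if it
    doesn't match the (hypothetical) convention."""
    match = SEQUENCE_NAME.match(filename)
    return match.groupdict() if match else None

print(parse_sequence_name("sh010_plate_v001.1001.exr"))
# → {'shot': 'sh010', 'element': 'plate', 'version': '001',
#    'frame': '1001', 'ext': 'exr'}
print(parse_sequence_name("random_file.txt"))
# → None
```

A machine-parseable convention like this is what lets every partner in a pipeline validate and route incoming frames automatically instead of maintaining per-vendor custom logic.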

To ensure all requirements were represented, the working group included over two dozen participants representing studios, VFX houses, tool creators, creatives, and others. ETC@USC also worked closely with MovieLabs to ensure that the specification could be integrated as part of their 2030 Vision.

A key design criterion for this specification is compatibility with existing practices. Horst Sarubin of Universal Pictures, chair of the VFX working group, said: “Our studio is committed to being at the forefront of designing best industry practices to modernize and simplify workflows, and we believe this white paper succeeded in building a new foundation for tools to transfer files in the most efficient manner.”

This specification is compatible with other initiatives such as the Visual Effects Society (VES) Transfer Specifications. “We wanted to make it as seamless as possible for everyone to adopt this specification,” said working group co-chair and ETC@USC’s Erik Weaver. “To ensure all perspectives were represented we created a team of industry experts familiar with the handling of these materials and collaborated with a number of industry groups.”

“Collaboration between MovieLabs and important industry groups like the ETC is critical to implementing the 2030 Vision,” said Craig Seidel, SVP of MovieLabs. “This specification is a key step in defining the foundations for better software-defined workflows. We look forward to continued partnership with the ETC on implementing other critical elements of the 2030 Vision.”

The specification is available online for anyone to use.
