News Stories

ETC at the 2020 NAB Show

The 2020 NAB Show runs April 18 – 22 in Las Vegas, and ETC continues its participation in the annual convention. This year Erik Weaver, ETC’s director of special projects, is producing four sessions across three days (April 20 – 22) for the Tomorrow’s Tech track. Panels and presentations will include MovieLabs’ 2030 Vision, virtual beings, using AI for content, VFX and API standards, and next-generation media storage.

Use code EP06 for a FREE* Exhibits Pass or to save $100 off the NAB Show Conference.
*All free Exhibits Pass offers expire April 5, 2020.
Register here.

Speakers added to the program (subject to change)

Monday, April 20

2030 Visions and Actions (Studios)
10:40am – 11:30am

MovieLabs laid out a 2030 Vision for the future of content creation and how emerging technologies will change our current processes. With this panel we will explore how the 2030 Vision can be applied not just to AAA movies, but to productions of all budgets, including episodic, reality, documentary and even local productions. Our panelists will explain what the Vision means for broadcast, cable and OTT production in a cloud-based world.

Host: Mark Turner (MovieLabs)


2030 Vision
11:20am – 12:00pm

Key industry players’ perspectives on, and steps toward, the 2030 Vision. This panel will break down the fundamental components from a key vendor/cloud perspective and allow panelists to give feedback on the 2030 Vision, the steps the community needs to align on, and the actions they are taking to better align their products with the vision.

Tuesday, April 21

Virtual Beings
1:30pm – 1:50pm

This presentation will focus on the rapid evolution of virtual beings, including multi-modal and multi-disciplinary approaches to persistent characters who adapt and live outside a narrative. These trends will likely be the future of how we play, learn, connect and help each other.

Michael Koperwas, Mixed Reality Supervisor  (ILMxLAB)


Meet Your Digital Twin
1:50pm – 2:05pm

A look into the future of your digital likeness and the physical legacy that (you think) you are leaving behind.  With AI and ML, humans are becoming virtual and robots are becoming feeling humanoids.  We’re currently sitting in the gap years.  Imagine the near future where your digital twin has agency over your physical self.  Your twin has more “street cred” than your physical self.  Sending your twin to events because your physical self can’t make it will be socially OK.  Your likeness & legacy are fully captured, and the uniqueness that makes up you (your walk, your laugh, your dance moves) might all be used soon – with or without your consent – in the free marketplace.  This is a talk about AR/VR/MR/XR, spatial computing, IoT, the Metaverse, creativity, ownership and looking at your own self through the lens of volumetric captures, immersive media, digital DNA and the land rush to (re)create humans.

Kathleen Cohen, Experience Strategist (The Collaboratorium)


Bringing Characters to Life: Digital and Virtual Humans
2:05pm – 2:20pm

In this talk we explore what it takes to create digital and virtual humans and how they are used. We will also discuss the potential impacts of this technology on traditional filmmaking as well as on new media outlets such as YouTube and Instagram. We will touch upon everything from volumetric capture to uses of AI to the uncanny valley and beyond.

John Canning, Executive Producer of New M&E (Digital Domain)

Virtual Beings Panel
2:20pm – 2:50pm

We will explore virtual beings and the Hollywood that will be, including how the technology changes key talent and our presence in the digital world as the gateway to the next generation of virtual beings opens.

John Canning, Executive Producer of New M&E (Digital Domain)
Kathleen Cohen, Experience Strategist (The Collaboratorium)
Michael Koperwas, Mixed Reality Supervisor  (ILMxLAB)
Moderator: Phil Lelyveld, Director, Immersive Media Experience (Entertainment Technology Center@USC)

Using AI to Create a Machine Language for Content: Unveiling Vid2Vec
3:20pm – 4:00pm

The media industry’s current ability to extract rich metadata from unstructured audio and video content still lags far behind the tools available in the lab. For the past 12 months, the Entertainment Technology Center has been developing a breakthrough application, named Vid2Vec, which leverages cutting-edge machine learning methods to extract dozens of new metadata attributes from video and represent them semantically as a “language” both humans and machines can understand. Vid2Vec has potentially disruptive applications in many areas, including digital asset management, storage, and content recommendations.

Yves Bergquist, Director, AI & Neuroscience (Entertainment Technology Center@USC)

Standards for VFX & APIs
4:00pm – 4:40pm

Working groups talk through how VFX standards will evolve, including how APIs will be defined moving forward.

Seth Levenson, Director, Adaptive Production (Entertainment Technology Center@USC)


Wednesday, April 22

Digital Storage for Next Generation Media
9:00am – 10:20am

Higher resolution, higher frame rate and higher bits per pixel as well as multi-camera projects are driving the growth of digital storage capacity and performance in media workflows. This session will explore the trends in video storage capacity, performance and latency requirements and look at how new storage architectures including NVMe-oF, persistent memories, storage accelerators and other technologies will enable 4K and even 8K media data centers in the near future.

Tom Coughlin, President (Coughlin Associates)

USC Student Panel Addresses Media Questions at ETC’s December All Members Meeting

A panel of six USC undergraduates from the Iovine & Young Academy and Dornsife College of LA&S spent an hour answering questions about their media habits from ETC member company executives at the December 12, 2019 All Members Meeting, held at Disney in Burbank. Where do they get their recommendations? What’s a good length for a viewing experience? What do they think about having their personal data gathered, and about data analytics in general? What do they pay to subscribe to? Watch this 9-minute highlight video to find out.


Spring 2020 Immersive Media Challenge Final Submissions, 4 Winners Picked

The Entertainment Technology Center at USC’s Spring 2020 semester Immersive Media Challenge kicked off on January 29 for students and recent graduates from various disciplines. ETC’s Immersive Media Experience director Phil Lelyveld called on submitters to ideate an immersive experience that, while not possible or easily built now, should be possible to build in 3-5 years. New to this third ETC Immersive Media Challenge was the question: how will 5G impact the experience?

Students detailed what technological advancements need to take place to make their ideas work and explained the reasoning behind their assumptions that those advancements will happen.

Participants were asked to answer six questions and create a 3-minute pitch video for their idea.

A panel of senior executives from the ETC member companies reviewed and ranked all of the submissions based on how well they accomplished the criteria of “a great idea, well told.”

The four winners are (in no particular order): Audio-Visual Animation by Jack Vomacka, Mind Palace by Sam Clempson, Affa Streaming AI Experience by Courtney Zhang, and MIC (ML for In-game Content) by Lance Newby.

Here are all of the Spring 2020 submissions.

  • Audio-Visual Animation by Jack Vomacka – Audiences of theme park attractions featuring animatronics must be kept at a sufficient distance so they do not get the chance to notice the concealed machinery. With Audio-Visual Animation, computer-generated characters will be free to move in a real-life space, no longer tethered to machinery. 6 Answers | Pitch Video.
  • Telepresent Rooms by Laura Roed – Telepresent Rooms are a network of highly immersive 3D video viewing rooms, in which users can experience live events from miles away, engage in two-way 3D video calls, and share the moment across continents through the combination of LED panel screens, immersive audio, and gesture recognition. 6 Answers | Pitch Video.
  • CleanEarth by Vivika Kapoor – This will be a game in which users can look at their phone to see their surroundings with added litter, pollution, etc. through augmented reality, showing them what their surroundings would look like in the future if we, as a global community, don’t change our harmful habits. However, if the user performs certain tasks in the real world that help the environment, they earn in-game points and some trash or pollution will be removed from their app’s “world”. 6 Answers | Pitch Video.
  • Mind Palace by Sam Clempson – Mind Palace brings memory-improvement tricks into the third dimension by creating rooms for placing memory cues in space and establishing links among the memories. Mind Palace integrates a web browser, a cloud file storage system, and more into visualized augmented reality. These spaces are interactive: objects in this augmented world are tangible and malleable layers in virtual space. 6 Answers | Pitch Video.
  • Affa Streaming AI Experience by Courtney Zhang – Scroll No More: AI Affectiva Recommending Content Through Speech Emotion Detection. Affa uses a blend of personal data analytics, content metadata analysis, and emotional detection tools to make content recommendations based on your current mood. 6 Answers | Pitch Video.
  • MIC (ML for In-game Content) by Lance Newby – MIC will make use of the latest in machine learning (ML) to turn your voice and normal conversations with NPCs into a natural, interactive and immersive tool for making decisions, and changing how the storyline plays out, in real time, within the games we love and play. It turns non-playable characters into minor characters who participate in and can influence the action. 6 Answers | Pitch Video.
  • Legends – Meet Your Heroes by Sean Sani – My idea is to utilize advanced projector technology with some AR elements to put your heroes in front of you performing their greatest accomplishments, all in a lifelike setting where everything feels like the day it happened. Legends will put you in the room where it happens. 6 Answers | Pitch Video.

Industry Events

SXSW – CANCELED (Coronavirus)
March 13-22, 2020
Austin, TX

March 30 – April 2, 2020
Las Vegas, NV

NAB Show –
April 18-22, 2020
Las Vegas, NV

Digital Hollywood Spring –
May 19-21, 2020
Los Angeles, CA

Cine Gear Expo –
June 4-7, 2020
Hollywood, CA

July 19-23, 2020
Washington, DC

IBC Conference & Exhibition – 
September 11-15, 2020
RAI, Amsterdam, Netherlands

SMPTE Technical Conference – 
October 19-22, 2020
Los Angeles, CA