News Stories

ETC at the 2020 NAB Show

The 2020 NAB Show runs April 18–22 in Las Vegas, and ETC continues its annual participation in the convention. This year Erik Weaver, ETC’s director of special projects, is producing four sessions across three days (April 20–22) for the Tomorrow’s Tech track. Panels and presentations will cover MovieLabs’ 2030 Vision, virtual beings, using AI for content, VFX and API standards, and next-generation media storage.

Use code EP06 for a FREE* Exhibits Pass or to save $100 off the NAB Show Conference.
*All free Exhibit Pass offers expire April 5, 2020
Register here.


Speakers added to the program (subject to change)

Monday, April 20

2030 Visions and Actions (Studios)
10:40am – 11:30am

MovieLabs laid out a 2030 Vision for the future of content creation and how emerging technologies will change our current processes. With this panel we will explore how the 2030 Vision can be applied not just to AAA movies, but to productions of all budgets, including episodic, reality, documentary, and even local productions. Our panelists will explain what the Vision means for broadcast, cable, and OTT production in a cloud-based world.

Host: Mark Turner (MovieLabs)

 

2030 Vision
11:20am – 12:00pm

Key industry players’ perspectives on, and steps toward, a 2030 vision. This panel will break down the fundamental components from a key vendor/cloud perspective, allow panelists to give feedback on the 2030 Vision, identify steps the community needs to align on, and highlight actions they are taking to better align their products with the Vision.


Tuesday, April 21

Virtual Beings
1:30pm – 1:50pm

This presentation will focus on the rapid evolution of virtual beings, including multi-modal and multi-disciplinary approaches to persistent characters who adapt and live outside a narrative. These trends will likely be the future of how we play, learn, connect and help each other.

Michael Koperwas, Mixed Reality Supervisor (ILMxLAB)

 

Meet Your Digital Twin
1:50pm – 2:05pm

A look into the future of your digital likeness and the physical legacy that (you think) you are leaving behind.  With AI and ML, humans are becoming virtual and robots are becoming feeling humanoids.  We’re currently sitting in the gap years.  Imagine the near future where your digital twin has agency over your physical self.  Your twin has more “street cred” than your physical self.  Sending your twin to events because your physical self can’t make it will be socially OK.  Your likeness & legacy are fully captured, and the uniqueness that makes up you (your walk, your laugh, your dance moves) might all be used soon – with or without your consent – in the free marketplace.  This is a talk about AR/VR/MR/XR, spatial computing, IoT, the Metaverse, creativity, ownership and looking at your own self through the lens of volumetric captures, immersive media, digital DNA and the land rush to (re)create humans.

Kathleen Cohen, Experience Strategist (The Collaboratorium)

 

Bringing Characters to Life: Digital and Virtual Humans
2:05pm – 2:20pm

In this talk we explore what it takes to create digital and virtual humans and how they are used. We will also discuss the potential impacts of this technology on traditional filmmaking as well as on new media outlets such as YouTube or Instagram. We will touch upon everything from volumetric capture to uses of AI to the uncanny valley and beyond.

John Canning, Executive Producer of New M&E (Digital Domain)


Virtual Beings Panel

2:20pm – 2:50pm

We will explore virtual beings and the Hollywood to come, including how the technology is changing key talent and our presence in the digital world as the next generation of virtual beings evolves.

John Canning, Executive Producer of New M&E (Digital Domain)
Kathleen Cohen, Experience Strategist (The Collaboratorium)
Michael Koperwas, Mixed Reality Supervisor (ILMxLAB)
Moderator: Phil Lelyveld, Director, Immersive Media Experience (Entertainment Technology Center@USC)


Using AI to Create a Machine Language for Content: Unveiling Vid2Vec

3:20pm – 4:00pm

The media industry’s current ability to extract rich metadata from unstructured audio and video content still lags far behind the tools available in the lab. For the past 12 months, the Entertainment Technology Center has been developing a breakthrough application, named Vid2Vec, which leverages cutting-edge machine learning methods to extract dozens of new metadata attributes from video and represent them semantically as a “language” both humans and machines can understand. Vid2Vec has potentially disruptive applications in many areas, including digital asset management, storage, and content recommendations.

Yves Bergquist, Director, AI & Neuroscience (Entertainment Technology Center@USC)


Standards for VFX & APIs

4:00pm – 4:40pm

Working groups will discuss how VFX standards will evolve, including how APIs will be defined moving forward.

Seth Levenson, Director, Adaptive Production (Entertainment Technology Center@USC)

 


Wednesday, April 22

Digital Storage for Next Generation Media
9:00am – 10:20am

Higher resolutions, higher frame rates, and more bits per pixel, as well as multi-camera projects, are driving the growth of digital storage capacity and performance in media workflows. This session will explore trends in video storage capacity, performance, and latency requirements, and look at how new storage architectures including NVMe-oF, persistent memories, storage accelerators, and other technologies will enable 4K and even 8K media data centers in the near future.

Tom Coughlin, President (Coughlin Associates)

USC Student Panel Addresses Media Questions at ETC’s December All Members Meeting

A panel of six USC undergraduates from the Iovine & Young Academy and the Dornsife College of Letters, Arts and Sciences spent an hour answering questions about their media habits asked by ETC member company executives at the December 12, 2019 All Members Meeting held at Disney in Burbank. Where do they get their recommendations? What’s a good length for a viewing experience? What do they think about having their personal data gathered, and about data analytics in general? What do they pay to subscribe to? Watch this 9-minute highlight video to find out.


Spring 2020 Immersive Media Challenge has begun

Think like a futurist!
  • Come up with a concept for an engaging experience that should be buildable in 3-5 years
  • Explain what needs to happen that will make building it possible
  • Articulate why your assumptions are reasonable
 Instructions and details are here.
 
The deadline for your brief first-pass answers to the six questions is midnight Wed., Feb. 5th.

Industry Events

HPA Tech Retreat – www.hpaonline.com
February 17-21, 2020
Rancho Mirage, CA

SXSW – www.sxsw.com
March 13-22, 2020
Austin, TX

CinemaCon – www.cinemacon.com
March 30 – April 2, 2020
Las Vegas, NV

NAB Show – www.nabshow.com
April 18-22, 2020
Las Vegas, NV

Digital Hollywood Spring – www.digitalhollywood.com
May 19-21, 2020
Los Angeles, CA

Cine Gear Expo – www.cinegearexpo.com
June 4-7, 2020
Hollywood, CA

SIGGRAPH – www.s2020.siggraph.org
July 19-23, 2020
Washington, DC

IBC Conference & Exhibition – www.show.ibc.org 
September 11-15, 2020
RAI, Amsterdam, Netherlands

SMPTE Technical Conference – www.smpte.org 
October 19-22, 2020
Los Angeles, CA