News Stories

ETC Releases Three of Four Sections of “Fathead” Virtual Production White Paper

The Entertainment Technology Center@USC has released its case study entitled “Fathead: Virtual Production & Beyond.” Section 1 of the four-part white paper, “Cloud Computing: Growth Without Bounds,” highlights innovative work on a production done entirely in the cloud except for the on-set shoot. Section 2, “Sound Mitigation: Performance Matters,” features compelling interviews with “Fathead” co-producer Brandyn Johnson and former Sony Pictures executive Eric Rigney, and addresses “the challenges of recording clean dialogue on LED volumetric stages and in-camera visual effects (ICVFX) during production.” Section 3, “State of the Industry: Beyond Trends,” discusses “where we’re at, and where we’re going” and features interviews with thought leaders from companies including The Third Floor, Stargate Studios, Orbital Virtual Studios, Vū Technologies, Lux Machina, nDisplay, Epic Games and Unity Technologies.

Click here to access Section 1, “Cloud Computing: Growth Without Bounds”; here to access Section 2, “Sound Mitigation: Performance Matters”; and here to access Section 3, “State of the Industry: Beyond Trends.” We’ll post an announcement when the remaining section becomes available.

vETC at SIGGRAPH 2023: New Worlds, Adventures into Computer Vision Problems for Creating 3D Worlds

ETC is hosting its 8th vETC virtual conference at this year’s SIGGRAPH 2023 in Los Angeles, CA, on August 8th, 9th, and 10th.

vETC highlights significant presentations of emerging technologies and their impact on the M&E industry. This event’s speakers will explore how procedural generation, machine learning, and other tools affect computer vision problems. We will explore NeRFs and other remarkable new tools that help simplify building 3D worlds, along with many other cutting-edge technologies. The sessions will be recorded and posted on ETC’s YouTube channel.

Support with key resources came from HP, NVIDIA, FUSE Technical Group, and ROE, with supporting companies Sony, Blackmagic Design, The Studio-B&H, Cintegral, Wacom, Cooke, SmallHD, Mo-Sys, ARRI, and AWS.

ROE is the LED wall partner. The system was designed and built by Fuse Technical Group, with Mo-Sys providing the tracking system. All additional supporting gear was donated by The Studio-B&H.

Led by Erik Weaver, director of ETC’s Adaptive and Virtual Production project, vETC (a virtual conference) is a concept ETC implemented in 2015 to address the overflow of speakers and topics at conferences.

The three-day program, speakers, and bios are below.

vETC at SIGGRAPH 2023 - Tuesday, Aug 8th (Day 1)

Time | Session | Company(ies) | Speakers
10:30 am - 11:00 am | "Cuebric: Generative AI Comes To Hollywood" (Forbes) | Cuebric | Pinar Seyhan Demirdag

Cuebric is putting the full power of generative AI into the hands of the world's most visionary filmmakers and content creators. By rapidly constructing immersive 2.5D environments and optimizing the creative process, filmmakers, virtual production stages, art departments, and production designers can now focus on their vision and storytelling, leaving the heavy lifting to Cuebric's AI.
11:00 am - 12:00 pm | ICVFX - Quickstart by Epic | Epic | Kevin Miller, Sean Spitzer

Join us as Epic Games Instructors Sean Spitzer and Kevin Miller explore the tools and foundations for getting started with ICVFX over two sessions.

The first session takes you through the process of getting your projects onto the LED Stage through nDisplay and Switchboard. This second session will explore world building and optimization of your project, along with possible camera and tracking solutions.
12:00 pm - 12:30 pm | SMPTE ST 2110 and PTP in Virtual Production | NVIDIA | Thomas True

The integration of ST 2110 and PTP into Unreal Engine is enhancing the deployment of virtual production stages. Using ST 2110 to communicate the inner-view frustum between render nodes reduces camera-to-display latency, while using ST 2110 video essence streams for capture and display removes the complexities of traditional SDI, HDMI, or DisplayPort connections.
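As background for the session above: ST 2110 devices stay in sync by disciplining their clocks with PTP (IEEE 1588), and the clock-offset arithmetic PTP uses is simple enough to sketch. The function below is a generic illustration of that math under the usual symmetric-path assumption, not NVIDIA's or Unreal Engine's implementation.

```python
def ptp_offset_and_delay(t1, t2, t3, t4):
    """Compute a PTP follower's clock offset and the mean network path delay.

    t1: leader timestamp when the Sync message is sent
    t2: follower timestamp when the Sync message arrives
    t3: follower timestamp when the Delay_Req message is sent
    t4: leader timestamp when the Delay_Req message arrives
    Assumes the network path is symmetric, as basic PTP does.
    """
    offset = ((t2 - t1) - (t4 - t3)) / 2
    mean_path_delay = ((t2 - t1) + (t4 - t3)) / 2
    return offset, mean_path_delay

# Example in microseconds: follower clock runs 5 us ahead of the leader,
# with 10 us of network delay in each direction.
offset, delay = ptp_offset_and_delay(0, 15, 25, 30)
print(offset, delay)  # 5.0 10.0
```

Once each node knows its offset it can steer its local clock, so every render node's frustum slice and every ST 2110 essence stream share a common timebase.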
12:30 pm - 1:00 pm | NeRF and disguise | disguise | Carin Mazaira
2:00 pm - 2:30 pm | Luma AI | Luma AI | Matt Tancik

This talk will describe how Neural Radiance Fields work, what they can be used for, and how we are building tools at Luma to make this technology more generally accessible.
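For readers new to the topic ahead of this session: the rendering step at the heart of a NeRF is a simple alpha-compositing rule applied to density and color samples along each camera ray. The sketch below is a generic illustration of that rule, not Luma's code.

```python
import math

def composite_ray(sigmas, colors, deltas):
    """Alpha-composite samples along one camera ray, as in NeRF volume rendering.

    sigmas: per-sample volume densities (predicted by the network in a real NeRF)
    colors: per-sample RGB triples (likewise network outputs)
    deltas: distances between consecutive samples along the ray
    """
    pixel = [0.0, 0.0, 0.0]
    transmittance = 1.0  # fraction of light not yet absorbed along the ray
    for sigma, color, delta in zip(sigmas, colors, deltas):
        alpha = 1.0 - math.exp(-sigma * delta)  # opacity of this ray segment
        weight = transmittance * alpha          # this sample's contribution
        for i in range(3):
            pixel[i] += weight * color[i]
        transmittance *= 1.0 - alpha
    return pixel

# A single dense red sample renders an (almost fully) red pixel.
print(composite_ray([50.0], [[1.0, 0.0, 0.0]], [1.0]))
```

In a full NeRF, a network is queried for sigma and color at each sample and this weighted sum is differentiated end to end during training; capture tools like those discussed in the talk hide that machinery behind a capture-and-render workflow.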
2:30 pm - 3:00 pm | Character Design - Using AI Tools to Design Your Character | Unreal | Koina Freeman

Using low-cost AI tools in a professional animation pipeline.
3:00 pm - 3:30 pm | Production Transformed: Leveraging NeRFs, GANs, and Advanced AI Techniques | Modlabs | Alex Porter

Across the production workflow there are opportunities to improve creative tasks that are highly repetitive and just plain boring. Creatives hate these tasks. By leveraging technology you can drop the constraints of render engines and linear production pipelines, freeing the creative flow with cutting-edge technologies like NeRFs, GANs, and other AI-powered techniques.
3:30 pm - 4:00 pm | NeRF: From the Lab to the Stage | Volinga | Fernando Rivas-Manzaneque, Orlando Garcia

This talk will introduce you to NeRF technology and its use in Media & Entertainment. We will give an overview of the current state of NeRFs, what they are capable of, and what their limitations are. We will also look to the future, exploring the most exciting ongoing research. Finally, this talk will give you the tools you need to start using NeRFs, how to capture them, and where to use them, using live demos.
4:00 pm - 4:30 pm | End-to-End Virtual Production | Final Pixel | Michael McKenna

Virtual Production is a way. A method. It’s our current definition for the mesh between live-action filmmaking and real-time technology. It’s not a stage. Not a volume. Not one company. Not one pipeline. It’s best realized when considered end-to-end, from concept to Final Pixel.

For SIGGRAPH only: see exclusive behind-the-scenes access to Final Pixel’s groundbreaking film for Oracle Red Bull Racing Formula 1, and understand why the future of Virtual Production is Production.
4:30 pm - 5:00 pm | Overcoming the Challenges of Hand Capture in Previs, Production and Post | StretchSense | Ander Bergstrom

Hands are an essential part of body language and how we interact and engage with one another. Yet because of occlusion, labor-intensive hand clean-up workflows, and bulky, unreliable technology, hand capture has been a challenge traditional mocap technologies have struggled to address. Learn how content creators can use a new type of hand mocap technology in previs, production, and post to create more expressive and authentic characters.
5:00 pm - 5:30 pm | Brompton Technology Presentation | Brompton Technology | Daniel Warner
5:30 pm - 6:00 pm | We Promised Cake: Producing the Virtual Production Package | Cheese Toasty LLC, FUSE | Missy Pawneshing, Henrique ("Koby") Kobylko

This talk will break down the details and actions required to execute an LED stage at a major conference.

vETC at SIGGRAPH 2023 - Wednesday, Aug 9th (Day 2)

Time | Session | Company(ies) | Speakers
10:30 am - 11:00 am | NeRFs for Virtual Production | NVIDIA | Jason Schugardt
11:00 am - 12:00 pm | ICVFX - Quickstart by Epic | Epic | Kevin Miller, Sean Spitzer

Join us as Epic Games Instructors Sean Spitzer and Kevin Miller explore the tools and foundations for getting started with ICVFX over two sessions.

The first session will take you through the process of getting your projects onto the LED Stage through nDisplay and Switchboard. The second session will explore world building and optimization of your project, along with possible camera and tracking solutions.
12:00 pm - 1:00 pm | Color & Color Rendering in an LED Volume | ROE | Tucker Downs
1:00 pm - 1:30 pm | Animation - Using AI Tools to Animate Your Character | Unreal | Koina Freeman
1:30 pm - 2:00 pm | How GANs Have Revolutionized Content Creation in Various Domains Such As Virtual Reality and VP | Technicolor, The Mill | Mariana Acosta, Jocelyn Birsch

GANs have the ability to generate realistic content, textures, and characters, and have the potential for real-time rendering and procedural generation, making them valuable tools for enhancing the content creation process in virtual production and virtual reality workflows.
2:00 pm - 2:30 pm | NVIDIA AI - Creation Through Imagination | NVIDIA | Rick Grandy

Artists and producers are looking to accelerate content creation by employing AI in their creative pipelines. From illustration to 3D animation, concept design to final frame, creators are finding advanced tools that accelerate their workflows and open up exciting possibilities. In this session you will learn about NVIDIA’s platform for visual computing and the ecosystem it enables. We will touch on the hardware and software at the core, then focus on the tools and innovations they power.
2:30 pm - 3:00 pm | Arcturus | Arcturus | Ewan Johnson
3:00 pm - 3:30 pm | Unlocking Efficiency: Harnessing Distributed Processing with AI Systems | Modlabs | Alex Porter

Asset generation has become a never-ending task, slowing down deadlines, killing creativity, and reducing the overall vision of the creative field. The evolution of AI-powered systems for distributed processing is making repetitive, boring tasks a thing of the past, massively increasing speed and production capacity while maintaining quality.
3:30 pm - 4:00 pm | Vizrt - Interview with a Partner | Vizrt, HP | Mark Gederman, Barbara Marshall

Barbara Marshall and Mark Gederman in conversation about virtual production, AI, and the intersection of the two.
4:00 pm - 4:30 pm | Mocopi - Using Low-Cost Machine Learning to Visualize Animation | Unreal, Sony | Koina Freeman, Thaisa Yamamura
4:30 pm - 5:00 pm | Foundry: An Interview | Foundry | Matt Mazerolle (Foundry), Dan Caffrey (Foundry), Barbara Marshall (HP)
5:00 pm - 5:30 pm | Rebelle - The Painting Software Used In Spider-Verse | Escape Motions, Wacom | Peter Blaskovic
5:30 pm - 6:00 pm | Next Gen DIT Carts for Virtual Production | Wacom, Cintegral | Arvind Arumbakkam, Dane Brehm

vETC at SIGGRAPH 2023 - Thursday, Aug 10th (Day 3)

Time | Session | Company(ies) | Speakers
10:30 am - 11:00 am | Future of Animation Workflows & Women in Animation | Dancing Atoms | Vani Balagam
11:00 am - 11:30 am | Tools for Mapping Virtual Worlds | Orbis Tabula | Rob Sloan

Creative artists in various industries are all working to build virtual worlds for storytelling. What cutting-edge tools exist to help create these worlds faster, and with more artistic collaboration? Rob Sloan of Orbis Tabula will explain some tools and pipelines available today and in the very near future.

11:30 am - 12:00 pm | NeRF: From the Lab to the Stage | Volinga | Fernando Rivas-Manzaneque, Orlando Avila-Garcia

This talk will introduce you to NeRF technology and its use in Media & Entertainment. We will give an overview of the current state of NeRFs, what they are capable of, and what their limitations are. We will also look to the future, exploring the most exciting ongoing research. Finally, this talk will give you the tools you need to start using NeRFs, how to capture them, and where to use them, using live demos.
12:00 pm - 12:30 pm | A Talk | The Mill | Tim Kafka

When HP approached The Mill with their idea for collaborating on a stunt, Tim Kafka had no idea what he was getting into. This presentation will cover the creation of The HP Z8 Fury Drop project and the technology that made it possible.
12:30 pm - 1:00 pm | Beyond the Markers: Invisible - Markerless Motion Capture | disguise, Move.ai | Ely Stacy, Niall Hendry

Join disguise and Move.AI as we dive into the innovative applications and groundbreaking technology behind 'Invisible', a new markerless motion capture solution. Discover how this cutting-edge solution frees performers from cumbersome markers, using advanced computer vision and AI algorithms to capture truly authentic human movement.
1:00 pm - 1:30 pm | Emotional Intelligence and Sentience in AI Avatars: The Future of Digital Interaction | Alejandro Franceschi

Discover the breakthrough in human-computer interfaces! Learn about the technology that enables these AI avatars to express emotions, the challenges in creating believable emotional responses, and the solutions developed. Understand how emotion synthesis enhances digital communications. Identify the potential in current and future applications as they transform myriad industries while generating trillions in revenues.
2:00 pm - 2:30 pm | AI for Creatives | c. Craig Patterson

Artists and producers are looking to accelerate content creation by employing AI in their creative pipelines. From illustration to 3D animation, concept design to final frame, creators are finding advanced tools that accelerate their workflows and open up exciting possibilities. In this session you will learn about NVIDIA’s platform for visual computing and the ecosystem it enables. We will touch on the hardware and software at the core, then focus on the tools and innovations they power.
2:30 pm - 3:00 pm | World Building - Using AI Tools to Build Your Scene | Unreal | Koina Freeman

As an Epic Games Animation and Virtual Production Fellow, Koina hopes to leverage new technologies and filmmaking to engage the future of storytelling.
3:00 pm - 4:00 pm | The Art of Quality Control: Ensuring Excellence in Real-Time Content (Panel) | Modlabs | Alex Porter

Mundane tasks like optimization, continuity, and consistency of content, once relegated to the few capable people, can now be automated and delegated to AI systems. This allows for a more creative and iterative approach to design and development, ensuring the highest quality and fastest delivery available.

The SPEAKERS - vETC at SIGGRAPH 2023



Mariana Acosta
SVP, Global Virtual Production and On-Set Services
Technicolor
Originally from Mexico City, Mariana has called Los Angeles home for 14 years. She is a technologist, connector, entrepreneur, and veteran pixel pusher; one of Variety magazine’s top 10 innovators; a finalist in the Women Startup Challenge in AI; and a software creative and business development leader.

During her years working at Foundry, she focused on VFX & XR post-production workflows and was the Head of Product Specialists. Most recently she co-founded a Technology company focused on developing tools for Virtual Production. Before her switch to tech, she worked as an on-set VFX supervisor & senior compositor and has over 16 years of experience in the motion picture industry. She is now the SVP of Global VP and On-Set services for Technicolor Creative Studios.

Arvind Arumbakkam
Creative Technologist
Wacom
Arvind is a creative technologist at Wacom focusing on emerging Media & Entertainment workflows. Arvind and his team work with innovators in the industry to push the boundaries of creative tools – with a focus on harmonizing the intersection of technology and artist experience.

Orlando Avila-Garcia
Head of AI
ARQUIMEA Research Center
Dr. Orlando Avila García is the head of AI at Arquimea Research Center and co-founder at Volinga. He obtained his PhD at University of Hertfordshire, UK. With over 15 years of experience, he has contributed to numerous research and innovation projects across diverse domains, including banking systems engineering, online advertising, online fraud prevention, customer service, and robotics. In 2020, Dr. Avila García assumed leadership of the AI research team at Arquimea Research Center, where his primary focus lies in the cutting-edge field of Neural Radiance Fields (NeRF). He co-founded Volinga AI in 2023, aiming to bring the NeRF research to the market and drive its practical applications.

Saraswathi "Vani" Balgam Buyyala
CEO
Dancing Atoms
Vani's passion for filmmaking and animation began at age 8 when her father set up an animation studio showcasing Indian stories to the world in 1985. Today, she carries on her father’s artistic vision through her studio, Dancing Atoms LLC. She is a self-taught filmmaker who started working in Bollywood films at just 16.

Ander Bergstrom
Business Development Manager
StretchSense
Ander is a seasoned veteran of the games and feature film industry with over 20 years of experience in motion capture, VR, and animation. He has worked on numerous AAA games, including Call of Duty, Grand Theft Auto, The Elder Scrolls, and animated feature films such as Happy Feet and Moana. He also owned and operated a motion capture studio in Seattle for six years.

Ander is passionate about using technology to create realistic and immersive experiences. He is particularly interested in the challenge of motion-capturing hands, which are complex and expressive systems. He is now the Head of Sales for StretchSense, a leading provider of hand-tracking technology.

In addition to his technical skills, Ander is also a gifted storyteller. He has a keen eye for detail and a strong understanding of human movement. This makes him a valuable asset to any team working on a motion capture project.


Peter Blaskovic
Founder & CEO
Escape Motions
Peter Blaskovic is the founder and CEO of Escape Motions. He earned his master's degree at the Faculty of Architecture, STU in Bratislava. His long-standing passion—connecting art with the world of code—started in the mid-90s on 16-bit machines, when Peter was coding visual demos for Amiga computers. In 2003 Peter joined Cauldron, a game development company, where he worked as a lead animator on Activision game projects. His website of interactive artistic applets attracted millions of people and motivated him to establish his studio, Escape Motions, in 2012. The company and the software it develops have received numerous renowned awards and nominations and are used by acclaimed studios including Sony Pictures Imageworks, Walt Disney, ILM, and Lucasfilm.

Dane Brehm
Production Technologist and DIT
Cintegral Technologies
Dane Brehm, a Production Technologist & Digital Imaging Technician, has a long-standing desire to stay at the forefront of creativity, technology, and workflow. His direct approach, sharing personality, and consideration for everyone on set and in post allow for faster incorporation of new workflow techniques and solutions.

Dane balances his time between working on set as an ICG DIT for ASC DPs and running Cintegral.tech, a Production Technology + Workflow company that focuses on introducing and implementing high-performance enterprise NVMe SSD, metadata capture, and RAID systems. Cintegral has supported over 120 productions since 2016 to meet modern 4K/6K/8K studio and streaming content demands for large-format, multi-camera, VFX-intensive cinematography. He has forged deep partnerships with OEMs to develop tools and software for the M&E space.

Dane is an ASC Technical Instructor, SMPTE and Local 600 member, as well as a participant in the ASC-MITC Advanced Data Management working group. He is a founding member of the DIT super group “DIT-WIT,” which comprises over 300 global members to discuss, share, develop, and execute the creative and technical needs of present and future productions.

Pinar Seyhan Demirdag
Co-Founder, AI Director
Cuebric
Pinar Seyhan Demirdag is an A.I. director, multidisciplinary creator, visionary, and an outspoken advocate for the conscious use of technology.

Pinar and her partner Gary Koepke co-founded Seyhan Lee, the world's first studio devoted to generative A.I., in 2020. The firm has become the hub for the entertainment industry’s smooth transition into generative A.I. Seyhan Lee develops human-guided, practical, and thought-provoking creative A.I. solutions for the film, broadcast, and entertainment industries. At the core of the company's mission is developing and perfecting the conscious use of technology for the immersive new world, and every one of their projects aims to expand mindfulness and possibilities for the meaningful elevation of humanity through technology.

Tucker Downs
Color Scientist
ROE Visual
Tucker Downs is a Color Scientist and Research and Development Manager for ROE Visual. He’s been involved in multiple generations of LED technology ranging from multi-primary theatrical lighting and now the first generation of multi-primary LED video products coming to the market. Previously he worked for Megapixel VR and VER to develop color pipelines and calibration technology for LED processing. At ROE, his focus is on guiding the company and the industry towards a state of high-fidelity video for lighting purposes, in camera visual effects, and the viewer experience.


Michal Fapso
Lead Developer
Escape Motions
Michal Fapso is the lead developer at Escape Motions. He earned his Ph.D. in speech processing from Brno University of Technology while working with the Speech@FIT research group. Michal is passionate about C++, Linux, debugging, forest protection, running, and spending his free time with his wife, Hanka, and their three sons.

Alejandro Franceschi
Producer, Director, Technologist


Alejandro Franceschi is an Emmy-awarded producer, director, and technologist with two decades of experience in diverse fields such as startups, education, government, TV, studio features, and Fortune 500 tech companies. As a Lumiere-awarded storyteller, Alejandro thrives in the realm of cutting-edge technologies like AR and VR, demonstrating expertise in fostering creative partnerships with agencies, vendors, and project management while leading and motivating teams. They have established themselves as a trusted representative for brands, products, design, and vision, delivering high-quality narrative content that achieves measurable business goals promptly and within budget, in charming and unexpected ways. For a comprehensive work history, please visit their LinkedIn profile.

Koina Freeman
Unreal Authorized Training Partner
After her first film premiered at the Sundance Film Festival, Koina spent several years as a director for FOX America’s Most Wanted. Later, she sold a TV movie to Fox, and joined the faculty at Laney College where she founded the XR Lab, and currently serves as the Virtual Production Faculty Lead.

Koina holds a Masters in Technical Design for Theater, has won a few film and design awards, apprenticed with Disney’s Imagineering Live Entertainment, and has mentored Tribeca filmmakers for Epic Games on how to use the Unreal Engine for Pre-vis, and Virtual Production.

When she is not teaching Unreal to filmmakers, or leading the A.I. in Virtual Production working group for the Entertainment Technology Center, Koina consults industry stakeholders on using A.I. tools to support creative pipelines.

As an Epic Games Animation and Virtual Production Fellow, Koina hopes to leverage new technologies and filmmaking to engage the future of storytelling.

Rick Grandy
Principal Solutions Architect
NVIDIA
Rick Grandy is a Principal Solutions Architect at NVIDIA focused on graphics and machine learning for the Gaming, Advertising and Media & Entertainment industries. Rick has nearly 3 decades of experience in visual effects and animation with industry leaders Industrial Light and Magic, Sony Pictures Imageworks, Digital Domain, Bad Robot, and others. His work initially centered on the development of digital characters and animation tools. Over time it grew to cover the VFX pipeline end-to-end including real-time previsualization, asset management, post-production, and ingestion/delivery systems. At NVIDIA, Rick has spent the past 5 years combining his experience with the latest in production technology and advanced research to help our customers build the solutions necessary to perform magic on the screen.

Niall Hendry
VP, Partnerships
Move.ai
Niall Hendry is VP of Partnerships at Move.ai. Niall is a specialist in deploying & productising AI-powered real-time and post production technologies for Media, Entertainment & Sport.

Tim Kafka
Head, Creative Operations
The Mill
Tim Kafka is the Head of Creative Operations for the Mill in LA. In this role, he helps guide the Mill’s long term strategies for workflows and technology. He has a passion for improving the tools and methods that VFX Artists use to enable them to push the limits of what’s possible on an advertising timeline. And his goal is to give artists the technology they need to do their best work while maintaining their quality of life.

After graduating from USC, Tim Kafka started his post production career working in music and sound and held a variety of industry roles before finding a home at MPC for 11 years. While at MPC, he discovered his passion for CG and developed a reputation for photoreal lookdev and lighting. As a CG Supervisor, he worked with many notable directors and brands and is a VES award nominee. He served as MPC’s Head of 3D before transitioning to the Mill in 2022.

Henrique "Koby" Kobylko
Director, OSVP
Fuse Technical Group

Henrique Kobylko, also known as Koby, is the Director of On-Set Virtual Production for Fuse Technical Group. With a deep passion for virtual production, Koby specializes in developing efficient on-set workflows utilizing cutting-edge technology and digital tools to bring other creators' visions to life. His expertise in in-camera VFX and LED volumes makes him a valuable asset in guiding and assisting other creators. Koby recognizes the importance of a dedicated team in creating great products and strives to contribute to the growth and advancement of the art of virtual production in the film industry.

Michael McKenna
CEO
Final Pixel
As CEO at Final Pixel, Michael holds the vision of how Virtual Production will revolutionise filmmaking. On productions he leads as Director of Virtual Production, overseeing the VAD and OSVP for Film, TV and Advertising clients. He is a leader, an innovator and a disruptor, operating at a unique intersection in technology, creativity and business to produce world class content.

Kevin Miller
Instructor / Tech Artist
Epic
Kevin Miller is an Epic Games Instructor with over two decades of production experience in the media & entertainment, creative development, graphic design, and education fields.
Kevin is primarily responsible for the Unreal Engine client training offerings in Sequencer and shares responsibility for the ICVFX, Animation, Rendering, and Broadcast curriculum with several other Epic Games Instructors.

c. Craig Patterson
Writer and Filmmaker
c. Craig is a filmmaker whose work has received distinctions from The Blacklist, Sundance Screenwriters Lab, Sundance Episodic Lab, Hillman Grad’s Rising Voices, the Nicholl Fellowship, and the Emerging Filmmaker Showcase at Cannes Film Festival’s American Pavilion. His latest film, FATHEAD, is the recipient of the Innovation in Technology grant from the University of Southern California (USC), and an NAACP Image Award nomination for Outstanding Short Form (Live-Action). c. Craig has also produced projects for Carnegie Hall, directed Paramount+’s critically acclaimed one-hour comedy special for Roy Wood Jr, ‘Imperfect Messenger,’ and is a past recipient of the George Lucas Family Scholarship.

Alex Porter
CEO
Mod Tech Labs
Alex Porter is the CEO of Mod Tech Labs, maker of AI-powered automated software that makes it easy to play back high-quality content faster. Starting with a background in physical and digital design while earning a degree in Interior Design and Construction Technology, she has focused over the last decade on powering visual experiences with emerging technologies, including AI, XR, IoT, and beyond. She is a Forbes NEXT 1000 Honoree, an Intel Top Innovator, and part of NVIDIA Inception. She is an advocate for technologists and continues to work with groups including CTA, Real-time Society, SMPTE, and others to shape the industry and educate on the benefits of technology, from mentorship to standards.

Tim Porter
CTO
Mod Tech Labs
Timothy Porter is CTO of Mod Tech Labs, leading the company in creating groundbreaking AI-powered technology to usher in the future of digital content. After graduating from Full Sail University, Tim began his career as an animation artist and, over two decades, expanded into more highly technical developer roles at leading game and production studios. He worked as a Pipeline Technical Director at Sony Pictures Imageworks and a Technical Artist at Two Bit Circus, Gameloft, and GameHouse, using his deep knowledge of technology and entertainment paradigms to automate slow technical tasks in production. He has been recognized for technology leadership, innovation, and more by organizations including CTA, Intel, and NVIDIA, and has led the way for institutions, lawmakers, and technologists to help standardize technology practices across industries and applications.

Fernando Rivas-Manzaneque
Co-Founder
Volinga
Fernando Rivas-Manzaneque, a co-founder of Volinga, has worked for three years at the intersection of neural rendering and filmmaking. With a Bachelor's and Master's degree in electrical engineering, he has dedicated the last few years to conducting extensive research on Neural Radiance Fields (NeRF) at Arquimea Research Center. His contributions have been recognized by renowned conferences such as CVPR, where he has published his work.

In addition to his research endeavors, Fernando is currently pursuing a PhD, further solidifying his expertise in the field. Recently, he established Volinga.ai with the vision of applying the outcomes of his research to the media and entertainment industry, ensuring that his findings have a tangible impact on the field. At Volinga, he led the developments for Unreal Engine and RenderStream plugins for NeRF, being the first company to bring NeRF into Virtual Production.

Jason Schugardt
Solutions Architect
NVIDIA
Jason Schugardt is a Solutions Architect for NVIDIA, helping customers connect to the best solutions at NVIDIA. With over 25 years of experience in the visual effects industry, he has worked on numerous blockbuster films, including the Academy Award-winning The Curious Case of Benjamin Button, Disney's The Sorcerer's Apprentice, Peter Jackson's King Kong, Sam Raimi's Spider-Man 2, and Steven Spielberg's Minority Report. Driven by his passion for exploring cutting-edge technology, he has led innovative VR and AR projects, pushing the boundaries of technology in storytelling. At NVIDIA, Jason enjoys leveraging his deep expertise to help clients harness the power of generative AI and Neural Radiance Fields (NeRFs).


Rob Sloan
CEO / CTO
Orbis Tabula
Rob Sloan is the CEO of Orbis Tabula, a business and technical R&D consultancy. At Orbis, Rob works with clients to render physical and imagined locations as adaptable virtual worlds for Media & Entertainment projects. Rob also has over a decade of experience teaching advanced technical filmmaking concepts as a Graduate Professor at Full Sail University. As both an established filmmaker and lecturer, Rob has evangelized innovative technologies like digital cinema workflows, ICVFX Virtual Production, and Generative AI for many years.

Sean Spitzer
Sr Instructor / Master Mentor Artist for VP
Epic
Sean Spitzer is one of Unreal Enterprise's Senior Instructor/Master Mentor Artists for VP at Epic Games. Sean has been in the games and media industry for 26 years, working in console, PC, and social gaming verticals as well as commercial TV. He has an extensive background in education, teaching arts, animation, and game art and design topics at both the Academy of Art University and The Art Institute of California. He also recently finished lighting and shader work on Vaulted Halls Entombed, a short in Season 3 of Netflix's Love, Death & Robots.

Ely Stacy
Technical Solutions Lead
disguise
Ely Stacy has been a skilled member of disguise's Technical Solutions team for nearly three years, focusing on Virtual Production and XR stage integration, operator training, and overcoming complex challenges in tech solutions. In a rapidly evolving technology and entertainment landscape, Ely's excitement lies in pushing boundaries to deliver bigger and better experiences.

Matt Tancik
PhD Graduate
Luma AI
Matt Tancik is a PhD graduate from UC Berkeley, where he was advised by Ren Ng and Angjoo Kanazawa. He received his bachelor's degree in CS and physics from MIT, and a master's degree in imaging under the advisement of Ramesh Raskar, also at MIT. His research lies at the intersection of machine learning and graphics. He helped develop Neural Radiance Fields and is now working at Luma AI with the goal of making these technologies widely accessible.

Thomas True
Sr. Applied Engineer
NVIDIA
Thomas True is a Senior Applied Engineer for Professional Video and Image Processing in NVIDIA's Enterprise Solutions Group. In this role he works at the intersection of video I/O and the GPU, focusing on the integration of GPU and networking technologies into broadcast, video, and film applications ranging from pre-visualization to post production and live to air. Prior to joining NVIDIA, Tom was an Applications Engineer at SGI. Thomas has an M.S. degree in Computer Science from the Graphics Lab at Brown University and a B.S. degree from the Rochester Institute of Technology.


Erik Weaver
Director, Adaptive & Virtual Production
ETC
Erik Weaver is a specialist focused on the intersection of cloud and the Media and Entertainment industry. He currently directs ETC’s Adaptive and Virtual Production project that encompasses many aspects of the cloud, including transport, security, metadata, long-term storage, and formation of an agnostic framework that unites key vendors and studios.

Weaver has fostered many accomplishments in the M&E market, including serving as executive producer on several ETC R&D short film projects: "The Suitcase," which utilized HDR, cloud-based workflows, and live 360, debuted at the 2017 Tribeca Film Festival and earned the CAA People's Choice Award; the Emmy-nominated "Wonder Buffalo," which employed volumetric capture, photogrammetry, ambisonic sound, and interactivity, and screened in competition at the SXSW Interactive Film Festival; "Ripple Effect" and its white paper, which focused on ICVFX and was highlighted by the Visual Effects Society; and most recently "Fathead," nominated for an NAACP Image Award.

Weaver drives the ICVFX programming for Infinity Festival, co-chairs the SMPTE RIS on in-camera VFX, serves on the ASC Technology Council, and is co-chair of the ETC Virtual Production Working Group. He previously chaired NAB's "Next Generation Media Technologies" Conference from 2014 to 2017, and initiated and chaired vNAB from 2015 to 2019 (later rebranded as the ongoing vETC).

Specification for Naming VFX Image Sequences Released

ETC’s VFX Working Group has published a specification of best practices for naming image sequences such as plates and comps. File naming is an essential tool for organizing the multitude of frames that are the inputs and outputs of the VFX process. Prior to the publication of this specification, each organization had its own naming scheme, requiring custom processes for each partner, which often resulted in confusion and miscommunication.

The new ETC@USC specification focuses primarily on sequences of individual images. The initial use case was VFX plates, typically delivered as OpenEXR or DPX files. However, the team soon realized that the same naming conventions can apply to virtually any image sequence. Consequently, the specification was written to handle a wide array of assets and use cases.
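As a rough illustration of why a shared convention matters, sequence names of this kind can be validated and parsed mechanically. The pattern below is hypothetical, not the ETC@USC specification itself; the field names, delimiters, and frame padding are assumptions for the sketch:

```python
import re

# Hypothetical naming pattern for illustration only -- the actual
# ETC@USC specification defines its own fields and delimiters.
# Example filename: "show01_sh0100_plate_v002.1001.exr"
SEQ_PATTERN = re.compile(
    r"^(?P<show>[a-z0-9]+)_"      # production identifier
    r"(?P<shot>sh\d{4})_"         # shot code
    r"(?P<element>[a-z]+)_"       # element type, e.g. plate or comp
    r"v(?P<version>\d{3})\."      # zero-padded version
    r"(?P<frame>\d{4})\."         # zero-padded frame number
    r"(?P<ext>exr|dpx)$"          # OpenEXR or DPX, per the use case above
)

def parse_name(filename):
    """Return the named fields as a dict if the filename conforms, else None."""
    match = SEQ_PATTERN.match(filename)
    return match.groupdict() if match else None
```

With every partner emitting names that a single pattern like this can parse, tools can sort, group, and verify delivered frames automatically instead of maintaining per-vendor logic.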

To ensure all requirements were represented, the working group included over two dozen participants representing studios, VFX houses, tool creators, creatives, and others. The ETC@USC also worked closely with MovieLabs to ensure that the specification could be integrated as part of their 2030 Vision.

A key design criterion for this specification is compatibility with existing practices. Chair of the VFX working group, Horst Sarubin of Universal Pictures, said: “Our studio is committed to being at the forefront of designing best industry practices to modernize and simplify workflows, and we believe this white paper succeeded in building a new foundation for tools to transfer files in the most efficient manner.”

This specification is compatible with other initiatives such as the Visual Effects Society (VES) Transfer Specifications. “We wanted to make it as seamless as possible for everyone to adopt this specification,” said working group co-chair and ETC@USC’s Erik Weaver. “To ensure all perspectives were represented we created a team of industry experts familiar with the handling of these materials and collaborated with a number of industry groups.”

“Collaboration between MovieLabs and important industry groups like the ETC is critical to implementing the 2030 Vision,” said Craig Seidel, SVP of MovieLabs. “This specification is a key step in defining the foundations for better software-defined workflows. We look forward to continued partnership with the ETC on implementing other critical elements of the 2030 Vision.”

The specification is available online for anyone to use.
