ETC will host its 6th vETC virtual conference at this year’s SIGGRAPH 2022 in Vancouver, BC, Canada, August 9th, 10th and 11th.
vETC highlights significant presentations of emerging technologies and their impact on the M&E industry. This year, speakers will present concepts, workflows, business models, case studies, and more as they apply to virtual production and the concept of “fix it in pre”. Many of the companies that worked on ETC’s latest R&D film project, “Fathead”, will discuss their discoveries in sessions that are recorded and posted on ETC’s YouTube channel.
The LED wall will be a 1.9mm pixel-pitch Planar display powered by SilverDraft Supercomputing. The system is built and operated by Voluminous.com with Blackmagic and ARRI, using a 12K URSA camera from Blackmagic and tracking by OptiTrack. All additional supporting gear was donated by B&H and MBS. On-set lighting is assisted by Orbital and ARRI, and local resources are supported by AnnexPro.
Scheduled to present (subject to change) are Sony Pictures Entertainment, Disguise, Planar, NEP Virtual Studios, ICVR, PacketFabric, Technicolor, AWS, Unity, 5thKind, Netflix, Arri, Voluminous, SMPTE, Foundry, Lux Machina Consulting, Intel, Perforce and Move.ai.
Led by Erik Weaver, director of ETC’s Adaptive and Virtual Production project, this vETC is a restart of the virtual conference, a concept implemented by ETC in 2015 to address the overflow of speakers and topics at NAB. The last program before the pandemic, “vETC | The Grand Convergence 2019: Innovation and Integration”, recorded across two days at Technicolor and Google (August 27 and 28), is available online.
The 3-day program, along with speakers, bios and gear, is below.
vETC at SIGGRAPH2022 - Tuesday, Aug 9th (Day 1)
|10:00 am - 10:30 am||Reducing Complexity on Set|
Powering a VP stage with Disguise simplifies technical challenges and puts control back into the hands of storytellers.
An overview of the Disguise ecosystem and real-world use cases.
|10:30 am - 11:00 am||How to Push the Capabilities of Filming on an LED Wall|
In choosing a virtual production solution for filming, there are key factors for success. We'll discuss the advantages and challenges of shooting with an LED wall. At Orbital Virtual Studios, we're expanding what's possible on a soundstage, from controlling the position of the sun and the time of day to bringing any location, real or otherwise, into the studio. Learn more about this powerful production tool from the team advancing it into the future.
|11:00 am - 12:00 pm||LED Volume Design Panel||NEP Virtual Studios||Moderator: Barbara Ford Grant|
|12:00 pm - 1:00 pm||Creation through Automation: How Engineering Empowers Creatives|
We truly believe that engineering empowers creatives and creatives empower engineering to innovate. The majority of these innovations arise either in pursuit of a specific ambitious idea or out of a ridiculously tedious and repetitive process required to achieve necessary results. With pipelines growing more complex and convoluted, this has become our reality, but there are ways to solve it.
With modern production pipelines growing increasingly more complex, we’ll explore some firsthand examples of how specialized tools can revolutionize production and allow creatives to spend more time creating. Learn how engineering and automation can be used to empower creatives, how creatives drive engineers to innovate, and how synergy between these two disciplines leads to better overall products.
|1:00 pm - 1:30 pm||Worlds Collide - Virtual Production in Film, Advertising, Animation & Games |
MPC is well known for big productions such as The Jungle Book and The Lion King; however, it is not common knowledge that Technicolor Creative Studios is a powerhouse of brands, including companies such as The Mill, Technicolor Games and Mikros Animation. This means we are involved in a myriad of different types of projects for advertising, animation and games, using real-time technologies for visualization, virtual location scouting, performance capture, AR set extensions, virtual sets, and more. Virtual production has played a vital role in reinforcing Technicolor’s leading position in the industry as the preferred creative partner for storytellers globally.
|1:30 pm - 2:00 pm||Update on "The Rise of the Microstudio"|
As the world creates more content that needs to be stored for longer periods of time, today’s technologies are struggling to keep up. How would you store a zettabyte of information? Our research advances the state of the art, creating glass and DNA storage technologies that allow content creators to store massive volumes, at low cost, for – quite literally – ever. Join us to hear more about these cutting-edge technologies and the work we are doing today to make for a better content tomorrow.
|AWS||Hasraf ‘HaZ’ Dulull
|2:00 pm - 2:30 pm||Unity Professional Artistry Tools|
During this session, Ron Martin from Unity Technologies will present an overview of the Professional Artistry Tools, including Virtual Production, SpeedTree, Ziva Dynamics, and UDAM. Ron will also outline the Unity Academic Alliance, an educational program for universities and colleges, with a focus on EdLab, the online portal for educators.
|2:30 pm - 3:00 pm||5th Kind: 2D/3D Collaboration|
Hear how film and gaming studios are setting up the foundations for a more integrated 2D/3D pipeline, starting with the VAD, and how metadata can drive the orchestration of AWS services.
|5th Kind||Steve Cronan|
|3:00 pm - 3:30 pm||Final Pixel in the Cloud|
AWS/ETC will discuss the new technology initiatives and challenges we are seeing for VP, how we are doing final pixel today with partners and customers, and how we put together the end-to-end SIGGRAPH demos to illustrate these workflows.
|3:30 pm - 4:00 pm||Jointly Optimizing Color Rendition and In-Camera Backgrounds in an RGB Virtual Production Stage|
While the LED panels used in virtual production systems can display vibrant imagery with a wide color gamut, they produce problematic color shifts when used as lighting due to their peaky spectral output from narrow-band red, green, and blue LEDs. In this work, we present an improved color calibration process for virtual production stages which ameliorates this color rendition problem while also passing through accurate in-camera background colors. We do this by optimizing linear color correction transformations for 1) the LED panel pixels visible in the field of view of the camera, 2) the pixels outside the field of view of the camera illuminating the subjects, and, as a post-process, 3) the pixel values recorded by the camera. The result is that footage shot in an RGB LED panel virtual production stage can exhibit more accurate skin tones and costume colors while still reproducing the desired colors of the in-camera background.
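The per-region linear color-correction transformations described above can be illustrated with a minimal least-squares sketch. This is a hedged illustration only: the patch values below are invented stand-ins for measured color-chart samples, not data from the presentation, and the real calibration also accounts for in-FOV versus out-of-FOV panel pixels.

```python
import numpy as np

# Hypothetical RGB patch colors as recorded through the camera on the LED
# stage (rows are RGB triples). In a real calibration these would come from
# shooting a color chart under the stage lighting.
measured = np.array([
    [0.9, 0.1, 0.1],
    [0.1, 0.8, 0.2],
    [0.2, 0.1, 0.9],
    [0.7, 0.7, 0.1],
    [0.5, 0.5, 0.5],
])

# Reference (ground-truth) colors for the same patches.
reference = np.array([
    [1.0, 0.0, 0.0],
    [0.0, 1.0, 0.0],
    [0.0, 0.0, 1.0],
    [1.0, 1.0, 0.0],
    [0.5, 0.5, 0.5],
])

# Least-squares fit of a 3x3 linear color-correction matrix M such that
# measured @ M approximates reference.
M, residuals, rank, _ = np.linalg.lstsq(measured, reference, rcond=None)

# Apply the correction, e.g. as the post-process step on camera pixel values.
corrected = measured @ M
```

In the workflow described above, analogous transforms would be fit separately for the panel pixels inside and outside the camera's field of view, then a final transform applied to the recorded footage.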
|4:00 pm - 4:30 pm||Connecting Virtual Production to the Cloud|
Virtual production is advancing, and new methods for collaborative workflows and data mobility are shifting in support. The infrastructure for production stages and VFX companies is adapting to allow for secure, high-bandwidth and “invisible” means of bringing optimized tools to sets and creative teams. Learn how cloud services are reflecting the needs of content creators, and how on-demand connectivity is bridging the gap.
|ETC, PacketFabric, AWS||Lisa Gerber
|4:30 pm - 5:00 pm||Challenges of Volume Control |
Virtual production is currently deployed on major film, event and television productions around the world. Learn from the creators and operators themselves about real-life challenges, advantages and workflow recommendations for ongoing virtual production sets.
|ARRI, Voluminous||Cassidy Pearsall, Sarah Thomas Moffat|
|5:30 pm - 6:00 pm||Cinecode: PreVis for TV’s Tightest Schedules, Oddest Challenges & Most Complicated Budgets|
Cinecode, the virtual production division of DigitalFilm Tree (DFT), has served storytellers on Ted Lasso, Dave on FX, The Umbrella Academy, NCIS: Los Angeles and many more. In this presentation, Andrea Aniceto-Chavez, Cinecode’s Lead Producer, will unpack the differences and varied needs across PreVis, TechVis and beyond. From cinematography to stunts, costume to set design, Cinecode by DFT re-imagines the power of PreVis for television schedules and budgets, opening a new world of possibilities to all.
|DigitalFilm Tree||Andrea Aniceto-Chavez|
vETC at SIGGRAPH2022 - Wednesday, Aug 10th (Day 2)
|10:00 am - 10:30 am||Achieving Final Pixel|
Join Final Pixel CEO and co-founder Michael McKenna as he shares how they successfully realise 'Final Pixel' on set by delivering a global real-time Virtual Art Department for clients, including through the new partnership with Hogarth, the WPP global creative content production company. Learn about the consolidated end-to-end pipeline and workflow in pre-production that combines a new, unique creative process with a deep understanding of virtual production stage technology and possibility, all supported by remote workflows underpinned by AWS. Gain insight into how the Final Pixel Academy is inspiring the next generation of talent to support this, including teaching 500 students aged 11-18 how to build their first MetaHuman.
|Final Pixel||Michael McKenna|
|10:30 am - 11:00 am||Mapping Resources for Camera & Lens Metadata|
About a year ago, SMPTE started an industry collaboration around on-set virtual production. Jim will give a brief overview of its progress to date, with a focus on the group he chairs on improving the interoperability of camera and lens metadata. Then Sam will give an update on closely related survey work a group in the VES has been doing, and on his own work on an OpenEXR mapping for camera and lens metadata.
|SMPTE RIS||Jim Helman
|11:00 am - 12:00 pm||Comandante: Near Real-Time Workflows for Virtual Production|
This talk presents our Near Real-Time (NRT) workflow, which immediately improves the quality of In-Camera VFX (ICVFX) shots, bridging on-set and post-production and allowing VFX to start on set as soon as the director says ‘cut’. The NRT workflow brings together key advancements in lens calibration, machine learning and real-time rendering to deliver higher-quality composites of what was just shot to the filmmakers in a matter of minutes.
|12:00 pm - 1:00 pm||ICVFX & Metadata Pipelines Visualized|
Come to this very visual talk as we map ICVFX metadata from creation to archive, defining best practices for pipeline, data continuity, APIs and comprehensive metadata structure.
|1:00 pm - 2:00 pm||Fathead: End-to-End Workflow |
An overview of novel virtual production pipeline innovations and technology interoperability tests for the short film Fathead conducted by the Entertainment Technology Center (ETC) at the University of Southern California (USC) and a deep dive into the virtual humans' pipeline.
|2:30 pm - 3:00 pm||Transforming Version Management and Collaboration for Film and Television|
The rise of virtual production has suddenly introduced the world of VFX to the importance of version control, and the most innovative studios are learning to fully leverage its potential. By combining over 25 years of refinement in the gaming industry with the creativity and innovation of VFX and virtual production teams, version control and asset management are revolutionizing the way we create and collaborate in the film industry. This session will look at real-life workflows that leverage Perforce to version and organize digital assets, speed up iteration, and enable cooperation across teams and studios.
|3:00 pm - 3:30 pm||Final Pixel in the Cloud|
AWS/ETC will discuss the new technology initiatives and challenges we are seeing for VP, how we are doing final pixel today with partners and customers, and how we put together the end-to-end SIGGRAPH demos to illustrate these workflows.
|3:30 pm - 4:00 pm||Delivering Markerless Motion Capture at Scale|
How do you capture performances - from a sports game, a concert, or a stage performance - in high fidelity, and what does that mean for the media & entertainment industry?
|Disguise, Electronic Arts, Move.ai||Jamie Allan
|4:00 pm - 4:30 pm||XR Studios|
Join XR Studios President J.T. Rooney for a discussion on executing interactive productions using immersive technologies. XR Studios is a leading-edge digital production company specializing in immersive technology for entertainment. Known for producing Extended Reality (XR) and Augmented Reality (AR) workflow solutions, XR Studios executes innovative experiences for some of the most renowned artists and brands across the globe. From concept to completion, XR Studios is a premier, full service solutions provider led by a network of creatives and industry leaders at the core of mixed reality technology for broadcast, live, and virtual productions. Find out more at xrstudios.live.
|XR Studios||JT Rooney|
vETC at SIGGRAPH2022 - Thursday, Aug 11th (Day 3)
|10:00 am - 10:30 am||From ICVFX to Games - Bridging two worlds |
Happy Mushroom will break down assets from Fathead and show how assets can transition between the VAD and game engines.
|Happy Mushroom||Felix Jorge|
|11:00 am - 12:00 pm||VP - Technology and Creative Supervising|
A discussion on VP. How real-time technology and art are changing rapidly and how Lux Machina is approaching the use of tools and techniques to make VP projects more successful.
|Lux Machina||Phil Galler
|12:00 pm - 1:00 pm||Killian's Game - Sony Virtual Production Strategy|
Sony will break down the making of Killian's Game - the tools, tech and strategy for on-set virtual production.
|Sony Pictures||Daniel De La Rosa|
Jamie Allan
Media, Entertainment & Broadcast Industry Lead - EMEA
|Jamie spent his formative years as a photographer and director, collaborating with many multimedia artists and music producers before shifting into technical consultancy roles for the media industries. After a decade of designing and implementing solutions for post-production, broadcast and visual effects facilities, he moved to NVIDIA where he now leads efforts to accelerate and innovate within the media, broadcast and entertainment sectors across the EMEA Region. He works with world-leading organisations to help bring their visions to life. Jamie remains an avid photographer and is highly passionate about working at the intersection of graphics, accelerated computing and AI.
Mariana Acuña Acosta
SVP, Global Virtual Production and On-Set Services
|A technologist, VR pioneer, veteran pixel pusher, Variety magazine top 10 innovator, a finalist in the Women Startup Challenge in AI, mentor for the Virtual Production Fellowships by Epic Games, software creative and entrepreneur. During her years working at Foundry, she focused on VFX & XR post-production workflows as the Head of Creative Specialists. She then became the co-founder of Glassbox, a company developing tools for Virtual Production.
Prior to her switch into tech, she worked as an on-set VFX supervisor & senior compositor and has over 13 years’ experience in the motion picture industry working at Sony Imageworks, CIS Hollywood, Digital Domain, HBO, Columbia Pictures, Fuse FX, and others. She was featured in the books “The Future of Film”, “The Augmented Workforce”, & Voices of Labor: Creativity, Craft, and Conflict in a Global Hollywood, as well as in the acclaimed documentary "Hollywood's Greatest Tricks".
Lead Producer (Cinecode)
|Andrea Aniceto-Chavez is a graduate of NYU and the lead producer for Cinecode, the virtual production and pre-visualization effort of DigitalFilm Tree. Andrea and Cinecode are focused on serving the needs of productions to prototype their stories from pre-development to physical production. Cinecode leverages game engine technology to quickly and affordably iterate stories in the virtual environment, making physical production planning and efficiency exponentially better than current methods. By lowering the cost of pre-visualization and virtual production, Cinecode democratizes storytelling and empowers creators, writers, and directors to craft their narratives with its proprietary, accessible story engine.
Vice President, Production
|Bartel, the Vice President of Production at Lux Machina and one of its Principals, has been helping pioneer in-camera visual effects on movies and shows, most recently on Black Adam, Shazam!, Red Notice and Top Gun: Maverick, through the use of virtual production technologies such as LED, projection, motion capture and simulcam. Wyatt's role at Lux Machina is to guide and manage the virtual production and physical production teams toward creating a cohesive and seamless preproduction and production experience for creatives through both new and traditional pipelines. As principal designer of a myriad of LED volumes and VP integrations, Wyatt has helped innovate while simultaneously working to lower the barrier of entry for productions around the globe by standardizing best practices and bringing years of experience to set through the power of Lux Machina's production teams.
Sr Dev Director
|I have been with EA since 1994 working in various roles over the years from artist making levels from Hogwarts Castle to Burnout tracks in the UK to managing art production in EA studios and with partners around the world. My current role is based in Vancouver, Canada managing the team that develops technology for our motion capture and 3D scanning production team. Some of this development is internal to EA, some of it involves working with talented external partners such as Move.ai.|
Director, Virtual Production
|Jason Bayever is the Director of Virtual Production at Lux Machina Consulting. He was previously a VFX Supervisor - Visual Effects Designer at Walt Disney Imagineering, Digital Effects Supervisor at Rhythm and Hues Studios and also produced the short, A Hard Place, as a 2020 Epic Games Unreal Engine (UE) Fellow (Pilot).
Jason’s credits include work on 15 feature films including Life of Pi, Night at the Museum and X2: X-Men United. He won a VES Award for Outstanding Visual Effects in a Special Venue Project for Star Wars: Rise of the Resistance (2019). He is also a three-time VES Award Nominee for: Disneyland Resort: Guardians of the Galaxy - Mission Breakout! (Outstanding Visual Effects in a Special Venue Project, 2017) and Life of Pi (Outstanding Created Environment in a Live Action Feature Motion Picture AND Outstanding FX and Simulation Animation in a Live Action Feature Motion Picture 2013).
Virtual Production Specialist & Product Owner
|Dan works closely with the Real-Time Research team to conceptualise and develop the future of filmmaking tools and workflows. He is an Unreal Engine Fellow and previously worked at Industrial Light and Magic in both Layout and Motion Capture on shows such as Star Wars and The Mandalorian. Dan went on to join Stagecraft Virtual Production on set and as part of the Virtual Art Department on productions which include The Midnight Sky and The Batman. Previously, Dan worked in both creative and pipeline technology roles at DNeg, where he contributed creatively to the award winning Ex Machina and Interstellar.|
Orbital Virtual Studios
|Chris Cope, Co-Founder, Orbital Studios
With more than three decades of experience in marketing and business development, Cope keeps the studio in Orbit. Beyond that, his mission is to bring these amazing virtual production tools to other galaxies… like Boston, Albuquerque and Atlanta, to name just a few.
|Carl has over 23 years of professional experience helping others with their technology. With a background in systems integration, his career spans several technical disciplines, including audio/visual, LED displays, data networking, and more. His award-winning customer service approach has allowed him to work his way up from an entry-level technician to Technical Director for the Virtual Production team at Planar Systems. Carl is passionate about helping his clients learn and adopt best practices in the design, installation, and maintenance of their LED display systems.|
Daniel De La Rosa
VP, Post Production
Sony Pictures Entertainment
|Daniel De La Rosa is VP Post Production with Sony Pictures Entertainment’s Technology Development team. His current role encompasses technology and innovation strategy focused on film production including advances in virtual production, next-generation cloud-based production tools & workflows, post-production technologies, HDR color pipelines, AI & machine learning, and virtual reality and other immersive storytelling technologies.
Technology Development helps to foster innovation at Sony Pictures Entertainment through the discovery, exploration, and socialization of new technologies across the entire spectrum from content creation to distribution. Tech Dev serves as a conduit for SPE to Sony Corporate and R&D, top tier tech companies, and industry standards bodies. We strive to maintain Sony’s commitment to Kando by identifying and influencing technologies to improve efficiencies, drive quality, and enhance user experiences while protecting SPE’s valuable intellectual property.
Daniel has worked in the media and entertainment industry for more than 20 years and joined Sony in 2018. Prior to Sony, Daniel worked briefly at Universal Studios in a similar capacity and at Disney, where he served as production technology executive for several feature films at the Walt Disney Studios, including Tomorrowland, Alice Through the Looking Glass, and Pete’s Dragon. He also helped produce various R&D short films exploring the areas of 4K, Hybrid 3D, HDR, and ACES. He is a recipient of a Television Academy Daytime Emmy Award and an Advanced Imaging Society Lumiere Award.
Hasraf ‘HaZ’ Dulull
|HaZ started his career in video games before moving into visual effects, later transitioning to directing and producing with his breakout feature film THE BEYOND; the indie sci-fi was released in 2018, reached number 2 in the iTunes charts, and was licensed by Netflix, becoming a profitable movie. His second feature film, 2036 Origin Unknown, starring Katee Sackhoff (Battlestar Galactica, The Mandalorian), earned a limited theatrical release before landing on Netflix. He was then hired by Disney to helm the pilot for the action comedy series Fast Layne, where he also served as EP / Creative Consultant on the entire series. HaZ is the co-founder of production company HaZimation, producing animated feature films, series and video games based on their proprietary pipeline utilising Unreal Engine. He is currently in production on RIFT (animated feature film + video game).
Barbara Ford Grant
|Barbara Ford Grant is President, Prysm Stages, a business of NEP Virtual Studios. She is responsible for growing NEP’s global network of permanent virtual production facilities, stage partnerships and client relationships in Film and TV, and for building solutions with her team to empower artists and storytelling.
Barbara has contributed to media and entertainment for nearly 30 years across both creative and technical aspects of film production. Her experience includes emerging technologies, large-scale production operations, research and development, visual effects, animation and enterprise level management.
Before joining NEP, Barbara was Chief Technology Officer at arts and entertainment company Meow Wolf overseeing the company’s mixed-reality platform and IT divisions. Before Meow Wolf, she served as SVP, Digital Production Services at HBO overseeing studio production and post-production operations and growth, and next-gen technologies. Prior to that, she worked at entertainment companies Digital Domain, Sony Pictures Imageworks, and DreamWorks Animation, where she enabled new digital human animation and rendering, digital story and post production workflows, and simulations technologies. Her credits include groundbreaking Film and TV productions Game of Thrones, Maleficent, Alice in Wonderland and the Shrek franchise.
Barbara has been active throughout her career in recognizing and mentoring current and new talent. She is a member of both the Television Academy and the Motion Picture Academy, currently serving as Chair of the Scientific and Technical Awards Committee, is a member of the Technology Committee of the Visual Effects Society, and recently worked as a strategy consultant to MovieLabs, a non-profit R&D joint venture providing universities, corporations, start-ups and others technical guidance and funding for innovative technologies in the distribution and use of motion pictures.
She studied Art History and Photography at the University of New Mexico and currently lives in Los Angeles with her family.
NEP Virtual Studios
|As a principal and founder at Lux Machina Consulting and CTO of NEP Virtual Studios, Phil Galler strives to push the boundaries of technology and design in the entertainment industry. Focusing on in-camera visual effects, real-time mixed reality and the future of filmmaking and live broadcast, he has been fortunate enough to find himself at the crossroads of technological innovation and creativity. Before starting Lux Machina Consulting, Phil worked as a programmer and project manager for one of the largest entertainment technology vendors in the world, handling design and execution of some of the most preeminent shows in the entertainment industry. Passionate about developing new tools that span high-end media servers, automated lighting, LED, and projection, Phil works across a wide variety of entertainment formats to tackle complex, large-scale entertainment projects and finds ways to empower Lux Machina’s clients with versatile design options, new technology, and unique solutions.|
|Lisa Gerber has been at the frontlines of entertainment since her start as a young performer. Turning to the other side of the camera, she spent two decades in TV & film casting, development, and as a loop group coordinator in post-production. In 2016, she took on a role connecting top tier technology companies and studios as Director of Strategic Partnerships at The Advanced Imaging Society. There, she shaped premiere thought leadership events, the HDR Summit, XR On the Bay and other platforms and programs centering around everything from VR to 5G. As a Co-Founder at Alchemy Tech & Entertainment Group, Lisa guided companies like Cisco, and multiple major studios through co-innovation partnerships.
As the Director of Business Development for M&E at PacketFabric, Lisa identifies use cases for an industry largely reliant on legacy means of transferring and storing large data sets. Lisa's professional specialty is in bringing disruptive solutions to companies seeking to simplify, unify and secure their work. Her passion lies in empowering the creative vision. She is a member of the Academy of Television Arts and Sciences (Interactive Media Peer Group) and lives in Los Angeles.
VP, Virtual Production
|As the VP of Virtual Production, Addy is immersed in studio workflows and industry standards for real-time visualization for film and television. Addy is building a Disguise creator community and strategies that will further the proliferation of virtual production.
Prior to joining Disguise, Addy designed and built world-class stages for DreamWorks, Verizon Media and NBCUniversal to help creatives visualize their dreams. Addy is well versed in VFX and animation, as well as emerging formats like AR and VR. He is an unofficial Unreal Engine Evangelist and a Fellow, and loves to converse about the possibilities of game engines as a storytelling tool.
|Mairéad Grogan is a Research Engineer at Foundry working as part of the AI Research Team. She received an MSc and PhD in Computer Science from Trinity College Dublin in 2013 and 2016, with her PhD dedicated to 3D shape and image processing. She was a postdoctoral researcher in the V-SENSE group for three years, working in various areas including light fields, color processing and image compositing. In 2020, she joined Foundry as a Research Engineer specialising in Machine Learning algorithms for 3D scene reconstruction and image processing tasks.|
|An entrepreneur with 15+ years of experience across the tech, marketing, and gaming industries, managing teams and building products. He has worked as a product manager and an executive producer on multiple top-grossing mobile game titles.
In 2016 he co-founded ICVR, an award-winning software development agency focused on exploring the boundaries of emerging technologies, XR, and gaming. ICVR currently has over 90 employees with diverse skillsets and has created 90+ products in the past years for clients including major studios, A-list celebrities and brands. ICVR also devotes extensive resources to internal R&D, building proprietary pipelines and growing the skills of its team.
WW Sr. Solution Architect Leader
|Matt Herson is a Worldwide Sr. Solution Architect Leader for Content Production, Supply Chain and Virtual Production workloads. His goal is to enable media customers around the world to successfully run production workflows on AWS. With over 15 years of experience, Matt has a passion for innovation in the post-production space. Prior to AWS, he held Director of Technology, IT Manager, DIT, and Chief Architect roles, managing a large number of teams around the world.|
Animation Director - EA Create Capture
|Vincent Hung is an Animation Director at Electronic Arts. With 18 years of experience in the gaming industry, he oversees Post Production at EA’s Motion Capture Studio. His core responsibility is establishing and maintaining the output of high quality animation data for EA Development Teams. Vincent and his team of animators deliver up to 2 million character seconds of retargeted data, per fiscal year.|
Director of Photography
Orbital Virtual Studios
|Leonidas is a DP and Steadicam operator in Local 600. His background is in features and television, including 8 feature films and TV series like Star Trek: Picard, Snowfall, The Politician, and Hollywood. He has ventured into virtual filmmaking, mastering the nuances of lighting, how to naturalize the look when working in LED volumes, and how to adapt the technology to the demands of IATSE Directors and DOPs.|
CEO and Co-Founder
|As the CEO and Co-Founder of Happy Mushroom, Felix Jorge is a vibrant creative with a genuine passion for creating interactive story-driven content. His extensive career leading Virtual Art Department teams and developing pipelines for projects such as The Mandalorian and Jungle Book, has placed him as a pioneer in the Virtual Production landscape. His “Filmmaker First” approach has initiated a collaborative and customizable process for creative and executive leads to discover their story in an intuitive way.|
|Paul Judkins is a co-founder of DeadDrop Labs, which helps VFX, animation, and post-production studios harness the compute power, flexibility, and cost-efficiency of the cloud. He previously worked for over 20 years as Senior Director of Image Technology and Production Workflow at IMAX, where he co-invented the DMR technology used to enhance Hollywood films for the large-format screen. Paul also created the IMAX 2D-to-3D technology and ran conversions of major features, including Superman and Harry Potter releases. With DeadDrop he enjoys working with very talented, creative people and studios to enable them to achieve their best results.
Sr Research Scientist
|Chloe LeGendre is a Senior Research Scientist at Netflix, working in a computer graphics group focused on research at the intersection of machine learning and filmmaking. She earned her Ph.D. in Computer Science at the University of Southern California's Institute for Creative Technologies (USC ICT) in 2019, advised by Paul Debevec. She previously worked as a Senior Software Engineer at Google Research focusing on computational photography and, before her Ph.D., as a Senior Scientist in imaging and augmented reality for the digital incubator division of L’Oréal Research and Innovation.|
|Jase has worked in the film and television industry for over a decade, both as a creator and as a VFX pipeline engineer. He has had the privilege of working on projects for HBO, Netflix, Apple, Amazon, CBS, NBC, and Starz, as well as several VES- and Emmy-nominated shows. At Perforce, Jase works with cutting-edge virtual production, animation, VFX, and game studios to increase the effectiveness of their pipelines and improve the quality of life for producers, creators, and developers.|
Community & Education Manager
|For over three decades, Ron has worked with teams to tell stories with CG content. Ron directed the game cinematics for Prince of Persia: The Sands of Time, Rainbow Six: Raven Shield, and Marvel Nemesis: Rise of the Imperfects. He worked on animated series for Mainframe Entertainment, and on VFX for feature films and music videos, most recently producing real-time VFX for “The Adam Project” for Netflix and Kanye West’s music video “24” in the Unity Engine.
Ron has been a senior team member, innovating computer graphics and storytelling production with companies such as SOFTIMAGE, CBC, Global TV, Digital Domain, Marvel, UbiSoft, Microsoft, and EA. Ron has developed XR immersive experiences, produced VFX for film and television, architectural visualization, and live real-time graphics.
Traveling to over 30 countries and working with studio clients, Ron has witnessed firsthand the rise of real-time and virtual production. These travels have inspired Ron’s quest to find the balance between life and work and to share how Unity’s real-time engine can empower creatives to achieve their vision. Ron is currently the Manager of Education and Community for Unity’s Professional Arts Tools, working on building a new methodology for education in the creative industries community.|
|As CEO at Final Pixel, Michael holds the vision of how Virtual Production will revolutionise filmmaking. On productions he leads as Director of Virtual Production, overseeing the VAD and OSVP for Film, TV and Advertising clients. With over 15 years industry experience including at BBC and EndemolShine, Michael has successfully grown companies producing content for Netflix, Discovery, Channel 4 and many more. He is a leader, an innovator and a disruptor, operating at a unique intersection in technology, creativity and business to produce world class content.|
|As one of the original partners at Lux Machina, Kris has been instrumental in pioneering virtual production technologies and solutions for companies such as Lucasfilm, Industrial Light and Magic, Facebook and Disney, among many others. His work has been seen by hundreds of millions around the globe. His work at Lux enables him to push the boundaries of color science, engineering and creative deployments of technology.|
Application Engineer, Systems Solutions
|Cassidy Pearsall is a member of the ARRI Solutions Group based in Burbank, CA. She works as an Applications Engineer and focuses on developing new workflows relating to lighting and camera metadata integration for virtual production in film, broadcast, and live events. Formerly an XR Support Specialist with disguise Technologies, she assisted on large-scale events and productions with mixed reality workflows, training and documentation. Winner of the 2018 DEG Hedy Lamarr Achievement Award for Emerging Innovators and a graduate of Carnegie Mellon's Drama Program for Video Design and Engineering, Cassidy strives to bring people together to create spectacular experiences, give technological agency to artists, and have fun with it along the way.|
Lead Research Engineer
|Niall Redmond is the Lead Research Engineer on the Real-Time Research team in Foundry. Niall has been at Foundry since 2017 working on a variety of research projects and now leads the Real-Time team developing tools and workflows for virtual production and game engines. Before Foundry, Niall received his PhD in Computer Graphics from Trinity College Dublin in 2011, focusing on non-photorealistic rendering. Niall then spent five years in games working on the yearly release of Football Manager at Sports Interactive.|
Head of Research
|Dan Ring is the Head of Research at Foundry, where he has been part of the Research team since 2010. Dan delivers the vision for Research at Foundry: advancing the science of media production. He manages products and initiatives within the Research team and fosters relationships with production companies, universities, and research institutes around the world.
Dan’s work has been recognised in several publications and blogs, including 3D World, Digital Arts, and fxguide. His work on planar tracking was nominated for a Scientific & Technical (SciTech) Academy Award.
Prior to Foundry, Dan was a member of the Sigmedia Research group in Trinity College Dublin and received his PhD in image processing in 2008. During that time his research focused on solving matting problems, sparse feature tracking, and automatic parsing and analysis of sports and medical footage.|
|Across his work on some of the largest music, corporate, broadcast, and esports projects in the world, J.T. Rooney pushes the limits of creative design and execution for live entertainment. With more than a decade of experience in the industry, J.T. leads the XR Studios team, overseeing trailblazing broadcast projects with world-renowned clients such as Twitch, Katy Perry, Snapchat, Riot Games, Billie Eilish, Viacom, Black Eyed Peas, Amazon, TikTok, and more.
|J.T. began his career creating immersive visuals and controlling advanced systems for the New World Symphony, later joining the creative agencies Lightborne and then Silent Partners Studios, where he screen-produced for international artists and brands, including Katy Perry’s Super Bowl Halftime Show, Kanye West’s Yeezus and Saint Pablo tours, the Muse Simulation Theory Tour, the 2020 MTV VMAs, and Taylor Swift’s reputation Stadium Tour. Drawing on those resources and that experience, J.T. came together with his collaborators to create, produce, and execute productions in an entirely new way, resulting in the creation of XR Studios. Specializing in live entertainment, corporate and brand events, commercials, music videos, and product work, he and his team at XR Studios continue to elevate immersive experiences using modern technology in extended, augmented, and virtual reality.
J.T. currently splits his time across Montreal, London, and Los Angeles, where XR Studios is building the city’s first XR Campus dedicated to mixed reality for live entertainment.|
Virtual Production Producer
Orbital Virtual Studios
|Leah specializes in organization and leadership, streamlining the filmmaking process. Her virtual production experience makes her invaluable when it comes to workflow and budgetary changes, as well as day-to-day operations.|
|Sam has been fascinated with the visual arts since taking an interest in photography at a young age. He eventually discovered his love for video production and visual effects, which helped set the stage for a career in virtual production. Sam has worked on projects ranging from corporate events for companies like Salesforce and Intel to episodic productions for Netflix. When not working, you can find Sam at any restaurant in the Los Angeles area that serves burritos.|
Producer & Technologist
|Indian-born American filmmaker Tom Thudiyanplackal has credits on several landmark movies, including serving as Vice President of Development and Marketing for the Bollywood blockbusters 3 Idiots and Carry On Munna Bhai, and as Producer of the Emmy-nominated nature documentary Seed: The Untold Story. He has traversed several industries and spearheaded creative ventures for Ogilvy Advertising, Reliance Communications, and Vinod Chopra Films in a career spanning more than two decades.
After completing the Summer 2021 Unreal Engine Virtual Production Fellowship with Epic Games, Thudiyanplackal joined the University of Southern California’s Entertainment Technology Center (ETC) as the Virtual Production Producer on the short film Fathead, and he is leading authorship of the white paper documenting and sharing the experiences and lessons gathered from the production.
|As a producer and technologist, Thudiyanplackal is engaged in experimentation and novel pipeline integrations for virtual production, focusing on volumetric performance capture and photogrammetry of human subjects, and on end-to-end production methods better suited for low- to mid-budget projects centered on human stories that need a high degree of realism. He is an active member of the ETC virtual production workgroup focused on investigations into virtual production education, light and color science, and LED panel technology.
Thudiyanplackal holds a BS in Physics from Mumbai University, India, and received formal education in direction and cinematography at Prague Film School. He co-developed a novel mind-body self-efficacy protocol with his wife as she spiritedly continues to overcome multiple sclerosis (MS). The method has been documented in a peer-reviewed paper published in the Journal of Evidence-Based Integrative Medicine, and the protocol’s self-care curriculum is currently taught at the University of California, San Diego (UCSD) School of Medicine.|
Virtual Production Supervisor
|Kristin graduated from The Digital Animation & Visual Effects School in Orlando, FL, before joining Halon Entertainment, where she honed her skills on a variety of projects on the studio’s slate, including commercials, video game trailers, theme park rides, and feature films such as Logan, Ford v Ferrari, and The Batman. For the upcoming Robert Zemeckis film Pinocchio, she led her team in developing the remote virtual camera and digital cinematography tools and workflows needed to meet the creative and logistical challenges of filmmaking during the pandemic era.
In 2022, Kristin made the move within the NEP Virtual Studios family to join Lux Machina as a Virtual Production Supervisor, where she continues to help advance virtual production workflows and technology, assisting creative filmmakers of all disciplines in harnessing the power of virtual production for their projects.|
Director, Adaptive & Virtual Production
|Erik Weaver is a specialist focused on the intersection of the cloud and the media and entertainment industry. He currently directs ETC’s Adaptive and Virtual Production project, which encompasses many aspects of the cloud, including transport, security, metadata, long-term storage, and the formation of an agnostic framework that unites key vendors and studios. Prior work includes ETC’s “Production in the Cloud” project as well as global M&E strategy for Western Digital. Weaver has fostered and achieved many accomplishments in the M&E market, including serving as executive producer on several ETC R&D film projects: “Ripple Effect” (2020), which examined safety protocols and virtual production during the pandemic; “Wonder Buffalo” (2018), an SXSW Interactive competition entry built around an HDR-first approach, the cloud, volumetric capture, photogrammetry, ambisonic sound, and interactivity; and “The Suitcase” (2016), a CAA People’s Choice Award winner and Tribeca 2017 entry that focused on HDR, cloud-based workflows, and Live 360. He was also chairman of NAB’s “Next Generation Media Technologies” conference (2014-2017), as well as chairman and initiator of vNAB (2015-2019), later rebranded as the ongoing vETC. Weaver’s SMPTE involvement includes the Rapid Industry Solutions (RIS) initiative, which seeks to respond with greater agility to emerging technology challenges, and the C4 MR30 standards group, which oversees and helps educate the M&E industry on using C4, semantics, and NoSQL for managing motion picture data. Weaver has spoken, and continues to speak, at numerous M&E conferences and trade shows.|
Global Leader M&E Content Production Partners
|Jack Wenzinger is responsible for AWS’s M&E partner strategy for content production partners, accelerating the adoption, innovation, and evolution of creating great content for a global audience. Jack works with subject matter experts, creatives, content owners, and technology executives to address transformation challenges, providing domain expertise for cloud adoption and the achievement of key business objectives. He has spent over 25 years defining digitization, MAM, and media workflow solutions for studios, broadcasters, sports, news, and government agencies. More recently, Jack has been driving cloud-enabled virtual production and is a graduate of the Epic Games Virtual Production Fellowship program. Jack is always ready to talk AWS cloud, virtual production, cinematography, MAM, metadata, classic car restoration, big dogs, Jeeps, Rush, or other epic rock bands.|
|GEAR ITEM||RESPONSIBLE PARTY|
|Production Tables + Chairs||Convention Center|
|Set Decoration||Convention Center / Set Dec|
|Tactical fiber cable||Planar|
|SilverDraft Director Server||SilverDraft|
|SilverDraft Render Server||SilverDraft|
|Blackmagic Design ATEM 4 M/E||Blackmagic|
|Blackmagic Teranex HDMI to 12G SDI||Blackmagic|
|Blackmagic Teranex 12G to HDMI||Blackmagic|
|Blackmagic Teranex Mini SDI to HDMI 8K Converter||Blackmagic|
|Blackmagic Design Hyperdeck Studio 4K Pro||Blackmagic|
|Blackmagic Design Web Presenter 4K||Blackmagic|
|Blackmagic Rack Shelf||Blackmagic|
|SmallHD Cine 24″ 4K High Bright Pro Monitor||B&H|
|SmallHD OLED 22″ 4K Reference Pro Monitor||B&H|
|SmallHD Vision 24″ 4K HDR Pro Monitor||B&H|
|SmallHD 702 Touch 7″ On-Camera Monitor|
|24″ 4K Computer Monitors||B&H|
|Rackmount Power Strip||B&H|
|40 RU Rack||B&H|
|20A Rackmount UPS||B&H|
|2TB SSD Recording Media||B&H|
|10G Managed Switch||B&H|
|1G Managed Switch||B&H|
|PreSonus StudioLive 16R||B&H|
|PreSonus Stage Box||B&H|
|Sennheiser Wireless Combo HH/Lav kit||B&H|
|Sennheiser Wireless LAV Kit||B&H|
|Sennheiser Wireless LAV Kit||B&H|
|JBL EON 10″||B&H|
|JBL EON 12″||B&H|
|M10 Bolts (3pk)||B&H|
|M10 Bolts (3pk)||B&H|
|Audio Cable Trunk||B&H|
|*Ambient Master Lockit w/ Cables for Ursa 12K|
|*ZEISS CP.3 XD 4 lenses (15mm, 21mm, 35mm, 50mm)|
|Blackmagic Ursa 12K||Blackmagic|
|Zeiss CZ 28-80||B&H|
|Zeiss LWZ 21-100||B&H|
|Batteries for camera (Vmount)||B&H|
|Anton Bauer Dionic XT 150Wh V-Mount Lithium-Ion Battery||B&H|
|Anton Bauer Titon 150 V-Mount Lithium-Ion Battery||B&H|
|Core SWX HyperCore 150Wh 14.8V V-Mount Battery|
|Anton Bauer LP4 Quad Battery Charger (V-Mount)|
|V-mount for BMD Ursa||B&H|
|Blackmagic Shoulder Kit||Blackmagic|
|Cartoni Focus 22||B&H|
|SmallHD Cine 7″||B&H|
|ARRI Hi-5 Hand Unit Basic Kit without Batteries||B&H|
|Hi-5 Hand Unit Body Set||B&H|
|ARRI RF-EMIP Radio Module 2400 MHz DSSS||B&H|
|ARRI LBP Battery Charger||B&H|
|Hi-5 Hand Strap||B&H|
|ARRI Hi-5 PanzerGlass||B&H|
|Batteries for Hi-5||B&H|
|ARRI cforce mini RF Basic Set 2||B&H|
|K2.0016802 1x cforce mini RF Motor Unit||B&H|
|K2.0036610 1x cforce mini Clamp Console 2 19/15mm||B&H|
|K2.0003753 1x CLM-5/cforce mini Gear m0.8/32p, 40t||B&H|
|K2.0002007 1x 2400Mhz Antenna RPSMA, straight||B&H|
|ARRI cforce Plus Motor Basic Set||B&H|
|K2.0008819 1x cforce plus Motor Unit||B&H|
|K2.0008678 1x cforce plus Clamp Console 19/15mm||B&H|
|K2.0009335 1x cforce plus Gear m0.8/32p, 60t||B&H|
|ARRI cforce mini Motor Basic Set|
|K2.0006354 1x cforce mini Motor Unit||B&H|
|K2.0006176 1x CLM-5/cforce mini Clamp Console 19/15mm||B&H|
|K2.0003753 1x CLM-5/cforce mini Gear m0.8/32p, 40t||B&H|
|UMC 4 MDR||B&H|
|Media Management – Blackmagic Cloud Pod||Blackmagic|
|(Lighting subject to change based on DP recommendation)|
|Aputure LS 600C PRO||B&H|
|Aputure LS 300x||B&H|
|LIGHT OCTADOME 120||B&H|
|Spotlight Mini Zoom||B&H|
|1 Ton Grip package or closest equivalent (C-stands, etc.)||B&H|
|ARRI Skypanel S60C||MBS|
|800W K5600 Joker Bug Lite||MBS|
|Med Lightbank ARRI Skypanel S60||MBS|