News Stories

Spring 2020 Immersive Media Challenge Final Submissions, 4 Winners Picked

The Entertainment Technology Center at USC’s Spring 2020 semester Immersive Media Challenge kicked off on January 29th for students and recent graduates from various disciplines. ETC’s Immersive Media Experience director Phil Lelyveld called upon submitters to ideate an immersive experience that, while not possible or easily built now, should be possible to build in 3-5 years. New to this third ETC Immersive Media Challenge was the question: how will 5G impact the experience?

Students detailed what technological advancements need to take place in order to make their ideas work and explained the reasoning behind their assumptions that those advancements will happen.

Participants were asked to answer six questions and create a 3-minute pitch video for their idea.

A panel of senior executives from the ETC member companies reviewed and ranked all of the submissions based on how well they accomplished the criteria of “a great idea, well told.”

The four winners are (in no particular order): Audio-Visual Animation by Jack Vomacka, Mind Palace by Sam Clempson, Affa Streaming AI Experience by Courtney Zhang, and MIC (ML for In-game Content) by Lance Newby.

Here are all of the Spring 2020 submissions.

  • Audio-Visual Animation by Jack Vomacka – Audiences of theme park attractions featuring animatronics must be kept at a sufficient distance so they do not get the chance to notice the concealed machinery. With Audio-Visual Animation, computer-generated characters will be free to move in a real-life space, no longer tethered to machinery. 6 Answers | Pitch Video.
  • Telepresent Rooms by Laura Roed – Telepresent Rooms are a network of highly immersive 3D video viewing rooms, in which users can experience live events from miles away, engage in two-way 3D video calls, and share the moment across continents through the combination of LED panel screens, immersive audio, and gesture recognition. 6 Answers | Pitch Video.

  • CleanEarth by Vivika Kapoor – This will be a game in which users can look at their phone to see their surroundings with added litter, pollution, etc., through augmented reality, showing them what their surroundings would look like in the future if we, as a global community, don’t change our harmful habits. However, if the user performs certain tasks in the real world that help the environment, they earn in-game points and some trash or pollution is removed from their app’s “world”. 6 Answers | Pitch Video.
  • Mind Palace by Sam Clempson – Mind Palace brings memory-improvement tricks into the third dimension by creating rooms for placing memory cues in space and establishing links among the memories. Mind Palace integrates a web browser, cloud file storage system, and more into visualized augmented reality. These spaces are interactive: objects in this augmented world are tangible and malleable layers in virtual space. 6 Answers | Pitch Video.
  • Affa Streaming AI Experience by Courtney Zhang – Scroll no more: AI Affectiva recommends content through speech emotion detection. Affa uses a blend of personal data analytics, content metadata analysis, and emotional detection tools to make content recommendations based on your current mood. 6 Answers | Pitch Video.

  • MIC (ML for In-game Content) by Lance Newby – MIC will make use of the latest in machine learning (ML) to turn your voice and normal conversations with NPCs into a natural, interactive, and immersive tool for making decisions, and changing how the storyline plays out, in real time, within the games we love and play. It turns non-playable characters into minor characters who participate in and can influence the action. 6 Answers | Pitch Video.

  • Legends – Meet Your Heroes by Sean Sani – My idea is to utilize advanced projector technology with some AR elements to put your heroes in front of you performing their greatest accomplishments, all in a lifelike setting where everything feels like the day it happened. Legends will put you in the room where it happens. 6 Answers | Pitch Video.

Spring 2020 Immersive Media Challenge has begun


Think like a futurist!
  • Come up with a concept for an engaging experience that should be buildable in 3-5 years
  • Explain what needs to happen that will make building it possible
  • Articulate why your assumptions are reasonable

Instructions and details are here.

 

The deadline for your final 6 answers and the 3-minute pitch video is March 2nd (Monday midnight).

Specification for Naming VFX Image Sequences Released

ETC’s VFX Working Group has published a specification of best practices for naming image sequences such as plates and comps. File naming is an essential tool for organizing the multitude of frames that are inputs and outputs of the VFX process. Prior to the publication of this specification, each organization had its own naming scheme, requiring custom processes for each partner, which often resulted in confusion and miscommunication.

The new ETC@USC specification focuses primarily on sequences of individual images. The initial use case was VFX plates, typically delivered as OpenEXR or DPX files. However, the team soon realized that the same naming conventions can apply to virtually any image sequence. Consequently, the specification was written to handle a wide array of assets and use cases.

To ensure all requirements were represented, the working group included more than two dozen participants representing studios, VFX houses, tool creators, creatives, and others. The ETC@USC also worked closely with MovieLabs to ensure that the specification could be integrated as part of their 2030 Vision.

A key design criterion for this specification is compatibility with existing practices. Chair of the VFX working group, Horst Sarubin of Universal Pictures, said: “Our studio is committed to being at the forefront of designing best industry practices to modernize and simplify workflows, and we believe this white paper succeeded in building a new foundation for tools to transfer files in the most efficient manner.”

This specification is compatible with other initiatives such as the Visual Effects Society (VES) Transfer Specifications. “We wanted to make it as seamless as possible for everyone to adopt this specification,” said working group co-chair and ETC@USC’s Erik Weaver. “To ensure all perspectives were represented we created a team of industry experts familiar with the handling of these materials and collaborated with a number of industry groups.”

“Collaboration between MovieLabs and important industry groups like the ETC is critical to implementing the 2030 Vision,” said Craig Seidel, SVP of MovieLabs. “This specification is a key step in defining the foundations for better software-defined workflows. We look forward to continued partnership with the ETC on implementing other critical elements of the 2030 Vision.”

The specification is available online for anyone to use.
