News Stories

"Immersive Digital Entertainment" VR System

[Philip Lelyveld comment: the combined use of an isolating head-mounted display and motion capture makes this an interesting implementation (watch the short video), although it could probably be done with an off-the-shelf HMD and a few Kinects.]

[by DigInfo TV]

Crescent, Inc. showed its Immersive Digital Entertainment VR system at the 3D & Virtual Reality Exhibition.

The Immersive Digital Entertainment system makes use of a head-mounted display, so users can experience a full 360-degree virtual space. Takahiko Akiyama of 4D Brain, the visual effects art director for the Final Fantasy movie, also worked on content development.

“Right now there are industrial designs for this kind of so-called virtual reality where a person can enter a virtual world in a 360 degree room, which are used in industry as well as medicine. We think it could also be used for entertainment. Up until now we have enjoyed these kinds of scenes on TV and in movies, but it is not like the viewer is actually inside the space on that kind of screen, so we want to make an entertainment system where people can enjoy being inside that world.”

The Immersive Digital Entertainment platform integrates a wearable head-mounted display, Virtools real-time rendering technology, image analysis technology, and super-high-definition motion capture cameras, which together make it possible to capture users' movements, as well as the items they grasp in the VR space, in real time.

“For example, if a new device were combined with this system the images could be made to change rapidly depending on what the user is thinking, so it could be used for relaxation or for biofeedback, as the user could see changes in the image depending on what the user is thinking, and then feel changes in himself. That would be more than a simple game, it would be a new type of entertainment, which is why we wanted to call it Immersive Digital Entertainment.”

See the original story here: http://www.diginfo.tv/2011/07/06/11-0138-f-en.php

NTU and Fraunhofer launch 3D technology virtual monitor (and Digital Media Research Center)

[By Vimita Mohandas, Channel News Asia]

To find out the meaning of a particular Chinese character, all one has to do is point it towards a camera, and an image of what it represents will be flashed on a monitor.

Such an augmented-reality Chinese learning aid for non-Chinese speakers is just one of the prototypes developed by the new research centre jointly set up by Nanyang Technological University (NTU) and Fraunhofer, one of Europe's largest research organisations, to develop 3D technology.

The S$14 million Fraunhofer Interactive Digital Media @NTU, funded by NTU, Fraunhofer and the Media Development Authority, will focus on interactive and digital media research and will also look into commercialising applications.

The centre also hopes to develop breakthrough projects for the benefit of sectors such as tourism, culture and transport.

Engaging youths will be key for the centre as media technologies constantly change.

Professor Freddy Boey, Provost-Designate at NTU, said: “Today, our young people are very media-savvy, very visual. Everybody has an iPhone, iPad and this is where we’re coming in. With good graphic improvements, it will engage our young people, not only in Singapore but the whole of Asia.”

The centre will also work on research areas like computer graphics and computer vision.

The facility will also offer joint PhD programmes in visual computing with two leading universities from Germany and Austria.

As a start, a total of 20 scholarships will be given to promising PhD students – 10 from NTU and another 10 from the universities in Germany and Austria.

Under this programme, graduate researchers can spend at least a year in Austria or Germany doing their thesis research at the Fraunhofer institutes there, and vice versa.

Agreements for the two joint programmes with Graz University of Technology and Technische Universität Darmstadt have been signed.

See the original story here: http://www.channelnewsasia.com/stories/singaporelocalnews/view/1136265/1/.html


Specification for Naming VFX Image Sequences Released

ETC’s VFX Working Group has published a specification of best practices for naming image sequences such as plates and comps. File naming is an essential tool for organizing the multitude of frames that are inputs and outputs of the VFX process. Prior to the publication of this specification, each organization had its own naming scheme, requiring custom processes for each partner, which often resulted in confusion and miscommunication.

The new ETC@USC specification focuses primarily on sequences of individual images. The initial use case was VFX plates, typically delivered as OpenEXR or DPX files. However, the team soon realized that the same naming conventions can apply to virtually any image sequence. Consequently, the specification was written to handle a wide array of assets and use cases.
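The specification itself defines the exact field layout, which is not reproduced here. As a rough illustration of the general problem it addresses, the sketch below assumes the common `<name>.<frame>.<ext>` pattern used by many OpenEXR/DPX pipelines and shows how a consistent convention lets tooling parse sequence names and detect missing frames; the function and pattern names are hypothetical, not part of the ETC@USC spec.

```python
import re

# Illustrative only: assumes the widespread <name>.<frame>.<ext> pattern
# for image sequences (e.g. "shotA_plate.0101.exr"); the ETC@USC
# specification defines its own, more detailed field layout.
SEQ_RE = re.compile(r"^(?P<name>.+)\.(?P<frame>\d+)\.(?P<ext>exr|dpx)$")

def parse_frame(filename):
    """Split a sequence filename into (name, frame number, extension)."""
    m = SEQ_RE.match(filename)
    if not m:
        raise ValueError(f"not a recognized sequence name: {filename}")
    return m.group("name"), int(m.group("frame")), m.group("ext")

def missing_frames(filenames):
    """Report gaps in the frame range covered by one sequence."""
    frames = sorted(parse_frame(f)[1] for f in filenames)
    present = set(frames)
    return [n for n in range(frames[0], frames[-1] + 1) if n not in present]
```

A shared convention like this is what makes such tooling portable between partners: once every delivery follows one pattern, gap detection and ingest can be automated instead of re-implemented per vendor.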

To ensure all requirements were represented, the working group included more than two dozen participants representing studios, VFX houses, tool creators, creatives and others. The ETC@USC also worked closely with MovieLabs to ensure that the specification could be integrated as part of their 2030 Vision.

A key design criterion for this specification is compatibility with existing practices. Chair of the VFX working group, Horst Sarubin of Universal Pictures, said: “Our studio is committed to being at the forefront of designing best industry practices to modernize and simplify workflows, and we believe this white paper succeeded in building a new foundation for tools to transfer files in the most efficient manner.”

This specification is compatible with other initiatives such as the Visual Effects Society (VES) Transfer Specifications. “We wanted to make it as seamless as possible for everyone to adopt this specification,” said working group co-chair and ETC@USC’s Erik Weaver. “To ensure all perspectives were represented we created a team of industry experts familiar with the handling of these materials and collaborated with a number of industry groups.”

“Collaboration between MovieLabs and important industry groups like the ETC is critical to implementing the 2030 Vision,” said Craig Seidel, SVP of MovieLabs. “This specification is a key step in defining the foundations for better software-defined workflows. We look forward to continued partnership with the ETC on implementing other critical elements of the 2030 Vision.”

The specification is available online for anyone to use.
