News Stories

Augmented Reality Walker – combined mobile phone and head mounted display

NTT DOCOMO exhibited the AR Walker, an activity-support system that combines a mobile phone with a compact head-mounted display.

Using this system, you can view navigation and AR content while your phone stays in your pocket.

“Most previous head-mounted displays have been large, covering both eyes. But now, we’ve used technology from Olympus to make the display very small and light. This is a great advance in terms of technology. And from now on, DOCOMO will collaborate with Olympus to enhance the design of the head-mounted display, so it can be carried and worn without looking strange. We’re giving this demo to suggest how the display could be used in combination with a DOCOMO mobile phone.”

The new technology uses a slender prism called an Optical Bar. It creates two light paths to the pupil: one carries images to the eye, while the other brings in light from the surroundings, producing a see-through image. This arrangement makes the optics very compact, with no light loss, giving bright, clear pictures.

“The idea is to have a display about 18 cm across, appearing one meter ahead. A display of that size gives the impression of looking at a mobile phone with a 3.7-inch or 4-inch screen.”
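As a rough sanity check on those figures (our own back-of-the-envelope math, not DOCOMO's), the apparent size of such a virtual display can be expressed as a visual angle:

```python
import math

def visual_angle_deg(size_m: float, distance_m: float) -> float:
    """Visual angle subtended by an object of a given size at a given distance."""
    return math.degrees(2 * math.atan(size_m / (2 * distance_m)))

# A virtual display ~18 cm across, floating 1 m ahead:
print(visual_angle_deg(0.18, 1.0))     # ~10.3 degrees

# A 4-inch (10.16 cm) phone screen at an assumed ~55 cm viewing distance
# (the 4-inch figure is a diagonal, so this is only a rough comparison):
print(visual_angle_deg(0.1016, 0.55))  # ~10.6 degrees, a similar apparent size
```

So the quoted phone-screen comparison is plausible: both subtend roughly ten degrees of the visual field.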

“The head-mounted display uses a geomagnetic sensor to detect which way the wearer is facing, so users can orient themselves just by turning their head, rather than their whole body. So we’ve built in an application where, if you face right, the display shows information about shops on your right, and if you face left, it shows information about shops on your left. If you look up, there aren’t any shops, but there is the sky, so the display shows a weather forecast.”
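The behavior described in that demo boils down to mapping head orientation to content. Here is a minimal sketch; the function names, thresholds, and sensor inputs are stand-ins of our own, not DOCOMO's actual application logic:

```python
# Hypothetical sketch of the head-orientation logic described above.
# Sensor readings would come from the display's geomagnetic sensor.

def content_for_orientation(heading_deg: float, pitch_deg: float,
                            walking_heading_deg: float) -> str:
    """Pick AR content from head pose relative to the direction of travel."""
    if pitch_deg > 30:                      # looking up: no shops, show weather
        return "weather forecast"
    # Signed angle between gaze and walking direction, wrapped to [-180, 180)
    relative = (heading_deg - walking_heading_deg + 180) % 360 - 180
    if relative > 20:
        return "shops on your right"
    if relative < -20:
        return "shops on your left"
    return "route navigation"

print(content_for_orientation(90, 0, 0))    # -> shops on your right
print(content_for_orientation(0, 45, 0))    # -> weather forecast
```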

Because the AR Walker is so small and light, it could also be used in sports. For example, it could support runners by showing their time and distance, course navigation, and calories burned.
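For illustration, metrics like those could be derived from just distance, elapsed time, and the runner's weight. The calorie figure below uses the common rough rule of about 1 kcal per kg per km for running, which is our assumption rather than anything from the article:

```python
def running_stats(distance_km: float, elapsed_s: float, weight_kg: float):
    """Pace plus a rough calorie estimate (~1 kcal per kg per km for running)."""
    pace_s_per_km = elapsed_s / distance_km
    kcal = 1.0 * weight_kg * distance_km   # rule-of-thumb approximation
    return pace_s_per_km, kcal

pace, kcal = running_stats(5.0, 27 * 60, 65.0)
print(f"pace: {pace / 60:.1f} min/km, ~{kcal:.0f} kcal")
# pace: 5.4 min/km, ~325 kcal
```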

source: http://www.diginfo.tv/2010/11/04/10-0208-r-en.php

Five-Sense Theater – “Ultra-Daily” Space (3D Video, scent/wind, touch/motion)

At the Digital Content Expo 2010, the Ikei Laboratory from Tokyo Metropolitan University exhibited the Five-Sense Theater. This is a system for experiencing an “ultra-daily” space using media for all five senses.

“Here, in particular, we’re providing a supernatural experience as content. The built-in content offers not just a visual experience, but a variety of other experiences and sensations. Specifically, users can experience scents, wind, and sensations of touch. In addition, the legs can move independently, or the body can enter a world with a different dimensionality. The aim of this system is to create sensations like those.”

In this exhibit, the user plays a game linked to 3D pictures and sound. For each scene, a large amount of information is controlled by the computer, enabling the user to experience a variety of sensations.
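Conceptually, each scene amounts to a bundle of cues dispatched to the various actuators in sync with the audio and video. The sketch below is purely illustrative; the scene names, cue fields, and devices are invented, not the Ikei Laboratory's actual control scheme:

```python
# Illustrative scene-driven multi-sensory cueing. Devices are stubbed as prints.
SCENES = {
    "forest": {"scent": "pine", "wind_mps": 1.5, "seat_motion": "sway"},
    "flight": {"scent": None,   "wind_mps": 4.0, "seat_motion": "pitch"},
}

def play_scene(name: str) -> None:
    """Dispatch one scene's cues to the (stubbed) actuators in sync with A/V."""
    cues = SCENES[name]
    if cues["scent"]:
        print(f"scent diffuser  -> release '{cues['scent']}'")
    print(f"wind fan        -> {cues['wind_mps']} m/s")
    print(f"motion platform -> {cues['seat_motion']}")

play_scene("forest")
```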

“Sensory technologies are still not very sophisticated. One issue is how to present sensations and how to render them. From now on, I think five-sense rendering will become possible, using all kinds of technologies, including IT and robotics. So it’ll be possible to experience interactive content for the five senses, such as games, in the home. It could also be used in live video communication, so you experience not just visuals, but the atmosphere as well… It’s a bit hard to describe what I mean by ‘communicating the atmosphere,’ but I think we’ll gradually become able to do that sort of thing.”

source: http://www.diginfo.tv/2010/11/04/10-0227-f-en.php


Specification for Naming VFX Image Sequences Released

ETC’s VFX Working Group has published a specification of best practices for naming image sequences such as plates and comps. File naming is an essential tool for organizing the multitude of frames that are inputs to and outputs from the VFX process. Prior to the publication of this specification, each organization had its own naming scheme, requiring custom processes for each partner and often resulting in confusion and miscommunication.

The new ETC@USC specification focuses primarily on sequences of individual images. The initial use case was VFX plates, typically delivered as OpenEXR or DPX files. However, the team soon realized that the same naming conventions can apply to virtually any image sequence. Consequently, the specification was written to handle a wide array of assets and use cases.
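To see why a shared convention matters, consider parsing a typical frame filename. The pattern below is a generic VFX-style convention used only for illustration; it is not the naming scheme defined by the ETC@USC specification:

```python
import re

# A generic "<shot>_<element>_v<version>.<frame>.<ext>" pattern, common in VFX
# pipelines. This is an illustrative convention, NOT the ETC@USC spec itself.
FRAME_RE = re.compile(
    r"^(?P<shot>[a-z0-9]+)_(?P<element>[a-z0-9]+)_v(?P<version>\d{3})"
    r"\.(?P<frame>\d{4,})\.(?P<ext>exr|dpx)$"
)

def parse_frame_name(filename: str) -> dict:
    """Split a sequence filename into its fields, or raise if nonconforming."""
    m = FRAME_RE.match(filename)
    if not m:
        raise ValueError(f"non-conforming name: {filename}")
    return m.groupdict()

print(parse_frame_name("abc010_plate_v002.1001.exr"))
# {'shot': 'abc010', 'element': 'plate', 'version': '002',
#  'frame': '1001', 'ext': 'exr'}
```

When every partner follows the same grammar, tools can validate, sort, and route incoming frames automatically instead of relying on per-vendor parsing code.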

To ensure all requirements are represented, the working group included more than two dozen participants representing studios, VFX houses, tool creators, creatives, and others. The ETC@USC also worked closely with MovieLabs to ensure that the specification could be integrated as part of their 2030 Vision.

A key design criterion for this specification is compatibility with existing practices. The chair of the VFX working group, Horst Sarubin of Universal Pictures, said: “Our studio is committed to being at the forefront of designing best industry practices to modernize and simplify workflows, and we believe this white paper succeeded in building a new foundation for tools to transfer files in the most efficient manner.”

This specification is compatible with other initiatives such as the Visual Effects Society (VES) Transfer Specifications. “We wanted to make it as seamless as possible for everyone to adopt this specification,” said working group co-chair and ETC@USC’s Erik Weaver. “To ensure all perspectives were represented we created a team of industry experts familiar with the handling of these materials and collaborated with a number of industry groups.”

“Collaboration between MovieLabs and important industry groups like the ETC is critical to implementing the 2030 Vision,” said Craig Seidel, SVP of MovieLabs. “This specification is a key step in defining the foundations for better software-defined workflows. We look forward to continued partnership with the ETC on implementing other critical elements of the 2030 Vision.”

The specification is available online for anyone to use.
