News Stories

Cure for 3D Viewing Motion Sickness in the Works in Canada

Toronto-based 3D Film Innovation Consortium has brought a host of private and public sector partners together to research stereoscopic 3D perception and provide best practices for 3D cinematography.

A cure for nausea or headaches from 3D movie or TV viewing is in the works in Canada.

Starz Animation Toronto has teamed up with Canuck academics to ease or even end motion sickness that movie and TV audiences experience in a virtual world.

The fault, says York University film professor Ali Kazimi, lies in 3D stereoscopic cinematography.

“The reason people feel nauseous or have discomfort, headaches or eye strain is when there’s something being done improperly or incorrectly in the Stereoscopic 3D process,” Kazimi explained.

The challenge, adds Rob Burton, vp of technology at Toronto-based Starz Animation, is controlling stereoscopic camera parameters to generate comfortable 3D images where possible.

Discovering a more viewer-friendly stereoscopic film language and camera work is especially important as 3D audiences increasingly move from stationary seats in a cinema to watching stereoscopic content in homes, where neck angles and sight-lines vary greatly as viewers move around a 3D TV set.

“Unless you shoot multiple versions for different viewing venues, it’s difficult to make a one-size-fits-all model for stereoscopic 3D production,” Burton said.

To produce stereoscopic 3D imagery that leaves viewers less queasy, Starz Animation has pacted with Kazimi to make Lovebirds, a 3D live action/animated short.

Burton explained Starz Animation already had a stereoscopic 3D unit in Toronto, but tended to put storytelling first before considering how the 3D production process helps tell a story.

With Lovebirds the Starz Animation crew talked extensively with Kazimi, the live action director, at the rough story-boarding stage to consider how stereoscopic 3D camera parameters might impact the storytelling, and what the format's potential caveats and pitfalls were.

“Being able to simulate what you would get with a real stereoscopic camera in a CG environment before stepping onto a live action stage was really invaluable,” Burton said.

Lovebirds, created and directed by Gary Dunn, portrays a hapless romantic bird's first experience with dating; the character is conceived in a CG environment and set against a live action background.

Stereoscopic 3D imagery helps viewers appreciate the physical size and scale of characters in a movie or TV show through the use of binocular vision.

So the Starz team and Kazimi had to consider how they wanted an audience to perceive their position in relation to a tiny 3D bird.

“Do we want the viewers to feel like their chin is on the ground looking at a bird that’s four inches tall, or do we want the viewer to feel they are on-par in terms of scale with the bird?” Burton questioned.

The decision was to make the bird larger in scale in the stereoscopic 3D camera shots, drawing viewers further into the bird's world starting with the opening scene.

“We chose stereoscopic camera parameters that convey the appropriate physical size and scale and, as we get into the story over the next few shots, we slowly bring the viewer down to this smaller size by manipulating the stereoscopic parameters,” Burton explained.

Kazimi adds that calculating how best to place the animated birds against the live action background when producing Lovebirds offered vital clues to the relationship between stereoscopic 3D live action and animation integration, and scaling and eye vergence.
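
For readers curious about the numbers behind these decisions, the relationship Burton and Kazimi describe can be sketched with the standard textbook parallax model. The snippet below is a rough, self-contained illustration of that model only, not the Starz or York University tooling, and every scene value in it (lens, screen size, distances) is invented for the example: shrinking the interaxial below human eye separation (hypostereo) keeps a close-up subject such as a tiny bird fusable on screen and makes the scene read as larger than life, while a full 65 mm interaxial on the same macro framing produces parallax far beyond the eyes' divergence limit, exactly the kind of error that causes the eye strain Kazimi describes.

```python
# Back-of-the-envelope stereoscopic parallax model (textbook approximation,
# not the Starz/York pipeline). All distances are in metres.

EYE_SEPARATION = 0.065  # average human interocular distance, ~65 mm


def screen_parallax(interaxial, focal_length, sensor_width,
                    screen_width, convergence_dist, object_dist):
    """Horizontal parallax of an object as projected on the screen.

    Uses the common converged-camera approximation:
        parallax = magnification * focal * interaxial * (1/Zc - 1/Zo)
    where magnification scales the camera sensor up to the screen.
    Positive values place the object behind the screen plane.
    """
    magnification = screen_width / sensor_width
    return magnification * focal_length * interaxial * (
        1.0 / convergence_dist - 1.0 / object_dist)


def perceived_depth(parallax, viewing_dist, eye_sep=EYE_SEPARATION):
    """Distance at which a viewer's eyes converge for a given screen parallax."""
    return viewing_dist * eye_sep / (eye_sep - parallax)


if __name__ == "__main__":
    # A "tiny bird" close-up: cameras converged on a bird 0.5 m away, with the
    # background 2 m away, shown on a 10 m cinema screen viewed from 12 m.
    # A small interaxial (hypostereo) keeps the background fusable and makes
    # the scene read larger than life; the native 65 mm spacing pushes the
    # background past the divergence limit -- a classic source of eye strain.
    for interaxial_mm in (3.0, 10.0, 65.0):
        p = screen_parallax(interaxial=interaxial_mm / 1000.0,
                            focal_length=0.035,   # 35 mm lens
                            sensor_width=0.036,   # full-frame sensor
                            screen_width=10.0,
                            convergence_dist=0.5, object_dist=2.0)
        if p >= EYE_SEPARATION:
            depth = "diverges (unfusable)"
        else:
            depth = f"{perceived_depth(p, viewing_dist=12.0):.1f} m from the viewer"
        print(f"interaxial {interaxial_mm:4.0f} mm -> background parallax "
              f"{p * 1000:6.1f} mm, perceived depth {depth}")
```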

Easing or eliminating the discomfort viewers experience with 3D content is vital, the Canadian academic insists, before stereoscopic 3D gets a "bad name" and audiences grow frustrated with it.

The Lovebirds shoot is part of the Toronto-based 3D Film Innovation Consortium, which has brought a host of private and public sector partners together to research stereoscopic 3D perception and provide best practices for 3D cinematography.

source: http://ca.movies.yahoo.com/news/usmovies.thehollywoodreporter.com/cure-3d-viewing-motion-sickness-works-canada

H-P Tech Officer Talks R&D Spending, 3-D Technology (6 min video and text)

Hewlett-Packard’s spending on research and development isn’t a good predictor of innovation at the company, Philip McKinney, the vice president and chief technology officer of H-P’s personal systems group, said in an interview with the Wall Street Journal Friday. Mr. McKinney was responding to IBM CEO Samuel Palmisano’s recent remarks about what he sees as H-P’s lack of inventiveness.

“You need to look at what kind of revenue growth, what kind of margin growth you can generate off your R&D spend. Let that be the measurement,” Mr. McKinney said. “When you’ve got 47,000 engineers, 450 PhD researchers at H-P Labs, and a culture that Bill [Hewlett] and Dave [Packard] established when H-P was created – that culture is still there,” he added.

Mr. McKinney also discussed the future of 3-D technology, saying he expects 3-D to grow more in the consumer market in the next two years. But in 2013, he said, 3-D will become more prominent in the business space, citing H-P’s work with a health care provider on 3-D medical applications as an example.

When asked about the recent spate of M&A activity in tech, Mr. McKinney said he thinks tech companies will be focused on acquisitions in security solutions in the near future. “Right now everyone’s focused on cloud computing, as obviously we are with 3PAR. But cloud computing and security solutions go hand in hand. When customers think about their data living in a cloud, they get nervous, so you need systems in place to keep that secure.”

By Lauren Goode

original post: http://blogs.wsj.com/digits/2010/09/24/h-p-tech-officer-talks-rd-spending-3-d-technology/


Specification for Naming VFX Image Sequences Released

ETC’s VFX Working Group has published a best-practices specification for naming image sequences such as plates and comps. File naming is an essential tool for organizing the multitude of frames that are inputs and outputs from the VFX process. Prior to the publication of this specification, each organization had its own naming scheme, requiring custom processes for each partner, which often resulted in confusion and miscommunication.

The new ETC@USC specification focuses primarily on sequences of individual images. The initial use case was VFX plates, typically delivered as OpenEXR or DPX files. However, the team soon realized that the same naming conventions can apply to virtually any image sequence. Consequently, the specification was written to handle a wide array of assets and use cases.
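
As a rough illustration of why a shared convention matters, the sketch below parses and gap-checks a plate sequence against a hypothetical naming pattern of the general shape such a specification addresses (shot, element, version, zero-padded frame counter, extension). The actual ETC@USC field layout is defined in the published document and is not reproduced here.

```python
import re
from typing import List, NamedTuple

# Hypothetical frame-naming pattern for illustration only; the real
# specification defines its own fields and separators.
FRAME_NAME = re.compile(
    r"^(?P<shot>[A-Za-z0-9]+)_"
    r"(?P<element>[A-Za-z0-9]+)_"
    r"v(?P<version>\d{3})\."
    r"(?P<frame>\d{4,})\."
    r"(?P<ext>exr|dpx)$"
)


class Frame(NamedTuple):
    shot: str
    element: str
    version: int
    frame: int
    ext: str


def parse_frame_name(name: str) -> Frame:
    """Split one image file name into its naming-convention fields."""
    m = FRAME_NAME.match(name)
    if m is None:
        raise ValueError(f"name does not follow the convention: {name!r}")
    return Frame(m["shot"], m["element"], int(m["version"]),
                 int(m["frame"]), m["ext"])


def missing_frames(names: List[str]) -> List[int]:
    """Report gaps in an otherwise contiguous plate sequence."""
    frames = sorted(parse_frame_name(n).frame for n in names)
    present = set(frames)
    return [f for f in range(frames[0], frames[-1] + 1) if f not in present]


if __name__ == "__main__":
    plate = [f"sh010_bgPlate_v002.{f:04d}.exr" for f in (1001, 1002, 1004)]
    print(parse_frame_name(plate[0]))
    print("missing:", missing_frames(plate))  # -> [1003]
```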

To ensure all requirements are represented, the working group included more than two dozen participants representing studios, VFX houses, tool creators, creatives and others. The ETC@USC also worked closely with MovieLabs to ensure that the specification could be integrated as part of their 2030 Vision.

A key design criterion for this specification is compatibility with existing practices. Chair of the VFX working group, Horst Sarubin of Universal Pictures, said: “Our studio is committed to being at the forefront of designing best industry practices to modernize and simplify workflows, and we believe this white paper succeeded in building a new foundation for tools to transfer files in the most efficient manner.”

This specification is compatible with other initiatives such as the Visual Effects Society (VES) Transfer Specifications. “We wanted to make it as seamless as possible for everyone to adopt this specification,” said working group co-chair and ETC@USC’s Erik Weaver. “To ensure all perspectives were represented we created a team of industry experts familiar with the handling of these materials and collaborated with a number of industry groups.”

“Collaboration between MovieLabs and important industry groups like the ETC is critical to implementing the 2030 Vision,” said Craig Seidel, SVP of MovieLabs. “This specification is a key step in defining the foundations for better software-defined workflows. We look forward to continued partnership with the ETC on implementing other critical elements of the 2030 Vision.”

The specification is available online for anyone to use.
