
DCS Notes – Day 1 – Session 7 – Stereography and Storytelling

Session 7: Stereography and Storytelling

Moderator(s):

Rob Engle, 3-D Visual Effects Supervisor, Sony Pictures Imageworks

Panelist(s):

Bernard Mendiburu, Analyst, Author of “3D Movie Making”

Chuck Comisky

Eric Kurland, Independent 3-DIY filmmaker, 3-DIY.com

Phil Streather, Stereo 3D Producer and Consultant, Principal Large Format

Rob Engle

“All 3D is not created equal.  It is first and foremost a very, very powerful creative storytelling tool.” – Jeffrey Katzenberg

3D Storytelling Choices

– Overall depth (interaxial spacing)

– Subject placement in depth (convergence)

– Roundness

– Where the screen edges are (floating windows)

– Traditional 2D composition

– Atmosphere (ex. smoke, clarity, etc.)

– Editorial pacing

– Depth transitions

Bernard Mendiburu

(Bernard worked on Meet the Robinsons, Monsters vs Aliens)

We are working at “overcoming millenniums of flatness” (Ray Zone).  We are where color movies were in the late 1930s.

What have we found:

– There is no screen.  We see through the screen

– Depth does not need to be realistic

– The big challenge is ‘snake oil vendors’ – people who have only been 3D experts since 2010

– Depth treatment must be integrated into the story

Chuck Comisky

It is really important to determine genre and style early in the process.  With Avatar the goals early on were 1) not to tire the audience’s eyes, and 2) to have the audience settle in within 10 minutes and forget that they are watching a 3D movie.

Journey to the Center of the Earth had a lot of fun with 3D.  Early on they chose to play with the 3D in an obvious manner.

Make sure that the content is shot well, whether performance capture or live-action capture.

In his remote stereography suite he relied on control knobs for three things: interocular, convergence, and focus for the stereo pair.

They did use depth-of-field for creative purposes – soft focus in front and behind the point of interest.

3D @ Home Consortium has 10 rules for 3D that were developed by Chuck and James Cameron (http://www.3dathome.org/webpage.aspx?webpage=1952 ) (Phil Lelyveld note: there is more advice at http://www.3dathome.org/webpage.aspx?webpage=1946 )

Phil Streather

The two really interesting areas of z-space are 1) just in front of the screen, to give a sense of intimacy, and 2) positive parallax with parallel interocular going to infinity rather than keystoning to infinity.

What is it about good 3D that makes it good?  Getting the math right.  He likes the idea of a depth budget.  Keep it to 1½ to 2% for the bulk of the feature (which works for all genres) and 5 to 10% for the special effects.  Ask yourself: ‘Who do I want in my personal space, and who do I want in the behind-the-screen space?’
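
A rough back-of-the-envelope check of what that budget means on screen (a sketch only; it assumes the percentages refer to screen parallax as a fraction of screen/image width, which the notes don’t state explicitly):

```python
# Sketch: convert a parallax budget, given as a fraction of screen width,
# into pixel disparity in the master and physical separation on the screen.
# Assumption (not stated in the notes): the 1.5-2% figure is parallax
# relative to screen/image width.

def parallax_budget(image_width_px, screen_width_m, budget_fraction):
    disparity_px = budget_fraction * image_width_px   # disparity in the master
    separation_m = budget_fraction * screen_width_m   # separation on the screen
    return disparity_px, separation_m

# Example: a 2K master (2048 px wide) projected on a 12 m cinema screen.
for pct in (0.015, 0.02, 0.05, 0.10):
    px, m = parallax_budget(2048, 12.0, pct)
    print(f"{pct:.1%} budget -> {px:.0f} px disparity, {m * 100:.0f} cm on screen")
```

The same percentage translates into very different physical parallax on a living-room TV than on a cinema screen, which is why the budget has to be judged against the target screen size.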

Eric Kurland

(Eric is a microbudget 3D filmmaker and a leader in the Los Angeles Do It Yourself 3D moviemaking community.)  There is a lot we still have to learn, but I have definitely learned that if you want to tell a good story in 3D, first and foremost you have to tell a good story.  The 3D must support the story.  3D does have a language that is different from 2D.  We all need education and practice.  I’m a big proponent of always practicing my art.  Try things and make mistakes on your own small pieces, so you don’t make mistakes on the big features.  YouTube has a 3D player now that works great.

Q&A

Do we want subtle or in-your-face 3D?  (Phil) When the seeds of life appeared around Jake in Avatar, they stayed behind the screen. If they came into the audience, the scene would have been about the seeds instead of about Jake.  (Bernard) A 3D gimmick is like a piece of candy.  Everyone loves a piece of candy, but a full bag will make you sick.

Does the up-charge for 3D theatre tickets mean that we have to give them ‘their money’s worth?’  (Chuck) We are giving them their money’s worth with the story! (Eric) Outer space in Avatar was shot to look real, but Hubble 3D (now at IMAX) had a light-year interaxial distance and it completely engaged the audience.  (Bernard) Outside of Hollywood it is hard to see a 3D movie properly in the theatre.  He would like to see a trailer/intro with the image of a rainbow spanning the screen and the message ‘if you don’t see the full rainbow, including the ends, and you don’t see the full colors of the rainbow, walk out.’ (Eric) It is very important to educate the exhibitors.  He went to a theatre where the left/right eyes were reversed.  He spoke to the projectionist, who didn’t consider it to be a problem.

DCS Notes – Day 1 – Session 6 – After the Capture: What Other Tools Exist?

Session 6: After the Capture: What Other Tools Exist?

Moderator(s):

Jim Whittlesey, Sr. Vice President, Technology, Deluxe Digital Media

Speaker(s):

Ray Harrisian, Stereographer, 3ality

Matthew DeJohn, VP/VFX Producer, In-Three Inc.

Peter Postma, FilmLight

Steve Owen, Director of Worldwide Marketing, Quantel

Peter Postma

Baselight is a software color correction product.  It is capable of multiple streams of 4K for color correction.  I’ll talk about what we added for 3D work.

Why would you perform stereo adjustments in a color corrector?  Because you have high performance tools (interactive), a good environment (big screen), the right talent for it, and the right time (at least in part) in the process for color and convergence issues to be addressed.

Stereographers will get eye fatigue if they wear any type of 3D glasses all day, so they can work in side-by-side view or with simple wipes between the two eyes.  Anaglyph is good for a quick check.  Checkerboard is useful for color adjustments – errors jump out when the content is viewed in checkerboard mode.  Color-difference and interleaved views are also useful.
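
The review modes Postma lists are easy to picture as per-pixel composites of the two eyes. A minimal NumPy sketch of three of them (illustrative only, not Baselight’s implementation; it assumes same-sized RGB float images in the 0–1 range):

```python
import numpy as np

# Toy composites of a left/right pair (H x W x 3 RGB, float 0-1) that
# approximate the review modes mentioned above. Not Baselight's code.

def anaglyph(left, right):
    """Red channel from the left eye, green/blue from the right (red/cyan check)."""
    out = right.copy()
    out[..., 0] = left[..., 0]
    return out

def checkerboard(left, right):
    """Alternate eyes pixel by pixel; color or geometry mismatches 'buzz' visibly."""
    h, w, _ = left.shape
    mask = (np.indices((h, w)).sum(axis=0) % 2).astype(bool)
    out = left.copy()
    out[mask] = right[mask]
    return out

def color_difference(left, right):
    """Signed color difference mapped around mid-gray; flat gray where the eyes match."""
    return 0.5 + (left - right) * 0.5
```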

Stereo tools include a multilayer multitrack timeline, so you can ‘gang color corrections.’  You can also synchronize the color grades, transferring information from one eye to the other.

Baselight is capable of adjusting for keystone, rotation, translation, and scale, and has floating-window adjustment tools.
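
Those geometry fixes all amount to warping one eye with a planar transform. A hedged OpenCV sketch of such a corrective warp follows; the parameter names are illustrative, not Baselight’s actual controls:

```python
import cv2
import numpy as np

# Sketch: build a single 3x3 corrective transform for one eye from scale,
# rotation, translation, and a small keystone term, then warp the image.
# Parameter names are illustrative, not Baselight's controls.

def correction_matrix(w, h, scale=1.0, angle_deg=0.0, tx=0.0, ty=0.0, keystone=0.0):
    cx, cy = w / 2.0, h / 2.0
    # Rotation and scale about the image centre, then translation (2x3 affine).
    A = cv2.getRotationMatrix2D((cx, cy), angle_deg, scale)
    A[0, 2] += tx
    A[1, 2] += ty
    H = np.vstack([A, [0.0, 0.0, 1.0]])
    # A small horizontal keystone: a perspective term that tilts the verticals.
    K = np.eye(3)
    K[2, 0] = keystone / w
    return K @ H

def correct_eye(img, **params):
    h, w = img.shape[:2]
    H = correction_matrix(w, h, **params)
    return cv2.warpPerspective(img, H, (w, h), flags=cv2.INTER_LINEAR)
```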

Steve Owen

The challenges of 3D post

What’s the problem?  The data needs to be perfectly synchronized, of the highest quality, and comfortable to watch; it must also meet the creative brief and cost not much more than 2D content.

New issues for Post:

– Where does the image sit in relation to the screen? Creative choice and delivery method / screen size issues.

– Do your edits work in Stereo 3D? What are you asking the viewer’s eyes to do?

What are five good questions for any Post house to ask when preparing a quote for a 3D job, especially if your goal is to stay in business and make money?

1 What do the rushes look like?  ‘Fix it in post’ is the wrong attitude for 3D!  Quality control on-set is vital.  Don’t quote a job until you’ve seen the rushes.

2 What does the customer want?  For reference, Sky has posted their requirements for stereoscopic alignment on their website.

3 How much client interaction will there be?  Reliable viewing and play-out is critical to a good client experience.

4 Can your pipeline cope? (Will we make money on the job?)  3D is twice the data to manage, process, move, and store.  Quality matters like never before.  It will stress your network infrastructure, storage, and management systems.  A good 2D pipeline doesn’t guarantee success in 3D.

5 How can your technology help?  Can you do more in one suite?  Do you have real-time tools?  Stereo 3D time-saving tools: stereo color balance, geometry correction, image analysis, and tools to report divergence.
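
On the last point, a divergence report can be as simple as measuring horizontal offsets between features matched across the two eyes and flagging frames whose parallax exceeds the budget. A rough OpenCV sketch (the method and thresholds are assumptions, not Quantel’s tooling):

```python
import cv2
import numpy as np

# Rough QC sketch: estimate a frame's horizontal disparity range from ORB
# feature matches and flag it if the parallax exceeds a budget. Illustrative
# only; not Quantel's actual tools.

def disparity_range(left_gray, right_gray):
    orb = cv2.ORB_create(nfeatures=1000)
    kL, dL = orb.detectAndCompute(left_gray, None)
    kR, dR = orb.detectAndCompute(right_gray, None)
    if dL is None or dR is None:
        return None
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(dL, dR)
    dx = np.array([kR[m.trainIdx].pt[0] - kL[m.queryIdx].pt[0] for m in matches])
    dy = np.array([kR[m.trainIdx].pt[1] - kL[m.queryIdx].pt[1] for m in matches])
    good = np.abs(dy) < 2.0        # drop likely mismatches (large vertical offset)
    if not good.any():
        return None
    return float(dx[good].min()), float(dx[good].max())

def divergence_report(left_gray, right_gray, width_px, budget=0.02):
    rng = disparity_range(left_gray, right_gray)
    if rng is None:
        return "no reliable matches"
    lo, hi = rng
    status = "OVER" if max(abs(lo), abs(hi)) > budget * width_px else "within"
    return f"parallax {lo:+.0f} .. {hi:+.0f} px ({status} budget)"
```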

Ray Harrisian

3ality’s stereo image processing box analyzes the left/right image down to 1/100th of a degree (or whatever the unit is) for accurate alignment.

Where to place things in space may not be obvious when a shot is captured; it is often revealed during editing.  In U2 3D every transition (e.g. many of the cuts) is a visual effect.  They did dynamic depth balancing.  For example, they created the effect in Post of Edge and Bono looking at each other across a cut.
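
That kind of dynamic depth balancing can be pictured as ramping a re-convergence shift (horizontal image translation) on one eye over the frames leading into a cut, so the depth of the outgoing and incoming shots meets. A toy NumPy sketch of the idea (an assumption about the mechanics, not 3ality’s actual process):

```python
import numpy as np

# Toy sketch of a depth transition: ramp a horizontal image translation (HIT)
# on one eye across a run of frames so convergence matches at the cut.
# Real tools would crop or pad at the frame edge; np.roll wraps for simplicity.

def hit_shift(eye, shift_px):
    """Shift one eye horizontally by a whole number of pixels."""
    return np.roll(eye, int(round(shift_px)), axis=1)

def depth_ramp(eye_frames, start_shift_px, end_shift_px):
    """Linearly ramp the convergence shift for one eye over a list of frames."""
    shifts = np.linspace(start_shift_px, end_shift_px, len(eye_frames))
    return [hit_shift(frame, s) for frame, s in zip(eye_frames, shifts)]
```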

Matthew DeJohn

Dimensionalization as a tool after capture (“Dimensionalization” is a trademarked term of In-Three!) is useful to create and alter content, ensure viewer comfort, and enhance artist control.

Three basic stages for dimensionalization (a minimal sketch of the view synthesis they imply follows the list below):

– isolation of the elements

– depth generation

– paint process

– (sidebar: transparencies, particles, motion blur, etc. are handled both here and elsewhere)
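
The depth-generation stage drives what is effectively a view synthesis step: each pixel is offset horizontally in proportion to its assigned depth, and the holes that open up behind foreground objects are what the paint process has to fill. A minimal sketch of that offsetting step (illustrative only; In-Three’s actual pipeline is proprietary):

```python
import numpy as np

# Minimal sketch of depth-based view synthesis: offset each pixel horizontally
# by a per-pixel disparity (derived from the generated depth) and leave holes
# where nothing lands - those are the regions the paint process fills.

def synthesize_eye(image, disparity_px, hole_value=0.0):
    """image: H x W x 3 float array; disparity_px: H x W signed pixel offsets."""
    h, w, _ = image.shape
    out = np.full_like(image, hole_value)
    xs = np.arange(w)
    for y in range(h):
        target = xs + np.round(disparity_px[y]).astype(int)
        valid = (target >= 0) & (target < w)
        # Later pixels overwrite earlier ones where they collide (a crude
        # stand-in for proper occlusion handling).
        out[y, target[valid]] = image[y, xs[valid]]
    return out
```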

Doing dimensionalization wrong results in:

– rubber sheet effect – foreground objects wrap around into background objects

– cardboard cutout effect – insufficient roundness

– inaccurate depth layout – conflicting depth cues

– lack of depth continuity

– bad compositing (ex. hair, hard edges that conflict with motion blur, transparencies (foregrounds sticking to background), no paint / auto paint)

Doing dimensionalization right results in:

– good depth, distinct separation, nuanced / detailed, natural fall-off, matched 2D & 3D depth cues, solid depth continuity, and depth that is adjustable to the client’s desire

– compositing is VFX-caliber: hair as good as green screen, stereo-accurate transparency, real image data for occlusions, and high-quality mattes.

When to use Dimensionalization:

– Difficult to capture shots

– Prohibitive environments

– Prohibitive production schedule

– Benefits of traditional 2D shoot

– Flexibility in artistic decisions

– Failed stereo capture

– Alternative stereo capture

– Catalogue titles

Q & A

Regarding catching errors, how much is manual versus software-based detection?  (In-Three) Human intervention is a necessary part of the process.

When dimensionalizing, how do you know what is behind the object being dimensionalized?  (In-Three) You follow standard visual effects techniques and processes.


Specification for Naming VFX Image Sequences Released

ETC’s VFX Working Group has published a specification of best practices for naming image sequences such as plates and comps. File naming is an essential tool for organizing the multitude of frames that are inputs to and outputs from the VFX process. Prior to the publication of this specification, each organization had its own naming scheme, requiring custom processes for each partner, which often resulted in confusion and miscommunication.

The new ETC@USC specification focuses primarily on sequences of individual images. The initial use case was VFX plates, typically delivered as OpenEXR or DPX files. However, the team soon realized that the same naming conventions can apply to virtually any image sequence. Consequently, the specification was written to handle a wide array of assets and use cases.

To ensure all requirements were represented, the working group included over two dozen participants representing studios, VFX houses, tool creators, creatives, and others.  The ETC@USC also worked closely with MovieLabs to ensure that the specification could be integrated as part of their 2030 Vision.

A key design criterion for this specification is compatibility with existing practices.  Chair of the VFX working group, Horst Sarubin of Universal Pictures, said: “Our studio is committed to being at the forefront of designing best industry practices to modernize and simplify workflows, and we believe this white paper succeeded in building a new foundation for tools to transfer files in the most efficient manner.”

This specification is compatible with other initiatives such as the Visual Effects Society (VES) Transfer Specifications. “We wanted to make it as seamless as possible for everyone to adopt this specification,” said working group co-chair and ETC@USC’s Erik Weaver. “To ensure all perspectives were represented we created a team of industry experts familiar with the handling of these materials and collaborated with a number of industry groups.”

“Collaboration between MovieLabs and important industry groups like the ETC is critical to implementing the 2030 Vision,” said Craig Seidel, SVP of MovieLabs. “This specification is a key step in defining the foundations for better software-defined workflows. We look forward to continued partnership with the ETC on implementing other critical elements of the 2030 Vision.”

The specification is available online for anyone to use.
