News Stories

MPEGIF Group to Issue 3DTV ‘Vocabulary’

Link to original post

by Deborah D. McAdams, July 28, 2010

LOS ANGELES: The MPEG Industry Forum is putting together a vocabulary for defining video quality. Motorola’s Sean McCarthy Ph.D., chairman of MPEGIF’s 3D Working Group, intends to circulate the draft for member comments this week.

“When we started talking about compression, it couldn’t be separated from video quality,” he told TVB recently. “We didn’t have a language for that. We’re looking at a creative vocabulary.”

The work is being done in conjunction with 3D@Home, a consortium of companies working on 3D content, transmission and display.

“‘Artifact,’ for example, refers to anything man-made or unnatural,” McCarthy said. “3D on a 2D display is, by its nature, slightly unnatural.”

With high-definition video, artifacts are referred to as “noise,” “macro-blocking,” or “motion blur.”

“Those are great for monocular,” he said. “When you talk about binocular and depth perception, you’re engaging another part of the brain.

“You might have ‘ringing’ around an edge,” he continued. “If you have a high-contrast sharp border, and you compress it too strongly, the edge will have waves or ripples. That’s a ringing artifact in 2D.”

Ringing on the two images of a stereoscopic 3D pair would not likely be in sync, so it would produce a sparkling or fluctuating appearance along an edge where depth is supposed to be perceived.

“Depth ringing” then refers to this phenomenon in 3D.
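The underlying 2D effect is easy to reproduce. The Python sketch below is a hypothetical illustration, not part of MPEGIF’s vocabulary work: it models heavy compression as discarding the high-frequency detail of a sharp edge, and the overshoot and undershoot left behind are the ripples McCarthy describes. When the left-eye and right-eye views ring differently, that mismatch is what would read as depth ringing.

import numpy as np

def compress_edge(signal, keep_fraction=0.05):
    # Crudely model strong compression by zeroing most high-frequency
    # coefficients, roughly what aggressive quantization does to fine detail.
    spectrum = np.fft.rfft(signal)
    cutoff = max(1, int(len(spectrum) * keep_fraction))
    spectrum[cutoff:] = 0.0
    return np.fft.irfft(spectrum, n=len(signal))

# A high-contrast border: black on the left, white on the right.
edge = np.concatenate([np.zeros(128), np.ones(128)])
ringy = compress_edge(edge)

# Overshoot and undershoot near the border are the 2D ringing artifact;
# if the two eye views ring differently, the mismatch appears as the
# sparkling edge described above.
print("max overshoot:", float(ringy.max() - 1.0))
print("max undershoot:", float(-ringy.min()))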

Another term, “cardboarding,” refers to the phenomenon in which subjects in 3D content look like flat images set against a deeper flat background, like a diorama.

McCarthy said there are about four pages of unique terms for stereoscopic 3DTV. His group’s intent is to make them available across the industry in about a month.

Single Truck 3D/2D capture of Sheryl Crow concert

If 3-D makes you happy

by Michael Grotticelli July 22nd, 2010

Link to original post

An intimate live show by Sheryl Crow was captured in New York City on Wednesday by All Mobile Video’s new 53ft Epic 3-D production truck, in both 2-D and 3-D, for a future edition of the PBS series “Soundstage.” The production is one of the first 2-D/3-D events captured with a single truck and represents a model for producing such events more economically than using two separate trucks.

The concert, promoting songs from Crow’s new “100 Miles from Memphis” release, will be broadcast on PBS in 2-D in January, with the 3-D footage archived for future use such as Blu-ray distribution or carriage on a 3-D network.

The show was recorded live to tape with 13 Sony HDC-1500R HD cameras and some box-style units, all with Canon HD lenses. Ten were used in five pairs mounted on 3Ality Digital camera rigs. Two beam-splitter rigs were operated on tracking dollies to give a nice effect without having to zoom as often. Another side-by-side rig was mounted at the back of the Roseland Ballroom venue for wide shots of the stage. The 2-D production made use of all 13 HD camera views, including one mounted on a boom and two used handheld. None of the 3-D rigs were operated from the shoulder.

The truck features Sony’s new MVS-8000X production switcher, SRW-5800 HDCAM SR recording decks and several prototype Luma 3-D production monitors (production models will be available this fall). The one switcher was used for both the 2-D and 3-D shows, which were taped as ISO records. There’s also a Pesa 480 x 480 router and a Studer Vista 8 digital audio console. The Epic 3-D truck also includes a convergence area in the middle of the truck, where one technician per camera rig tweaked the left and right signals to make sure each rig’s output looked right in 3-D before the director cut it into the final show. A separate 61ft “B” unit is being built to house convergence operators, in order to handle larger productions.
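The article doesn’t say how those adjustments were made, but a common convergence control in stereoscopic production is horizontal image translation: shifting the two eye views horizontally in opposite directions to change where the scene sits relative to the screen plane. The Python sketch below is a hypothetical illustration of that idea, not a description of AMV’s actual toolchain.

import numpy as np

def adjust_convergence(left, right, shift_px):
    # Shift each eye's view horizontally by shift_px/2 in opposite directions.
    # With this convention, a positive shift_px pushes the scene back toward
    # (or behind) the screen plane; a negative value pulls it forward.
    # A real tool would crop or pad the edges; np.roll wraps around, which is
    # acceptable only in a toy example.
    half = shift_px // 2
    left_adj = np.roll(left, -half, axis=1)
    right_adj = np.roll(right, half, axis=1)
    return left_adj, right_adj

# Toy 1080p frames standing in for the left-eye and right-eye camera feeds.
h, w = 1080, 1920
left = np.zeros((h, w), dtype=np.uint8)
right = np.zeros((h, w), dtype=np.uint8)
left_adj, right_adj = adjust_convergence(left, right, shift_px=12)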

HD Ready, an Illinois-based post-production company (which has posted several other 3-D events, including a recent Kenny Chesney concert), produced the Crow show under the supervision of Joe Thomas, a director by trade and a founder of HD Ready. Thomas directed the concert from the truck, carefully instructing camera operators on how best to frame their shots, and seemed excited about the concept of practically producing two shows simultaneously.

“We used some of the left-eye cameras on the 3-D rigs for our 2-D show, and everything went very smoothly,” Thomas said. “Contrary to what others are saying, I don’t think you need to stay on shots longer with 3-D; you just have to make sure all of your camera views complement each other, which takes more work and a lot of time. Proper framing is very important. Luckily, we were not broadcasting a live show, so we could control things a lot more.”

He said that many of the things directors might want to get rid of in a 2-D shot, such as microphone stands and lighting trusses, should be kept in 3-D because they add depth and make the shot more interesting to watch.

Before it airs on PBS, the show will be post-produced at HD Ready using a Quantel Pablo system to boost the 3-D effects and fix any uncomfortable viewing issues. “In post we’re adding another layer of work that has to be done, because 3-D done wrong looks terrible.”

Setup for the production took place the day before and required slightly more time than preparations for a typical 2-D HD show. Eric Duke, president of AMV, said another advantage of the Roseland Ballroom is that the venue is intimate and rather wide, allowing the crew to move cameras closer to the stage and place them in ideal positions without having to kill existing seating, as some 3-D productions have had to do. Duke called the Roseland Ballroom “very 3-D friendly.”

“There’s no question that losing house seats is a major issue we have to resolve when doing 3-D projects, because some venues are not willing to give up revenue-generating, premium seats,” he said. “With this show, we could set up the seating the way we needed to support our 3-D shots. That makes a big difference.”

The stage was outfitted with extra truss pieces and lighting to support the 3-D production, and a circular truss installed in the ceiling gave viewers a point of reference when cameras shot through it in some shots.

Everyone involved with the production agrees that 3-D imagery puts viewers “at the concert” in ways 2-D viewers just can’t experience. “It gives you that added sense of realism and depth that fans who can’t make the shows really appreciate,” said Jason Goodman, stereographer for the production and CEO of 21st Century 3D (in New York).

While it began in the mid-’70s, “Soundstage” was reborn in 2001 thanks to a new partnership between WTTW National Productions and HD Ready. Thomas’ original vision was to combine the one-hour musical performances of the original show with state-of-the-art HD video equipment and innovative Dolby 5.1 audio. The majority of the concerts are recorded before intimate studio audiences at WTTW’s Grainger Studio in Chicago, but “Soundstage” occasionally hits the road.

Thomas said 3-D continues that same concept of keeping the series fresh, and the technology is now becoming more accepted by major artists like Sheryl Crow, so he anticipates more such concerts in the near future.


Specification for Naming VFX Image Sequences Released

ETC’s VFX Working Group has published a specification of best practices for naming image sequences such as plates and comps. File naming is an essential tool for organizing the multitude of frames that are inputs to and outputs from the VFX process. Prior to the publication of this specification, each organization had its own naming scheme, requiring custom processes for each partner, which often resulted in confusion and miscommunication.

The new ETC@USC specification focuses primarily on sequences of individual images. The initial use case was VFX plates, typically delivered as OpenEXR or DPX files. However, the team soon realized that the same naming conventions can apply to virtually any image sequence. Consequently, the specification was written to handle a wide array of assets and use cases.
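As a rough illustration of what such a convention enables, the sketch below parses a purely hypothetical frame-naming pattern; the actual field names, order and delimiters are defined in the ETC@USC specification itself, not here. The point is that a single machine-parsable pattern lets every partner validate and sort incoming frames without per-vendor custom handling.

import re

# Hypothetical pattern for illustration: <show>_<shot>_<element>_v<version>.<frame>.<ext>
FRAME_NAME = re.compile(
    r"^(?P<show>[A-Za-z0-9]+)_"
    r"(?P<shot>[A-Za-z0-9]+)_"
    r"(?P<element>[A-Za-z0-9]+)_"
    r"v(?P<version>\d{3})\."
    r"(?P<frame>\d{4,})\."
    r"(?P<ext>exr|dpx)$"
)

def parse_frame_name(filename):
    # Return the named fields of a frame file, or raise if it does not conform.
    match = FRAME_NAME.match(filename)
    if match is None:
        raise ValueError("non-conforming frame name: " + filename)
    return match.groupdict()

print(parse_frame_name("mov_sh0100_plateA_v002.1001.exr"))
# {'show': 'mov', 'shot': 'sh0100', 'element': 'plateA', 'version': '002', 'frame': '1001', 'ext': 'exr'}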

To ensure all requirements are represented, the working group included more than two dozen participants representing studios, VFX houses, tool creators, creatives and others. The ETC@USC also worked closely with MovieLabs to ensure that the specification could be integrated as part of their 2030 Vision.

A key design criterion for this specification is compatibility with existing practices. Horst Sarubin of Universal Pictures, chair of the VFX working group, said: “Our studio is committed to being at the forefront of designing best industry practices to modernize and simplify workflows, and we believe this white paper succeeded in building a new foundation for tools to transfer files in the most efficient manner.”

This specification is compatible with other initiatives such as the Visual Effects Society (VES) Transfer Specifications. “We wanted to make it as seamless as possible for everyone to adopt this specification,” said working group co-chair and ETC@USC’s Erik Weaver. “To ensure all perspectives were represented we created a team of industry experts familiar with the handling of these materials and collaborated with a number of industry groups.”

“Collaboration between MovieLabs and important industry groups like the ETC is critical to implementing the 2030 Vision,” said Craig Seidel, SVP of MovieLabs. “This specification is a key step in defining the foundations for better software-defined workflows. We look forward to continued partnership with the ETC on implementing other critical elements of the 2030 Vision.”

The specification is available online for anyone to use.
