News Stories

3D Stereo Workflow with the GoPro Hero2

[ProvideoCoalition]

Whether you love it or hate it, 3D stereography is here for a while. And if you’re totally into it like I am, you quickly realize that there are few turnkey workflows out there for capturing and processing 3D stereo video that make it easy to set up, shoot and edit stereo pairs. Sure, there are a lot of high-end (expensive-to-rent) two-camera systems to shoot with, software to sync/mux the footage and rigs you can build to edit it, but GoPro has brought it all together in a fun and easy-to-use system that anyone can use.

Read the full, lengthy article and watch the videos here: http://provideocoalition.com/index.php/lightscameraaction/story/stereo_3d_with_the_gopro_hero2/

AR goggles make crime scene investigation a desk job

[NewScientist]

Crime scene investigators could one day help solve murders without leaving the office. A pair of augmented reality glasses could allow local police to virtually tag objects in a crime scene, and build a clean record of the scene in 3D video before evidence is removed for processing.

The system, being developed by Oytun Akman and colleagues at the Delft University of Technology in the Netherlands, consists of a head-mounted display receiving 3D video from a pair of attached cameras controlled by a laptop carried in a backpack. This arrangement lets the wearer see their surroundings as normal while also allowing them to overlay virtual objects, which are placed using hand gestures.

A menu appears to float over the left hand when the wearer holds it in front of them. Moving the left hand back and forth selects from a variety of tools, while the right hand serves as a pointer to tag objects in the scene, such as blood spatter or bullet holes. The system stores the markers as part of a 3D model of the scene, which investigators can refer back to as the case progresses. It may also be admissible in court as evidence.

If the person wearing the glasses requires assistance, they can contact someone back in the lab who can watch their video stream, speak to the wearer through a headset and place markers in the scene using a mouse and keyboard. This would also allow a police officer to take the first look around a crime scene.

Read the full article here: http://www.newscientist.com/article/mg21328495.700-ar-goggles-make-crime-scene-investigation-a-desk-job.html?DCMP=OTC-rss&nsref=online-news


Specification for Naming VFX Image Sequences Released

ETC’s VFX Working Group has published a specification of best practices for naming image sequences such as plates and comps. File naming is an essential tool for organizing the multitude of frames that are inputs and outputs of the VFX process. Prior to the publication of this specification, each organization had its own naming scheme, requiring custom processes for each partner, which often resulted in confusion and miscommunication.

The new ETC@USC specification focuses primarily on sequences of individual images. The initial use case was VFX plates, typically delivered as OpenEXR or DPX files. However, the team soon realized that the same naming conventions can apply to virtually any image sequence. Consequently, the specification was written to handle a wide array of assets and use cases.
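To make the idea concrete, the sketch below is a hypothetical illustration only: it assumes a common `name.frame.extension` convention with zero-padded frame numbers, and the pattern, file names and helper function are assumptions for the example, not the ETC@USC naming rules, which are defined in the specification itself.

```python
import re
from collections import defaultdict

# Hypothetical pattern only -- NOT the ETC@USC specification. It assumes a
# common "name.frame.extension" style with zero-padded frame numbers,
# e.g. "show_sh010_plate_v002.1001.exr".
FRAME_PATTERN = re.compile(r"^(?P<name>.+)\.(?P<frame>\d{4,})\.(?P<ext>exr|dpx)$")

def group_sequences(filenames):
    """Group per-frame files into (name, ext) -> sorted list of frame numbers."""
    sequences = defaultdict(list)
    for filename in filenames:
        match = FRAME_PATTERN.match(filename)
        if match:
            key = (match["name"], match["ext"])
            sequences[key].append(int(match["frame"]))
    return {key: sorted(frames) for key, frames in sequences.items()}

if __name__ == "__main__":
    files = [
        "show_sh010_plate_v002.1001.exr",
        "show_sh010_plate_v002.1002.exr",
        "show_sh010_comp_v005.1001.dpx",
    ]
    for (name, ext), frames in group_sequences(files).items():
        print(f"{name} [{frames[0]}-{frames[-1]}] .{ext}")
```

When every partner follows the same convention, a grouping step like this works on any delivery without per-vendor custom code, which is the kind of friction the specification is meant to remove.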

To ensure all requirements were represented, the working group included over two dozen participants representing studios, VFX houses, tool creators, creatives and others. The ETC@USC also worked closely with MovieLabs to ensure that the specification could be integrated as part of their 2030 Vision.

A key design criterion for this specification is compatibility with existing practices. Chair of the VFX working group, Horst Sarubin of Universal Pictures, said: “Our studio is committed to being at the forefront of designing best industry practices to modernize and simplify workflows, and we believe this white paper succeeded in building a new foundation for tools to transfer files in the most efficient manner.”

This specification is compatible with other initiatives such as the Visual Effects Society (VES) Transfer Specifications. “We wanted to make it as seamless as possible for everyone to adopt this specification,” said working group co-chair and ETC@USC’s Erik Weaver. “To ensure all perspectives were represented we created a team of industry experts familiar with the handling of these materials and collaborated with a number of industry groups.”

“Collaboration between MovieLabs and important industry groups like the ETC is critical to implementing the 2030 Vision,” said Craig Seidel, SVP of MovieLabs. “This specification is a key step in defining the foundations for better software-defined workflows. We look forward to continued partnership with the ETC on implementing other critical elements of the 2030 Vision.”

The specification is available online for anyone to use.
