News Stories

Silicon Imaging demos SMART system at NAB

[Press Release]

Silicon Imaging, the recipient of this year's International 3D Society Technology Award, announces the patent-pending SI-3D Stereo Monitoring, Analysis, Recording and Transmission (SMART) system. SMART captures images from RAW or HD-SDI sources, such as RED, ARRI, Sony, Panasonic, Canon, P+S and Silicon Imaging SI-2K cameras, to perform stereo analysis, image correction, 3D LUT color processing and recording, all in a single portable platform under wireless control via Apple iPad. Camera alignment and scene depth information, along with rig and lens controls, are displayed graphically with the stereo video. SMART determines residual errors caused by lens mismatch and keystone from convergence and generates real-time warped, corrected video outputs. For live events, it can directly output side-by-side corrected material to be externally encoded for Internet delivery. Recording can be either uncompressed or visually lossless compressed into a single file encapsulating both left and right eyes along with audio, rig motion, stereo correction and timecode metadata. The recorded content can be immediately edited in Final Cut, Avid, Adobe Premiere Pro or other applications supporting Apple QuickTime files on a PC or Mac, or linked with 4K source material recorded in-camera.
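To give a rough sense of what geometry correction of this kind involves, the sketch below estimates a homography between the two eyes from feature matches and warps the right image onto the left. It is an illustrative, generic OpenCV approach, not Silicon Imaging's patented SMART algorithm.

```python
# Illustrative only: generic homography-based geometry correction,
# not Silicon Imaging's patented SMART processing.
import cv2
import numpy as np

def warp_right_to_left(left_bgr: np.ndarray, right_bgr: np.ndarray) -> np.ndarray:
    """Estimate residual keystone/misalignment between eyes and warp the right eye."""
    left_gray = cv2.cvtColor(left_bgr, cv2.COLOR_BGR2GRAY)
    right_gray = cv2.cvtColor(right_bgr, cv2.COLOR_BGR2GRAY)

    # Match ORB features between the two eyes
    orb = cv2.ORB_create(2000)
    kp_l, des_l = orb.detectAndCompute(left_gray, None)
    kp_r, des_r = orb.detectAndCompute(right_gray, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_r, des_l), key=lambda m: m.distance)[:300]

    # Fit a homography (captures keystone, roll and zoom mismatch) with RANSAC
    src = np.float32([kp_r[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp_l[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)

    h, w = left_bgr.shape[:2]
    return cv2.warpPerspective(right_bgr, H, (w, h))
```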

The system captures synchronized HD-SDI streams and combines them into a variety of viewing formats such as Anaglyph, Delta, Split, Interlace and Side-by-Side. Each view is independently selectable on dual monitor outputs for use by the director, operator and stereographer. In addition, the source Left and Right inputs are looped out directly for additional distribution. A built-in 3D LUT processor enables camera Log C or Rec. 709 footage to be colorized with a desired look for on-set visualization. Sophisticated calibrated looks can be created using Adobe SpeedGrade CS6 and fed back into post for a seamless color workflow.
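For readers unfamiliar with these monitoring formats, the following minimal NumPy sketch shows how two of them, red-cyan anaglyph and side-by-side, can be composed from synchronized left/right frames. It is a hypothetical illustration, not SMART's implementation.

```python
# Illustrative composition of two common stereo viewing formats (not SMART code).
import numpy as np

def anaglyph(left_rgb: np.ndarray, right_rgb: np.ndarray) -> np.ndarray:
    """Red channel from the left eye, green/blue (cyan) from the right eye."""
    out = right_rgb.copy()
    out[..., 0] = left_rgb[..., 0]
    return out

def side_by_side(left_rgb: np.ndarray, right_rgb: np.ndarray) -> np.ndarray:
    """Squeeze each eye to half width and place them next to each other."""
    half = lambda img: img[:, ::2]  # naive 2:1 horizontal subsample
    return np.concatenate([half(left_rgb), half(right_rgb)], axis=1)
```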

The captured video is then processed by the stereo analyzer, featuring technology developed in collaboration with Fraunhofer HHI. SMART determines camera alignment (e.g. tilt, roll, height and keystone), lens focal length (zoom position) differences and relative horizontal and vertical image position. Once a rig is initially aligned using the system's sub-pixel precision mode, cameras can be moved with inter-axial and convergence controls while the system displays a color-coded depth gauge with alarms for depth-budget and geometry violations. For a more detailed view of scene depth, color-coded vectors can be overlaid on the image to indicate areas that will appear in front of or behind the screen plane. …
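A depth-budget alarm of this kind can be approximated with a standard block matcher: compute per-pixel disparity and flag pixels whose parallax falls outside an allowed range. The sketch below is a generic stand-in with an assumed pixel budget, not the Fraunhofer HHI analyzer.

```python
# Rough stand-in for a depth-budget check (not the Fraunhofer HHI analyzer).
import cv2
import numpy as np

def depth_budget_violations(left_gray: np.ndarray, right_gray: np.ndarray,
                            budget_px: tuple = (-30.0, 60.0)) -> float:
    """Return the fraction of pixels whose parallax lies outside budget_px.

    budget_px is an assumed (negative, positive) parallax limit in pixels;
    sign conventions and limits depend on the rig, lenses and screen size."""
    sgbm = cv2.StereoSGBM_create(minDisparity=-64, numDisparities=128, blockSize=7)
    disparity = sgbm.compute(left_gray, right_gray).astype(np.float32) / 16.0
    near_limit, far_limit = budget_px
    violations = (disparity < near_limit) | (disparity > far_limit)
    return float(violations.mean())
```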

Silicon Imaging will be demonstrating the SI-3D SMART system at NAB 2012 in collaboration with Fraunhofer HHI in booth C8444.


Second screen content, coming soon to your movie theaters

[The Next Web]

The cinema advertising company Screenvision is launching a second-screen experience for movie theaters. Called The Limelight, it will let users browse additional content before their movie starts.

To access this pre-movie content, they will have to download a specific app, Screenfanz, which is already available for iOS, with an Android version coming within a few weeks.

The app, which belongs to Screenvision, includes a large range of entertainment options, from practical information on movie sessions to gaming and social features. For instance, movie-goers will be able to check in, share content through Facebook, earn points and compete for film tickets.

On paper, the concept sounds quite promising. While we wouldn’t necessarily want to get distracted during the movie itself, watching trailers and commercials doesn’t require more attention than TV viewing. This means that moviegoers may want to play with a second-screen app on their phone – as long as the content is interesting enough.  …

According to the company, brands that participated in the Limelight pilot research “delivered 54% unaided ad recall with nearly half intending to purchase these brands in the next year.” …

Read the full story here: http://thenextweb.com/media/2012/04/06/second-screen-content-coming-soon-to-your-movie-theaters/


Specification for Naming VFX Image Sequences Released

ETC’s VFX Working Group has published a specification of best practices for naming image sequences such as plates and comps. File naming is an essential tool for organizing the multitude of frames that are inputs to and outputs from the VFX process. Prior to the publication of this specification, each organization had its own naming scheme, requiring custom processes for each partner, which often resulted in confusion and miscommunication.

The new ETC@USC specification focuses primarily on sequences of individual images. The initial use case was VFX plates, typically delivered as OpenEXR or DPX files. However, the team soon realized that the same naming conventions can apply to virtually any image sequence. Consequently, the specification was written to handle a wide array of assets and use cases.
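As a concrete illustration of what a consistent sequence-naming convention enables, the sketch below parses files matching a hypothetical "<name>.<frame>.<ext>" pattern and reports missing frames. The pattern is an assumption for illustration only; consult the published ETC@USC specification for the actual rules.

```python
# Hypothetical "<name>.<frame>.<ext>" pattern for illustration only;
# the actual naming rules are defined in the ETC@USC specification.
import re
from pathlib import Path

FRAME_RE = re.compile(r"^(?P<name>.+)\.(?P<frame>\d{4,})\.(?P<ext>exr|dpx)$", re.IGNORECASE)

def collect_sequences(folder: str) -> dict:
    """Group image files into sequences keyed by base name and extension."""
    sequences = {}
    for path in sorted(Path(folder).iterdir()):
        match = FRAME_RE.match(path.name)
        if match:
            key = f"{match['name']}.{match['ext']}"
            sequences.setdefault(key, []).append(int(match['frame']))
    return sequences

def missing_frames(frames: list) -> list:
    """Report gaps in an otherwise contiguous frame range."""
    expected = set(range(min(frames), max(frames) + 1))
    return sorted(expected - set(frames))
```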

To ensure all requirements are represented, the working group included more than two dozen participants representing studios, VFX houses, tool creators, creatives and others. The ETC@USC also worked closely with MovieLabs to ensure that the specification could be integrated as part of its 2030 Vision.

A key design criterion for this specification is compatibility with existing practices. Chair of the VFX working group, Horst Sarubin of Universal Pictures, said: “Our studio is committed to being at the forefront of designing best industry practices to modernize and simplify workflows, and we believe this white paper succeeded in building a new foundation for tools to transfer files in the most efficient manner.”

This specification is compatible with other initiatives such as the Visual Effects Society (VES) Transfer Specifications. “We wanted to make it as seamless as possible for everyone to adopt this specification,” said working group co-chair and ETC@USC’s Erik Weaver. “To ensure all perspectives were represented we created a team of industry experts familiar with the handling of these materials and collaborated with a number of industry groups.”

“Collaboration between MovieLabs and important industry groups like the ETC is critical to implementing the 2030 Vision,” said Craig Seidel, SVP of MovieLabs. “This specification is a key step in defining the foundations for better software-defined workflows. We look forward to continued partnership with the ETC on implementing other critical elements of the 2030 Vision.”

The specification is available online for anyone to use.
