News Stories

Microsoft Kinect Hacked for 3D Video Conferencing Use

[by Robert Archer]

When Microsoft released the Kinect hands-free accessory, the gaming community applauded the company’s latest gaming product.

Techies outside the gaming community are also embracing the device for reasons beyond gaming. The website gmanews.tv is reporting that Microsoft’s Kinect was recently hacked by a University of North Carolina (UNC) at Chapel Hill professor and student to run as a 3D video conferencing tool.

UNC graduate student Andrew Maimone says the 3D video conferencing system uses the Kinect’s depth cameras along with a set of algorithms and filters to create a 3D video conferencing solution. “Our system is affordable and reproducible; [it] offers the opportunity to easily deliver 3D telepresence beyond the researcher’s lab,” reports gmanews.tv.

Maimone’s comments were pulled from an abstract he wrote that describes the methodology that was used to develop the 3D video conferencing system.

The system’s algorithm is said to merge data from multiple depth cameras, and it works with color calibration techniques as well as technologies that preserve stereo images at low data rendering rates.

In addition to these technologies, Maimone also presents a Kinect-based markerless tracking system that combines 2D eye-recognition with depth content to enable head-tracked stereoscopic views to be produced for a parallax barrier autostereoscopic display.

The Philippines-oriented website adds that UNC’s hack of the Kinect is not the first time the Kinect has been compromised. According to gmanews.tv, a group of Massachusetts Institute of Technology (MIT) students hacked the gaming accessory to “enhance” distance-based Internet communications.

See the original post here: http://www.explore3dtv.com/blog/entry/20708/Microsoft-Kinect-Hacked-for-3D-Video-Conferencing-Use/

After Final Cut Pro debacle, does Apple still care about creative pros?

[By ]

[Excerpt]

Unless you’ve been under a low-tech, Internet-less rock for the past two weeks, you’ve probably heard about Final Cut Pro X, the latest version of Apple’s professional video app, if only in passing. Let me bring you up to speed if you don’t already know how the drama unfolded: FCP X was met with mixed reactions, most of them negative, thanks to a healthy set of missing features that video professionals rely on. These span the gamut from not being able to open legacy projects (very bad) to missing tape support (not as bad). But the different interface and workflows in the new version made it clear that this wasn’t just a high-end product missing a few features; it was a completely new direction for Final Cut Pro, and it was aimed at the increasing prosumer market. It was “iMovie Pro,” whether that sounds derogatory or not.

Read the full, lengthy story here: http://arstechnica.com/apple/guides/2011/07/does-apple-still-care-about-creative-pros.ars?utm_source=rss&utm_medium=rss&utm_campaign=rss


Specification for Naming VFX Image Sequences Released

ETC’s VFX Working Group has published a specification of best practices for naming image sequences such as plates and comps. File naming is an essential tool for organizing the multitude of frames that are inputs and outputs of the VFX process. Prior to the publication of this specification, each organization had its own naming scheme, requiring custom processes for each partner, which often resulted in confusion and miscommunication.

The new ETC@USC specification focuses primarily on sequences of individual images. The initial use case was VFX plates, typically delivered as OpenEXR or DPX files. However, the team soon realized that the same naming conventions can apply to virtually any image sequence. Consequently, the specification was written to handle a wide array of assets and use cases.
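To make the idea of sequence naming concrete, here is a minimal sketch of how a frame-numbered filename might be parsed. The field layout shown (`name.vNNN.FFFF.ext`) is a common industry convention used purely for illustration; it is an assumption, not the ETC@USC specification itself.

```python
import re

# Illustrative pattern for a frame-numbered image sequence name such as
# "sh010_plate.v002.1001.exr". This layout is a hypothetical convention,
# NOT the published ETC@USC specification.
SEQ_NAME = re.compile(
    r"^(?P<name>[A-Za-z0-9_]+)"    # shot/asset name, e.g. "sh010_plate"
    r"\.v(?P<version>\d+)"         # version tag, e.g. "v002"
    r"\.(?P<frame>\d+)"            # zero-padded frame number, e.g. "1001"
    r"\.(?P<ext>exr|dpx)$"         # image format, e.g. OpenEXR or DPX
)

def parse_sequence_name(filename: str):
    """Return the parsed fields of a sequence filename as a dict,
    or None if it does not follow the illustrative convention above."""
    m = SEQ_NAME.match(filename)
    return m.groupdict() if m else None
```

A consistent, machine-parsable pattern like this is what lets tools at different organizations exchange plates without per-partner custom processes, which is the problem the specification set out to solve.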

To ensure all requirements were represented, the working group included more than two dozen participants representing studios, VFX houses, tool creators, creatives, and others. The ETC@USC also worked closely with MovieLabs to ensure that the specification could be integrated as part of their 2030 Vision.

A key design criterion for this specification is compatibility with existing practices. Chair of the VFX working group, Horst Sarubin of Universal Pictures, said: “Our studio is committed to being at the forefront of designing best industry practices to modernize and simplify workflows, and we believe this white paper succeeded in building a new foundation for tools to transfer files in the most efficient manner.”

This specification is compatible with other initiatives such as the Visual Effects Society (VES) Transfer Specifications. “We wanted to make it as seamless as possible for everyone to adopt this specification,” said working group co-chair and ETC@USC’s Erik Weaver. “To ensure all perspectives were represented we created a team of industry experts familiar with the handling of these materials and collaborated with a number of industry groups.”

“Collaboration between MovieLabs and important industry groups like the ETC is critical to implementing the 2030 Vision,” said Craig Seidel, SVP of MovieLabs. “This specification is a key step in defining the foundations for better software-defined workflows. We look forward to continued partnership with the ETC on implementing other critical elements of the 2030 Vision.”

The specification is available online for anyone to use.
