Augmented Reality Telepresence via Kinect



Posted on 12/14/2012




When Google Glass was announced, people were blown away by the tech giant’s plans for augmented reality technology. Not to be left behind, today’s featured hack comes from developers building their own version, one focused on the telepresence side of the technology.

As with Google Glass, users teleconference through optical see-through head-mounted displays, though one thing users may take issue with is that these displays are currently bulky. What sets this hack apart from Google Glass is that remote users appear fully 3D, allowing you to look and walk around them, and are properly merged into the local environment — they can occlude and be occluded by local objects. Here are some specifics shared by our submitter:

“For example, a remote user could appear to be seated at the other end of a local table — the local user would see the remote user from the correct tracked perspective, but wouldn’t see the remote user’s legs if they appear behind the table. Other scenarios are supported as well, such as the remote user’s environment appearing as an extension of the local environment, or the local user being completely immersed in the remote scene. The 3D appearances and proper local and remote scene merger are accomplished with Kinect-based 3D scanning on both sides and projector-based lighting control, which causes real objects to be illuminated only if they are not occluded by remote virtual objects.”
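The occlusion rule the submitter describes — light a real surface only when no virtual object sits in front of it along the same ray — boils down to a per-pixel depth comparison. Here is a minimal sketch of that idea in Python with NumPy; the function name, depth conventions, and toy values are our own assumptions, not part of the project's actual code:

```python
import numpy as np

def projector_mask(local_depth, virtual_depth):
    """Per-pixel projector illumination mask (hypothetical helper).

    A real surface pixel stays lit only when the remote virtual scene
    is not closer to the viewer than the real surface at that pixel.
    Depths are in meters; np.inf in virtual_depth means "no virtual
    object along this ray".
    """
    # Illuminate where the virtual scene is at or beyond the real
    # surface, i.e. not occluding it.
    return virtual_depth >= local_depth

# Toy example: a 2x2 patch where a virtual object at 1.0 m sits in
# front of a real table surface at 1.5 m in the top-left pixel only.
local = np.array([[1.5, 1.5],
                  [1.5, 1.5]])
virtual = np.array([[1.0,    np.inf],
                    [np.inf, np.inf]])
mask = projector_mask(local, virtual)
# mask[0, 0] is False (occluded pixel kept dark); the rest are True.
```

In the real system the "virtual depth" would come from rendering the remote user's Kinect-scanned mesh from the projector's viewpoint, so the projector darkens exactly the real-world pixels the virtual person should hide.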

This is one development we’re all really excited about, and one we hope gets more attention from the Kinect hacking community. Once perfected, it could change the way we socialize and make distance even less of a barrier.
