There is something about interactive lighting that captivates the Kinect community. Today, we feature one of the pioneering projects to use the Kinect for depth detection and position tracking, adapting lighting to that data. Meet the Tangible Table, a project by developer Przemek Jaworski. In this video release, Przemek shows how his Tangible Table program detects objects placed on a flat surface. That surface becomes the Tangible Table, shading the objects with different hues according to their distance from the Kinect: items that sit deeper are lit orange, while items that are more elevated are lit blue. The program was built with openFrameworks (C++) and the CLNUI library.
Here is a description by the developer:
“The program is using point cloud generated by Kinect, and is mapping it onto the flat surface. Top part of the cloud (anything that is higher than the table surface) is marked in colour. In fact, all objects positioned on the table are simply detected automatically, and are thus trackable. What can be done with this?”
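To make the idea concrete, here is a minimal sketch of how height-based colouring like this could be done. This is not Przemek's code: the function name, the `tableDepth` calibration value, and the colour ramp are all illustrative assumptions, and the depth frame is assumed to already have been read from the Kinect (via CLNUI or any other driver).

```cpp
// Illustrative sketch only (not the Tangible Table source).
// Assumes a Kinect depth frame, in millimetres, is already in `depth`,
// and that `tableDepth` (distance from the Kinect to the empty table)
// was calibrated beforehand.
#include <algorithm>
#include <cstddef>
#include <cstdint>
#include <vector>

struct RGB { uint8_t r, g, b; };

// Map each depth sample to a colour: samples above the table plane
// (closer to the Kinect) fade toward blue with height, while samples
// at or below the plane are tinted orange.
std::vector<RGB> colourByHeight(const std::vector<uint16_t>& depth,
                                uint16_t tableDepth,
                                uint16_t band)   // height range (mm) mapped to full blue
{
    std::vector<RGB> out(depth.size());
    for (std::size_t i = 0; i < depth.size(); ++i) {
        // Positive height means the point is raised above the table surface.
        int height = static_cast<int>(tableDepth) - static_cast<int>(depth[i]);
        if (height > 0) {
            float t = std::min(1.0f, height / static_cast<float>(band));
            out[i] = { static_cast<uint8_t>(255 * (1.0f - t)),   // red fades out
                       static_cast<uint8_t>(128 * (1.0f - t)),   // green fades out
                       static_cast<uint8_t>(255 * t) };          // blue fades in
        } else {
            out[i] = { 255, 128, 0 };                            // deeper points: orange
        }
    }
    return out;
}
```

Anything that rises above the calibrated table plane is, in effect, automatically segmented from the surface, which is what makes the objects trackable without any markers.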
We have already seen what has been built on breakthroughs like this. The question is: is there a limit to what this proof of concept can do?
For more information about the Tangible Table, visit the Tangible Table's Vimeo page.