Today we feature yet another amazing hack that makes use of an existing game engine and integrates cost-efficient Kinect motion-capture technology. This time, the Kinect is connected to Valve’s Source engine, giving users motion control over a game’s character models. Engineer John Boiles from Yelp shares a video demonstrating the Kinect’s depth camera tracking the physical movements of the user. This data is then translated and sent to the game engine, which replicates the movements through the in-game character models. In the video, you can see the character model move exactly the way the human controller moves and interact with in-game objects like crates and chairs. This development gives the Kinect community a tool to experiment with the features the Valve engine and Kinect can produce together. Thanks to John Boiles and Yelp, the code is available for everyone to use.
Here is a description of the Valve Game Engine and Kinect Project:
“My code consists of two major parts: a backend that interprets Kinect data and a Lua script that controls the game. The backend (based on one of the OpenNI example projects) reads data from the Kinect over USB, uses OpenNI to track the skeleton of the user, then sends UDP packets containing (x, y, z) coordinates corresponding to the joints in the user’s skeleton. The Lua script in Garry’s mod parses those UDP packets, then maps those coordinates to spheres that move themselves to positions corresponding to the coordinates (these are like Garry’s Mod hoverballs, but in 3d). By attaching these position-tracking balls to different entities or objects, the user can move objects in the game by moving around!”
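The core of the setup is that UDP bridge: the backend packs each skeleton joint’s (x, y, z) position into a small datagram, and the game side unpacks it. The sketch below illustrates that idea in Python with a hypothetical wire format (a joint id plus three floats); the actual packet layout used by Boiles’ backend and his Lua parser may differ.

```python
import socket
import struct

# Hypothetical wire format: one datagram per joint, containing a
# joint id (int32) followed by x, y, z positions (float32 each),
# little-endian. This is an illustrative assumption, not the
# project's actual packet layout.
JOINT_FORMAT = "<i3f"

def encode_joint(joint_id, x, y, z):
    """Pack one joint's position into a UDP payload."""
    return struct.pack(JOINT_FORMAT, joint_id, x, y, z)

def decode_joint(payload):
    """Unpack a UDP payload back into (joint_id, x, y, z)."""
    return struct.unpack(JOINT_FORMAT, payload)

if __name__ == "__main__":
    # Loopback round trip: the "backend" sends one joint position
    # and the "game side" receives and decodes it.
    receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    receiver.bind(("127.0.0.1", 0))
    addr = receiver.getsockname()

    sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sender.sendto(encode_joint(1, 0.5, 1.25, -0.75), addr)

    payload, _ = receiver.recvfrom(1024)
    joint_id, x, y, z = decode_joint(payload)
    print(joint_id, x, y, z)  # the coordinates the game script would map to a tracking sphere
```

On the Garry’s Mod side, the Lua script would perform the equivalent of `decode_joint` on each incoming packet and set the position of the corresponding tracking sphere to the decoded coordinates each frame.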