FAAST on your PC

Install FAAST on your PC for full-body control and VR applications:

Disclaimer:

This comprehensive guide to FAAST installation was taken from the Institute for Creative Technologies website. The FAAST toolkit was also taken from the same site. By sharing their guide with the community, Kinecthacks.com wishes to promote the value of innovation and creativity with Microsoft’s Kinect. All credit for this guide and the FAAST downloads belongs to the Institute for Creative Technologies.

Summary

FAAST is middleware to facilitate integration of full-body control with games and VR applications. The toolkit relies upon software from OpenNI and PrimeSense to track the user’s motion using the PrimeSensor or the Microsoft Kinect sensor. FAAST includes a custom VRPN server to stream the user’s skeleton over a network, allowing VR applications to read the skeletal joints as trackers using any VRPN client. The toolkit can also emulate keyboard input triggered by body posture and specific gestures, allowing the user to add custom body-based control mechanisms to existing off-the-shelf games that do not provide official support for depth sensors.

FAAST is free to use and distribute for research and noncommercial purposes (for commercial uses, please contact us). If you use FAAST to support your research project, we request that any publications resulting from the use of this software include a reference to the toolkit (a tech report will be posted here within the next week or so). Additionally, please send us an email about your project so we can compile a list of projects that use FAAST. This will help us pursue funding to maintain the software and add new functionality.

The preliminary version of FAAST is currently available for Windows only. We are currently preparing to release the code as an open-source project. Additionally, we plan to develop a Linux port in the near future.

Installation

To use FAAST, you will need to download and install the following software:

  1. OpenNI Unstable Build for Windows v1.0.0.25
  2. PrimeSense NITE Unstable Build for Windows v1.3.0.18
    During NITE installation, enter the free license key from OpenNI: 0KOIk2JeIBYClPWVnMoRKn5cdY4=
  3. Hardware drivers for your sensor (install only the driver that matches your device)

FAAST should then run out-of-the-box; no additional installation or setup is necessary. If you encounter an error on startup, you may also need to install the Microsoft Visual C++ 2008 SP1 Redistributable Package.

Skeleton Usage

FAAST streams the user’s skeleton over a VRPN server identified as “Tracker0@ip_address” (“Tracker0@localhost” if running on the same machine as the client). The server automatically starts when the toolkit connects to a sensor. A total of 24 skeleton joint transformations (including position and rotation) are streamed as sensors. Following the OpenNI framework, the joints are ordered as follows (a minimal client sketch follows the table):

Sensor Joint Sensor Joint
0 Head 12 Right Elbow
1 Neck 13 Right Wrist
2 Torso 14 Right Hand
3 Waist 15 Right Fingertip
4 Left Collar 16 Left Hip
5 Left Shoulder 17 Left Knee
6 Left Elbow 18 Left Ankle
7 Left Wrist 19 Left Foot
8 Left Hand 20 Right Hip
9 Left Fingertip 21 Right Knee
10 Right Collar 22 Right Ankle
11 Right Shoulder 23 Right Foot
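
For reference, reading these joints from a VRPN-enabled application looks roughly like the C++ sketch below. This is not part of FAAST itself, just a minimal standalone client built against the standard VRPN client library; the sensor index reported in each callback corresponds to the joint numbers in the table above.

#include <stdio.h>
#include <vrpn_Tracker.h>

// Called once per joint update; t.sensor is the joint index from the table above.
void VRPN_CALLBACK handle_joint(void*, const vrpn_TRACKERCB t)
{
    printf("joint %2ld  pos (%.2f, %.2f, %.2f)  quat (%.2f, %.2f, %.2f, %.2f)\n",
           (long)t.sensor, t.pos[0], t.pos[1], t.pos[2],
           t.quat[0], t.quat[1], t.quat[2], t.quat[3]);
}

int main()
{
    // Connect to the FAAST skeleton server (use "Tracker0@<ip_address>" for a remote machine).
    vrpn_Tracker_Remote tracker("Tracker0@localhost");
    tracker.register_change_handler(NULL, handle_joint);

    while (true)
        tracker.mainloop();   // process incoming joint reports

    return 0;
}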

Note: The joint positions and orientations reported by OpenNI are relative to the world coordinate system, not relative to the parent joint’s coordinate system. More information can be found in the OpenNI/NITE documentation. Positions are reported in meters relative to the sensor’s position, in accordance with the VRPN standard units. To stream over VRPN, we convert the orientations reported by OpenNI from rotation matrices to quaternions. As we have not yet rigorously tested this conversion for coordinate system correctness, we appreciate any reported bugs or feedback from the community.
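
For those curious about that conversion, the usual rotation-matrix-to-quaternion formula is sketched below. This is only an illustration of the math involved, not FAAST’s actual source; it assumes a proper rotation matrix with a positive trace (robust implementations branch on the largest diagonal element instead).

#include <cmath>

// Convert a 3x3 rotation matrix (row-major, m[row][col]) to a quaternion (w, x, y, z).
// Simplified sketch: valid when the trace of the matrix is positive.
void matrixToQuaternion(const double m[3][3], double q[4])
{
    double w = std::sqrt(1.0 + m[0][0] + m[1][1] + m[2][2]) / 2.0;
    q[0] = w;
    q[1] = (m[2][1] - m[1][2]) / (4.0 * w);   // x
    q[2] = (m[0][2] - m[2][0]) / (4.0 * w);   // y
    q[3] = (m[1][0] - m[0][1]) / (4.0 * w);   // z
}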

Currently, FAAST streams the entire skeleton of the first calibrated user visible to the sensor. To calibrate, the user must hold a ‘Psi’ pose for several seconds until a stick figure appears, as shown in the image below. We are in the process of developing an interface to select multiple users and specific joints to stream over the server.


Skeleton Calibration Pose

Input Emulator Usage

While we eventually intend to provide a GUI for configuring the input emulator, the default event bindings are currently set in an external file (Bindings.txt). The toolkit provides a bindings text editor that allows the controls for different applications to be saved and loaded. The file contains a list of bindings entries, one entry per line, with a ‘#’ at the beginning of the line indicating a comment.  When the user clicks “Start Input Emulation,” FAAST will begin sending virtual input events to the actively selected window. The window to receive the events must have the focus (i.e. you must click on the receiving window after starting the emulator).

For keyboard and mouse click events, the syntax for each entry is as follows:

action_name      action_threshold      virtual_event_type      virtual_event_name

action_name – The name of the pose or gesture action that the user performs.

action_threshold – The minimum threshold for activation (this varies by event).

virtual_event_type – The type of virtual event to be generated.

virtual_event_name – The specific event to be generated.

Skeleton Actions require the user to perform skeleton calibration, and will work automatically once the skeleton is acquired. NITE Actions are supported by the PrimeSense gesture recognizer, and require no skeleton or calibration; instead, the user performs a focus gesture to begin recognition of these actions. The focus gesture is currently hard-coded as a ‘wave’ action. More information about NITE gestures and focus can be found in the NITE documentation from PrimeSense.
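
For instance, a small Bindings.txt fragment combining a skeleton action and a NITE action might look like the following (the keys and thresholds here are only illustrative; the full list of actions and events appears in the tables below):

# skeleton action: hold 'w' while the body leans forwards more than 15 degrees
lean_forwards 15 key w
# NITE action: press 'f' when a push gesture faster than 20 inches/sec. is detected
push 20 key f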

Action List

Action Name Type Threshold
lean_left skeleton angular body lean left (degrees)
lean_right skeleton angular body lean right (degrees)
lean_forwards skeleton angular body lean forwards (degrees)
lean_backwards skeleton angular body lean back (degrees)
left_arm_forwards skeleton forward distance from left hand to shoulder (inches)
left_arm_down skeleton downward distance from left hand to shoulder (inches)
left_arm_up skeleton upward distance from left hand to shoulder (inches)
left_arm_out skeleton sideways distance from left hand to shoulder (inches)
left_arm_across skeleton sideways distance from left hand across body to shoulder (inches)
right_arm_forwards skeleton forward distance from right hand to shoulder (inches)
right_arm_down skeleton downward distance from right hand to shoulder (inches)
right_arm_up skeleton upward distance from right hand to shoulder (inches)
right_arm_out skeleton sideways distance from right hand to shoulder (inches)
right_arm_across skeleton sideways distance from right hand across body to shoulder (inches)
left_foot_forwards skeleton forward distance from left hip to foot (inches)
left_foot_sideways skeleton sideways distance from left hip to foot (inches)
left_foot_backwards skeleton backwards distance from left hip to foot (inches)
left_foot_up skeleton height of left foot above other foot on ground (inches)
right_foot_forwards skeleton forward distance from right hip to foot (inches)
right_foot_sideways skeleton sideways distance from right hip to foot (inches)
right_foot_backwards skeleton backwards distance from right hip to foot (inches)
right_foot_up skeleton height of right foot above other foot on ground (inches)
jump skeleton height of both feet above ground (inches)
walk skeleton height of each step above ground when walking in place (inches)
push NITE velocity (inches/sec.)
swipe_up NITE velocity (inches/sec.)
swipe_down NITE velocity (inches/sec.)
swipe_left NITE velocity (inches/sec.)
swipe_right NITE velocity (inches/sec.)
circle NITE radius (inches)
wave NITE n/a (leave at 0)

Virtual Event List

Virtual Event Type Virtual Event Name
key key to press (either a single character or a special key from the table below)
mouse_click mouse button to click (left_button, right_button, or middle_button)

Special Keys

For example, the following command will press the “w” key when the right hand extends more than 18 inches in front of the shoulder, and then release it when it returns behind that distance:

right_arm_forwards 18 key w

Similarly, the following command will press the left mouse button when the left foot kicks forwards more than 12 inches in front of the hip, then release it when it returns behind that distance:

left_foot_forwards 12 mouse_click left_button

Two mouse motion behaviors are currently supported.  Absolute motion maps the space around the body directly into screen coordinates, which is good for controlling Windows applications.  Relative motion moves the mouse continuously based on the position of the tracked body part, which is good for aiming and camera control in 3D games.  For mouse movement events, the syntax for each entry is as follows:

body_movement      movement_range      mouse_movement_type      mouse_parameters

body_movement – The body movement that should control the mouse.

movement_range – The range of movement, which is the real distance that should be mapped to mouse motion.

mouse_movement_type – Either absolute or relative mouse motion.

mouse_parameters – Parameters that determine mouse behavior.

Body Movements

Body Movement Movement Range
left_hand_move distance from left shoulder to left hand (inches)
right_hand_move distance from right shoulder to right hand (inches)

Mouse Movements

Mouse Movement Type Number of Parameters Parameters
mouse_move_absolute 1 Parameter 1: minimum threshold for mouse motion (percentage of the screen in decimal format)
mouse_move_relative 2 Parameter 1: speed at the movement range distance (number of pixels); Parameter 2: minimum threshold for mouse motion (number of pixels)

For example, the following command will control the mouse directly over the desktop screen by moving the left hand in an area within 18 inches of your shoulder.  The minimum threshold value of 0.01 specifies that movements of less than 1% of the total screen dimension are ignored, which reduces jitter.

left_hand_move 18 mouse_move_absolute 0.01

In the following example, the right hand continuously moves the mouse faster as the distance from the shoulder increases.  The speed of the mouse motion at 18 inches away is 50 pixels (acceleration up to that point is linear).  The minimum threshold value of 2 specifies a “dead zone” of 2 pixels, allowing the mouse to remain steady if the hand is held still at the center.

right_hand_move 18 mouse_move_relative 50 2
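
As a rough mental model of these two modes (an interpretation of the behavior described above, not FAAST’s actual implementation), the mappings can be sketched as follows, where offset is the hand’s displacement from the shoulder along one axis in inches and range is the movement_range from the binding:

// Absolute: map the hand's position within +/- range directly onto one screen axis.
// (The 0.01 threshold in the binding would additionally ignore movements smaller than
// 1% of the screen to reduce jitter; that filtering is omitted here for brevity.)
int absoluteCursor(double offset, double range, int screenSize)
{
    double normalized = offset / (2.0 * range) + 0.5;      // 0.0 .. 1.0 across the range
    if (normalized < 0.0) normalized = 0.0;
    if (normalized > 1.0) normalized = 1.0;
    return static_cast<int>(normalized * screenSize);      // cursor position in pixels
}

// Relative: move the cursor continuously, faster as the hand moves farther from the shoulder.
int relativeStep(double offset, double range, int speedAtRange, int deadZonePixels)
{
    int step = static_cast<int>(speedAtRange * (offset / range));   // linear ramp up to speedAtRange
    if (step > -deadZonePixels && step < deadZonePixels)
        return 0;                                                   // dead zone: hold the cursor steady
    return step;                                                    // pixels to move this update
}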

Future Work

  • Substantially expanded pose and gesture sets, including more “high-level” gestures
  • Training custom user-defined gestures
  • Real-time head tracking using the sensor’s RGB camera
  • Input emulator support for virtual mouse and joystick events
  • Adding GUI for configuring input emulator bindings
  • Adding GUI for configuring VRPN streaming for specific users and skeletal joints
  • Streaming hand positions without skeleton calibration
  • 3D reconstruction / simultaneous localization and mapping (SLAM)
  • Control of the Kinect motor

For more information about the guide, visit the Institute for Creative Technologies website. You can download FAAST here.