Combine the latest developments in Kinect 3D scene reconstruction with automated navigation and you get a preview of what reconnaissance and safety work could look like in the future. Sean Anderson, Kirk Mactavish, Daryl Tiong and Aditya Sharma from the University of Waterloo teamed up to create the most up-to-date 3D scene reconstruction the Kinect community has seen. The iC2020 is a mobile robot built for Kinect-driven mapping. The project has already won numerous awards and recognitions, and Kinecthacks.com is proud to feature it for the rest of the community.

The project opens the door to a range of developments that the iC2020 now makes possible. With the Kinect, the team had a viable instrument both for mapping the environment and for guiding the robot. Users can now deploy the robot, or the technology behind it, to remotely gather information about an environment.

Here is the description and the list of achievements of the iC2020:

“Our goal is to use PrimeSense technology in order to create a globally consistent dense 3D colour map.

Current accomplishments:
– Optical Flow using Shi-Tomasi corners
– Visual Odometry using Shi-Tomasi and GPU SURF
  – Features undergo RANSAC to find inliers (shown in green)
  – Least squares is used across all inliers to solve for rotation and translation
– Loop closure detection using a dynamic feature library
– Global Network Optimization for loop closure

Underlying Technology:
– Basic ROS Utilities
– OpenCV
– GPUSURF
– TORO”
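The first accomplishment on the list, sparse optical flow from Shi-Tomasi corners, maps directly onto OpenCV (which the team lists as underlying technology). The team's own code is not shown, so the sketch below is only an illustration of the technique on synthetic frames: corners are detected with `cv2.goodFeaturesToTrack` (the Shi-Tomasi detector) and tracked into the next frame with pyramidal Lucas-Kanade flow.

```python
import numpy as np
import cv2

# Synthetic frames: a random-texture patch shifted by (5, 3) pixels.
rng = np.random.default_rng(0)
prev = np.zeros((240, 320), dtype=np.uint8)
prev[60:180, 80:240] = rng.integers(0, 255, (120, 160), dtype=np.uint8)
curr = np.roll(np.roll(prev, 3, axis=0), 5, axis=1)

# Shi-Tomasi corner detection on the previous frame.
corners = cv2.goodFeaturesToTrack(prev, maxCorners=100,
                                  qualityLevel=0.01, minDistance=7)

# Pyramidal Lucas-Kanade optical flow tracks each corner into the next frame.
next_pts, status, err = cv2.calcOpticalFlowPyrLK(prev, curr, corners, None)

# Median displacement of successfully tracked corners recovers the shift.
flow = (next_pts - corners)[status.flatten() == 1].reshape(-1, 2)
med = np.median(flow, axis=0)
print(med)  # close to [5. 3.]
```

In a visual-odometry pipeline these per-corner displacements, rather than a single global shift, would feed the motion estimation stage described in the next two bullets.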
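The visual-odometry bullets describe a standard two-stage estimate: RANSAC rejects bad feature matches, then least squares over all inliers solves for rotation and translation. The team's implementation is not published here, so this is a minimal NumPy sketch of that pattern on synthetic 3D correspondences; the function names (`rigid_lsq`, `ransac_rigid`) and the Kabsch/SVD solver are illustrative choices, not the project's code.

```python
import numpy as np

def rigid_lsq(src, dst):
    """Least-squares rotation R and translation t with dst ~= src @ R.T + t
    (Kabsch algorithm via SVD of the cross-covariance)."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    U, _, Vt = np.linalg.svd((src - cs).T @ (dst - cd))
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:      # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, cd - R @ cs

def ransac_rigid(src, dst, iters=200, tol=0.05, seed=0):
    """RANSAC: fit on minimal 3-point samples, keep the largest inlier set,
    then refine with least squares across all inliers."""
    rng = np.random.default_rng(seed)
    best = np.zeros(len(src), dtype=bool)
    for _ in range(iters):
        idx = rng.choice(len(src), 3, replace=False)
        R, t = rigid_lsq(src[idx], dst[idx])
        inliers = np.linalg.norm(src @ R.T + t - dst, axis=1) < tol
        if inliers.sum() > best.sum():
            best = inliers
    R, t = rigid_lsq(src[best], dst[best])
    return R, t, best

# Synthetic correspondences: a known motion plus 20% gross outliers.
rng = np.random.default_rng(1)
src = rng.uniform(-1, 1, (50, 3))
a = 0.3
R_true = np.array([[np.cos(a), -np.sin(a), 0],
                   [np.sin(a),  np.cos(a), 0],
                   [0,          0,         1]])
t_true = np.array([0.5, -0.2, 0.1])
dst = src @ R_true.T + t_true
dst[:10] += rng.uniform(1, 2, (10, 3))   # corrupt the first 10 matches

R, t, inliers = ransac_rigid(src, dst)
print(inliers.sum())  # 40 — exactly the uncorrupted matches
```

Chaining these frame-to-frame (R, t) estimates accumulates drift, which is why the remaining bullets (loop closure and global optimization) matter.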
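"Loop closure detection using a dynamic feature library" means the robot keeps descriptors from past keyframes and checks whether the current view matches one of them, i.e. whether it has returned to a known place. The team's library is built from SURF descriptors; the sketch below is a heavily simplified stand-in with random synthetic descriptors, using a nearest-neighbour ratio test and a match-fraction threshold. All names and thresholds here are hypothetical.

```python
import numpy as np

def loop_closure(query, library, ratio=0.8, min_frac=0.5):
    """Return the index of the library keyframe the query matches, or None.
    A descriptor counts as matched when its nearest neighbour in a keyframe
    beats the second-nearest by Lowe's ratio test; a keyframe whose matched
    fraction exceeds min_frac is declared a loop closure."""
    best, best_frac = None, min_frac
    for i, entry in enumerate(library):
        # Pairwise distances between query and keyframe descriptors.
        d = np.linalg.norm(query[:, None, :] - entry[None, :, :], axis=2)
        d.sort(axis=1)                      # d[:, 0] nearest, d[:, 1] second
        frac = np.mean(d[:, 0] < ratio * d[:, 1])
        if frac > best_frac:
            best, best_frac = i, frac
    return best

# A "dynamic library" of five keyframes, each with 64 32-dim descriptors.
rng = np.random.default_rng(2)
library = [rng.normal(size=(64, 32)) for _ in range(5)]

revisit = library[3] + 0.01 * rng.normal(size=(64, 32))  # noisy revisit
novel = rng.normal(size=(64, 32))                        # unseen place

hit = loop_closure(revisit, library)   # 3
miss = loop_closure(novel, library)    # None
print(hit, miss)
```

A detected closure becomes an extra constraint between two distant poses, which the global network optimization step (TORO in this project) then uses to pull the accumulated drift out of the whole trajectory.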

For more information about the Kinect iC2020, visit the project’s website.

