One of the advantages of living in such a connected world is that many of the things we used to leave the house for can now be done from the comfort of our own homes. Shopping is a chore I’m very thankful can now be handled from my computer. But have you ever felt that clicking through pages and items while browsing online was a little tedious? If so, the team behind GShopping may just have the solution: an application that lets you browse through screens using nothing but simple gestures.
Here are some technical details from the website:
- GShopping is a demo application built on the Gesture Recognition (GR) Library. The GR Library implements gesture recognition using a machine learning algorithm (a Hidden Markov Model) and uses the Microsoft Kinect NUI Library to capture data and recognize known gestures.
- Users don’t need to work with the Microsoft Kinect NUI directly or implement their own gesture recognition algorithm. They only need to know how to build their own user interface in WPF, WinForms, or another .NET framework, or simply use the GShopping application template (see the first sketch after this list for how an application might consume the library).
- In the demo version of the GShopping application, five gestures are predefined and installed: “Next”, “Previous”, “Up”, “Down” and “Enter”. A speech recognition API is also implemented, so the app can be controlled by saying “Next”, “Previous”, “Up”, “Down” or “Enter” (see the second sketch below).
- More user-defined gestures can be added using the GR Library, but only the five gestures above are installed in the current release.
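To give a sense of how an application might consume such a library, here is a minimal C# sketch. The GR Library’s actual API isn’t documented in the post, so every type and member below (`GestureRecognizer`, `GestureRecognized`, `GestureName`, `Simulate`) is a hypothetical stand-in; only the five command names come from the description above.

```csharp
using System;

namespace GShoppingSketch
{
    // Hypothetical stand-ins for the GR Library's public surface; the post does not
    // document the real class or event names, only that recognized gestures drive the UI.
    public class GestureRecognizedEventArgs : EventArgs
    {
        public string GestureName { get; set; }
    }

    public class GestureRecognizer
    {
        public event EventHandler<GestureRecognizedEventArgs> GestureRecognized;

        // In the real library this would attach to the Kinect NUI data stream and run
        // the HMM-based classifier; here it is only a placeholder.
        public void Start() { }

        // Test helper so the sketch can be exercised without a Kinect attached.
        public void Simulate(string gestureName) =>
            GestureRecognized?.Invoke(this, new GestureRecognizedEventArgs { GestureName = gestureName });
    }

    public static class Program
    {
        public static void Main()
        {
            var recognizer = new GestureRecognizer();

            // The application only needs to map the five predefined gesture names
            // to navigation actions in its own UI.
            recognizer.GestureRecognized += (sender, e) =>
            {
                switch (e.GestureName)
                {
                    case "Next":     Console.WriteLine("Show next item");      break;
                    case "Previous": Console.WriteLine("Show previous item");  break;
                    case "Up":       Console.WriteLine("Scroll up");           break;
                    case "Down":     Console.WriteLine("Scroll down");         break;
                    case "Enter":    Console.WriteLine("Select current item"); break;
                }
            };

            recognizer.Start();
            recognizer.Simulate("Next"); // prints "Show next item"
        }
    }
}
```

The appeal of this shape is that the UI code never touches skeleton frames or the HMM itself; it just reacts to named gesture events.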
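The voice side can be sketched more concretely because the commands are the same five words. The post doesn’t say which speech API the demo uses, so the following is just one plausible wiring using the stock .NET `System.Speech` API with a grammar restricted to those commands:

```csharp
using System;
using System.Speech.Recognition; // requires the System.Speech assembly on Windows

class VoiceCommands
{
    static void Main()
    {
        using (var recognizer = new SpeechRecognitionEngine())
        {
            // Restrict recognition to the five commands the demo understands,
            // which keeps accuracy much higher than free-form dictation.
            var commands = new Choices("Next", "Previous", "Up", "Down", "Enter");
            recognizer.LoadGrammar(new Grammar(new GrammarBuilder(commands)));

            recognizer.SpeechRecognized += (sender, e) =>
                Console.WriteLine($"Command: {e.Result.Text}"); // route to the same handlers as the gestures

            recognizer.SetInputToDefaultAudioDevice();
            recognizer.RecognizeAsync(RecognizeMode.Multiple); // keep listening for repeated commands

            Console.WriteLine("Listening... press Enter to quit.");
            Console.ReadLine();
        }
    }
}
```

Feeding both the gesture events and the speech results into the same command handlers is what makes the “say it or wave it” control scheme feel seamless.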
While it is pitched as a shopping assistant in the video, one can immediately see other uses for the software: it could just as easily serve as a tool for advertising or information services. It’s a simple yet highly versatile application that takes advantage of many of the Kinect’s features. I don’t think it’s far-fetched to expect this or similar applications to be in wide use soon.
Looks amazing!