Using gestures and voice commands to control the robot

A couple of months ago we decided to participate in Intel's Perceptual Computing Challenge (http://perceptualchallenge.intel.com). The idea was to extend the driver cockpit application to control Veterobot (http://veterobot.org) with gestures and voice recognition. We use the detected thumb position in 3D space: the position along the horizontal axis serves as the steering signal, and the distance to the camera (depth) controls acceleration (forward and backward).
Using thumb to control steering and acceleration
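To illustrate the mapping described above, here is a minimal sketch of how a tracked thumb position could be converted into steering and acceleration values in [-1, 1]. It assumes the thumb coordinates (in metres) have already been obtained from the gesture-tracking layer; the send_drive_command() helper and the calibration constants are hypothetical and not part of Veterobot or the Perceptual Computing SDK.

```python
# Minimal sketch: map a tracked thumb position to drive commands.
# The thumb (x, z) coordinates are assumed to come from the gesture-tracking
# layer; send_drive_command() and the calibration constants are hypothetical.

def thumb_to_drive(x, z,
                   x_center=0.0, x_range=0.15,    # horizontal sweep, metres
                   z_neutral=0.40, z_range=0.10): # comfortable depth band, metres
    """Convert thumb position to (steering, acceleration), each clamped to [-1, 1]."""
    steering = max(-1.0, min(1.0, (x - x_center) / x_range))
    # Closer to the camera than the neutral depth -> forward, farther -> backward.
    acceleration = max(-1.0, min(1.0, (z_neutral - z) / z_range))
    return steering, acceleration


def send_drive_command(steering, acceleration):
    # Hypothetical transport: the real robot would receive these values
    # over its own control channel (e.g. a serial or network link).
    print(f"steer={steering:+.2f} accel={acceleration:+.2f}")


if __name__ == "__main__":
    # Example: thumb slightly to the right and pushed toward the camera.
    steering, acceleration = thumb_to_drive(x=0.05, z=0.33)
    send_drive_command(steering, acceleration)
```

Smoothing (e.g. averaging the thumb position over a few frames) would likely be part of the tuning mentioned below, since raw tracking output tends to jitter.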
It was a fun exercise, but more tuning is still required to achieve a smooth control experience, so it remains a work in progress. More details about this work are presented in the video below.
