The remote control has come a long way since Nikola Tesla's first remote-control patent in 1898 and the first wireless TV remote, the "Flashmatic," which used photocells, as shown in the picture below.
Fast forward 100 years, and remote controls are breaking new ground in Human-Robot Interaction (HRI) by leveraging Natural User Interfaces (NUIs), which allow users to perform relatively natural motions, movements, or gestures that in turn control computer applications, manipulate on-screen content, or, in this case, drive a robot.
EDDIE, the reference platform used for this demo, comes with an 8-core Propeller microcontroller that directly controls two 12 V motors. These motors can be controlled remotely, or EDDIE can roam autonomously by leveraging several sensors around the robot and seeing in 3D with a Microsoft Kinect. Gershon Parent, a developer with the Microsoft Robotics group, has added a new twist on how EDDIE can be wirelessly controlled, which he's dubbed the "Motion Tracking Robot Controller." Using skeletal tracking through a Kinect sensor, Gershon controls the two 12 V motors with arm gestures, navigating EDDIE through his environment.
When standing in front of the Kinect sensor, Gershon's right hand controls the right motor and his left hand controls the left motor; it's kind of like driving a tank. When he raises both arms simultaneously, the robot moves forward in a straight line, and the higher he raises his hands, the faster it goes. To put the robot in reverse, he simply lowers both arms at the same time. To turn, he tilts his hands, raising one above the other, to give the robot the desired degree of turn.
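The tank-style mapping described above can be sketched roughly as follows: each hand's height relative to its shoulder sets the corresponding motor's power. This is a minimal illustration only; the joint coordinates, the 0.5 m "full throttle" reach, and the function names are assumptions, not the actual RDS code.

```python
# Hypothetical sketch of the gesture-to-motor mapping described above.
# Joint heights are assumed to be vertical coordinates (in meters) from
# Kinect skeletal tracking; the mapping details are illustrative guesses.

def hand_to_power(hand_y: float, shoulder_y: float, reach: float = 0.5) -> float:
    """Map a hand's vertical offset from its shoulder to motor power in [-1, 1].

    A hand `reach` meters above the shoulder means full power forward;
    the same distance below means full power in reverse.
    """
    power = (hand_y - shoulder_y) / reach
    return max(-1.0, min(1.0, power))  # clamp to the valid power range

def drive_command(left_hand_y: float, left_shoulder_y: float,
                  right_hand_y: float, right_shoulder_y: float):
    """Return (left_motor, right_motor) powers, tank-drive style:
    the left hand drives the left motor, the right hand the right motor."""
    return (hand_to_power(left_hand_y, left_shoulder_y),
            hand_to_power(right_hand_y, right_shoulder_y))
```

With both hands raised well above the shoulders, both motors run forward at full power and the robot drives straight; raising one hand while lowering the other yields opposite motor powers, turning the robot in place.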
Gershon's demo is a great example of how NUI can be used to control a robot, but it's just the tip of the iceberg. There are many different types of controls that can be built through a Kinect interface: a steering wheel or joystick-style control, for example. Also, as shown in the video, the robot carries its own Kinect sensor, so it could sense its environment, detect obstacles, and relay that information back to the user. EDDIE's control board provides additional I/O, allowing a wide variety of sensors and accessories, such as the cameras seen before in Roborazzi.
Do you have an idea like the Motion Tracking Robot Controller? We would love to hear from you; you can submit an entry in our Robotics @Home Contest. And if you're already using RDS4, we hope you'll join our developer community for any technical assistance you might need. Finally, we're always looking to hear from our community, so reach out to us anytime on Facebook and Twitter if you want to learn more about what's going on at Microsoft Robotics or to geek out on robots with us.