Using the EasyVR voice-recognition module, a user would speak a color, "red", "blue", or "green", to indicate which of the three colored objects to look for. The mbed would command the iRobot to spin counterclockwise while searching for the object with the webcam. Color filters from the RoboRealm API are used to detect the requested color. If the object is not found, the robot would wait for another command; if found, it would stop spinning with the object directly in front of it. The robot would then drive forward, and the PC would send a stop command based on distance calculations derived from the CoG (center of gravity) values reported by the RoboRealm API.
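A minimal sketch of how the PC-side search-and-approach decision could work, assuming RoboRealm's COG variables (e.g. COG_X and COG_BOX_SIZE) are polled over its API; the resolution, thresholds, and command names below are illustrative assumptions, not the project's actual values:

```python
# Hypothetical PC-side logic: map RoboRealm's color-blob center of
# gravity (cog_x) and blob size (cog_size) to a drive command that
# would be forwarded to the mbed/iRobot. All constants are assumed.

IMAGE_WIDTH = 320          # assumed webcam frame width in pixels
CENTER_TOLERANCE = 20      # acceptable off-center error in pixels
NEAR_SIZE = 90             # assumed blob size at which object is in reach

def approach_command(cog_x, cog_size):
    """Return the next drive command given the tracked blob's CoG."""
    if cog_size == 0:
        return "SPIN_CCW"       # nothing detected: keep searching
    if abs(cog_x - IMAGE_WIDTH // 2) > CENTER_TOLERANCE:
        # Turn toward the blob until it is roughly centered.
        return "SPIN_CCW" if cog_x < IMAGE_WIDTH // 2 else "SPIN_CW"
    if cog_size < NEAR_SIZE:
        return "FORWARD"        # centered but still too far away
    return "STOP"               # centered and close: ready for pickup
```

The same routine would serve for both the colored object and the yellow bin, with only the RoboRealm color filter changed between phases.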
Once in proximity of the object, the robot would lower the claw arm, close the claw to grip the object, and lift the arm. From there, the robot would search for the yellow bin and proceed toward it using the same color-detection and distance procedures described above. The robot would then extend the arm forward, release the object into the bin, and retract the claw. Finally, the robot would reverse slightly and restart the sequence.
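The pick-and-place cycle above can be sketched as an ordered command sequence; the step names and the callback interface here are hypothetical stand-ins for whatever commands the PC actually sends to the mbed:

```python
# Hypothetical outline of one full fetch-and-deposit cycle. Each step
# name is illustrative; `execute` stands in for a function that sends
# one command (e.g. over the PC-to-mbed link) and blocks until done.

PICKUP_SEQUENCE = ["LOWER_ARM", "CLOSE_CLAW", "RAISE_ARM"]
DROPOFF_SEQUENCE = ["EXTEND_ARM", "OPEN_CLAW", "RETRACT_CLAW", "REVERSE"]

def run_cycle(execute):
    """Run one pickup, bin search, and drop-off cycle in order."""
    for step in PICKUP_SEQUENCE:
        execute(step)
    execute("SEEK_YELLOW_BIN")   # reuse the color/CoG search for the bin
    for step in DROPOFF_SEQUENCE:
        execute(step)
```

After the final REVERSE step, control would return to the voice-command loop so the whole sequence can restart.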
Project Website: http://users.ece.gatech.edu/~hamblen/489X/S13PROJ/RoboManageSite/