Polly was the first mobile robot to move at animal-like speeds (1 meter per second) using computer vision for navigation. It was an example of behavior-based robotics. Horswill's PhD supervisors were Rodney Brooks and Lynn Andrea Stein. For a few years Polly gave tours of the AI laboratory's seventh floor, using canned speech to point out landmarks such as Anita Flynn's office. When someone approached Polly, it would introduce itself and offer a tour, asking them to answer by waving their foot.[1]
The "Polly algorithm" is a way to navigate a cluttered space using very low-resolution vision: the robot finds uncluttered areas to move into, assuming that the pixels at the bottom of the frame (those closest to the robot) show an example of uncluttered floor. Because the computation could run 60 times a second, the algorithm only needed to discriminate three steering commands at each instant: go straight, turn right, or turn left.
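The idea can be sketched in a few lines of code. The version below is a minimal illustration, not Horswill's implementation: the image size, the floor-similarity threshold, and the column-scanning scheme are all assumptions chosen for clarity. It treats the bottom rows of a low-resolution grayscale frame as a sample of "floor", scans each column upward until a pixel stops looking like floor, and steers toward the side with more free space.

```python
import numpy as np

def polly_steering(frame, floor_rows=4, threshold=15):
    """Sketch of a Polly-style free-space heuristic.

    frame: 2-D uint8 grayscale image at very low resolution (e.g. 64x48),
    with the bottom rows assumed to show uncluttered floor.
    (Parameter names and values are illustrative assumptions.)
    """
    h, w = frame.shape
    # Use the bottom rows as a sample of what "floor" looks like.
    floor_mean = frame[h - floor_rows:, :].mean()
    # For each column, scan upward from the bottom until a pixel no
    # longer resembles floor; the height reached is the free space.
    free = np.zeros(w, dtype=int)
    for x in range(w):
        depth = 0
        for y in range(h - 1, -1, -1):
            if abs(int(frame[y, x]) - floor_mean) > threshold:
                break
            depth += 1
        free[x] = depth
    # Steer toward the image half with more free space; go straight
    # when the two halves are roughly balanced.
    left = free[: w // 2].sum()
    right = free[w // 2:].sum()
    if abs(left - right) < 0.1 * (left + right + 1):
        return "straight"
    return "left" if left > right else "right"
```

At 60 Hz, even this crude three-way decision is enough for smooth obstacle avoidance, because each individual decision only needs to be approximately right for a sixtieth of a second.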
Polly was built from minimalist machinery and ran on a hardware platform that could be duplicated for less than $10,000.[1] The machine was intended to show that very simple visual machinery can be used to solve real tasks in unmodified environments.[1]
References
1. Horswill, Ian. "Polly: A vision-based artificial agent." Proceedings of the National Conference on Artificial Intelligence (AAAI), 1993.