By implementing parts of this formalism, we showed that a mobile robot can learn to perceive the traversability affordances of its environment through interaction. The results demonstrated that the robot achieved perceptual economy after learning and that the learned affordances generalized well to the real world. In our subsequent research, we used different robot manipulators (including the iCub humanoid robot) to learn the manipulative affordances of objects, such as push-ability, reach-ability, and grasp-ability. We showed that the learned affordances allowed the robot to perform goal emulation and planning in its perceptual space.
Specifically, we utilized the notion of affordances to argue that verbs tend to refer to the generation of a specific type of effect rather than to a specific type of action. We then showed how a robot can form these concepts through interactions with its environment and how humans can use them to ease their communication with robots. We also took a novel approach to nouns, which specify different classes of objects, by arguing that novel objects should be classified not by their visual appearance (as is usually done in object recognition studies in computer vision) but by the affordances they offer to the robot.
We demonstrated on the iCub humanoid robot platform that the concepts the robot develops can be used to understand the actions a human performs, to perform multi-step planning toward a goal state, and to let a human specify a goal to the robot using symbolic descriptions. In our final review demonstration, we showed that the robot can respond successfully to commands such as “grasp ball” by identifying a ball-like (i.e., rollable) object on its table and choosing the proper action (a top-grasp instead of a side-grasp) to achieve its goal.
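The “grasp ball” demonstration can be sketched as affordance-based action selection: the robot predicts the effect of each action on the perceived object category and picks the action whose predicted effect matches the goal. The sketch below is a toy illustration only; the mappings are hard-coded stand-ins for what the robot actually learns from interaction, and the category and effect labels are hypothetical.

```python
# Hypothetical affordance model: (object category, action) -> predicted effect.
# In the actual system these predictions are learned from the robot's own
# interaction experience; here they are hard-coded for illustration.
AFFORDANCES = {
    ("rollable",     "top-grasp"):  "grasped",
    ("rollable",     "side-grasp"): "rolled-away",
    ("non-rollable", "top-grasp"):  "toppled",
    ("non-rollable", "side-grasp"): "grasped",
}

def choose_action(object_category, goal_effect,
                  actions=("top-grasp", "side-grasp")):
    """Pick the action whose predicted effect matches the goal effect."""
    for action in actions:
        if AFFORDANCES.get((object_category, action)) == goal_effect:
            return action
    return None  # no action is predicted to produce the desired effect

# "grasp ball": a ball is perceived as rollable, so a top-grasp is selected,
# since a side-grasp is predicted to make the object roll away.
print(choose_action("rollable", "grasped"))  # -> top-grasp
```

Selecting actions through predicted effects, rather than through object labels alone, is what lets the same verb (“grasp”) resolve to different motor actions for different objects.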
We proposed a coordination behavior that enables a swarm of mobile robots, connected only via proximal sensing, to wander in an environment by moving as a coherent group in open space and avoiding obstacles as if it were a “super-organism”. Unlike prior flocking studies, which relied on a designated or elected leader within the group or on a common goal/homing direction, our coordination behavior operates in a distributed manner using fully on-board sensing, making it the first successful self-organized flocking reported in the literature.
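The core idea of leaderless, proximal-sensing flocking can be illustrated with a minimal 2D sketch: each robot combines a spring-like virtual force toward a desired spacing from sensed neighbors with an alignment term, using only information available within its own sensing range. This is a generic illustration of the principle, not our published controller; all gains, ranges, and the force model are assumptions chosen for the sketch.

```python
import math

def proximal_force(dx, dy, desired=1.0, gain=0.5):
    """Virtual force from one sensed neighbor: repulsive when closer than
    the desired spacing, attractive when farther (spring-like)."""
    dist = math.hypot(dx, dy)
    mag = gain * (dist - desired)  # signed: pushes spacing toward 'desired'
    return mag * dx / dist, mag * dy / dist

def flock_step(poses, headings, sense_range=2.0, align_w=0.3, dt=0.1):
    """One distributed update: each robot uses only neighbors within its
    sensing range -- no leader, no shared goal or homing direction."""
    new_poses, new_headings = [], []
    for i, (x, y) in enumerate(poses):
        fx = fy = avg_sin = avg_cos = 0.0
        n = 0
        for j, (xj, yj) in enumerate(poses):
            if i == j:
                continue
            dx, dy = xj - x, yj - y
            if math.hypot(dx, dy) <= sense_range:
                px, py = proximal_force(dx, dy)
                fx += px; fy += py
                avg_sin += math.sin(headings[j])
                avg_cos += math.cos(headings[j])
                n += 1
        h = headings[i]
        if n:
            align = math.atan2(avg_sin, avg_cos)  # mean neighbor heading
            # blend own heading, alignment, and the net proximal force
            hx = math.cos(h) + align_w * math.cos(align) + fx
            hy = math.sin(h) + align_w * math.sin(align) + fy
            h = math.atan2(hy, hx)
        new_headings.append(h)
        new_poses.append((x + dt * math.cos(h), y + dt * math.sin(h)))
    return new_poses, new_headings
```

Iterating `flock_step` from a loose cluster with scattered headings makes the headings converge while the group stays cohesive, which is the “coherent group in open space” behavior described above, emerging without any robot being special.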
Recently, Turkish Air Industries funded a project to develop a proof-of-concept system for the coordinated flying of a group of quadcopters. Despite our lack of any prior experience, we were able to demonstrate fully autonomous flight of three quadcopters in only six months. My research in this track is moving toward the development of on-board sensing/signaling systems to implement flocking.