Dr Caroline Lyon, Professor Chrystopher Nehaniv and Dr Joe Saunders have carried out experiments as part of the iTalk project with the childlike iCub humanoid robot to show how language learning emerges. Initially the robot can only babble, and it perceives speech as a continuous stream of sounds, not divided up into words.
The iCub robot, named DeeChee, learning basic language with Professor Chrystopher Nehaniv and Dr Joe Saunders.
Dr Caroline Lyon said: “It is known that infants are sensitive to the frequency of sounds in speech, and these experiments show how this sensitivity can be modelled and contribute to the learning of word forms by a robot.”
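The idea behind frequency-sensitive word-form learning can be sketched in code. The following is a minimal illustration only, not the iTalk project's actual algorithm: it treats speech as an unsegmented stream of syllables and promotes frequently recurring syllable sequences to candidate word forms. The function name, the toy utterances and all thresholds are assumptions made for this example.

```python
from collections import Counter

def candidate_word_forms(utterances, max_len=1, min_count=3):
    """Hypothetical sketch: count syllable n-grams across utterances;
    sequences that recur often enough become candidate word forms."""
    counts = Counter()
    for utterance in utterances:
        # For simplicity the input is pre-syllabified, one syllable per token.
        syllables = utterance.split()
        for n in range(1, max_len + 1):
            for i in range(len(syllables) - n + 1):
                counts["".join(syllables[i:i + n])] += 1
    # Keep only sequences that occur at least min_count times,
    # most frequent first.
    return [form for form, c in counts.most_common() if c >= min_count]

# Toy "caregiver speech": the syllable 'square' recurs often,
# so it surfaces as a candidate word form.
speech = [
    "look at the square",
    "a red square",
    "the square is red",
    "where is the square",
]
forms = candidate_word_forms(speech)
```

Here `forms` contains only the high-frequency items ("square" and "the"), mirroring the observation that frequent sound sequences are the ones an infant, or the robot, is most likely to pick out of the stream.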
The iTalk project teaches the robot to speak using methods similar to those used to teach children, and human-robot interaction is a key part of the learning process. Although the iCub robot is learning to produce word forms, it does not know their meanings; learning meanings is another part of the iTalk project’s research. These scientific and technological advances could have a significant impact on the next generation of interactive robotic systems.
The iCub is a humanoid robot developed at IIT as part of the EU project RobotCub and subsequently adopted by more than 20 laboratories worldwide. It has 53 motors that move the head, arms and hands, waist, and legs. It can see and hear, and it has a sense of proprioception (body configuration) and movement (using accelerometers and gyroscopes). Scientists are working to extend these capabilities to give the iCub a sense of touch and the ability to grade how much force it exerts on the environment.
Contacts and sources:
University of Hertfordshire