New research published in the journal Biological Psychology suggests that infants as young as 6 to 8 months old are already sensitive to the implications of eye contact, including eye contact from humanoid robots.
While most research has concentrated on how babies interact with humans, the increasing use of robots in childcare and education raises questions about how infants perceive these machines, underscoring the growing importance of understanding how infants interact with technology.
The authors report that infants respond to a robot’s gaze much as they respond to human gaze, suggesting that infants grasp the social significance of eye contact and can decode such meaningful behaviour even when it comes from a non-human agent.
These findings point to a need for continued research into the role robots may play in early socialisation, including the long-term effects of infants’ interactions with this technology.
“Humanoid robots are becoming increasingly common in social environments, and people are suddenly expected to engage in social interactions with these artificial agents. We are interested in how the human brain understands the ‘sociality’ of artificial humanoid robots,” said study author Samuli Linnunsalo, a doctoral researcher at Tampere University and member of the Human Information Processing Laboratory.
“We believe that, to fully explore people’s instinctive interpretations of humanoid robots’ sociality, it is necessary to use physiological measures to investigate their responses to robots’ nonverbal social cues, such as eye contact. After finding initial evidence that adult humans’ psychophysiological responses to eye contact with a humanoid robot were similar to their responses to eye contact with a human, we sought to investigate whether young infants react similarly to a humanoid robot’s and a human’s eye gaze. This was particularly interesting to us because infants do not have knowledge of the humanoid robots’ purpose as social interaction partners, nor do they understand that people are expected to treat humanoid robots as social agents.”
According to a press release, the study involved 114 infants aged between 6 and 8 months. The researchers invited the infants to a laboratory, where they were exposed to three types of stimuli: a human, a humanoid robot called Nao, and a non-human object, in this case a vase. The researchers used live stimuli rather than videos or images to make the experience more realistic for the infants.
The human and the robot were each presented to the infant either looking directly at them (direct gaze) or looking away (averted gaze). To keep the infants engaged, the researchers used a carefully controlled environment with an interactive introduction for both the human and the robot: the robot introduced itself, mimicking natural social gestures such as nodding and hand movements, while the vase served as a non-interactive control object.