Testing Emotional Recognition for Autism Spectrum Disorder with the Zeno R50 Robot


This week I had to read a paper from the 2015 IEEE International Conference on Robotics and Automation (ICRA).  Since I’m interested in research around emotional expression in robots, a quick search revealed that only one of the nearly 1,000 papers was relevant, and so I found out about an experiment that used a robot to determine the level of emotional recognition in children with autism.

The paper is available from http://ieeexplore.ieee.org/document/7140059/ for those who are interested.



Salvador, M. J., Silver, S., & Mahoor, M. H. (2015). An emotion recognition comparative study of autistic and typically-developing children using the Zeno robot. In 2015 IEEE International Conference on Robotics and Automation (ICRA) (pp. 6128–6133). IEEE. https://doi.org/10.1109/ICRA.2015.7140059

In this paper we present the results of our recent study on comparing the emotion expression recognition abilities of children diagnosed with high functioning Autism (ASD) with those of typically developing (TD) children through use of a humanoid robot, Zeno. In our study we investigated the effect of incorporating gestures to the emotion expression prediction accuracy of both child groups. Although the idea that ASD individuals suffer from general emotion recognition deficits is widely assumed [1], we found no significant impairment in the general emotion prediction. However, a specific deficit in correctly identifying Fear was found for children with Autism when compared to the TD children. Furthermore, we found that gestures can significantly impact the prediction accuracy of both ASD and TD children in a negative or positive manner depending on the specific expression. Thus, the use of gestures for conveying emotional expressions by a humanoid robot in a social skill therapy setting is relevant. The methodology and experimental protocol are presented and additional discussion of the Zeno R-50 robot used is given.


This paper looks at how Socially Assistive Robots (SAR) can be used to help children with autism spectrum disorder (ASD).  It sets out to show that children with ASD are almost as capable as typically developing (TD) children at recognising emotions.  The authors believe that a publication bias and a general assumption have led many to believe that those with ASD struggle to recognise different emotions, and this study aims to test whether that deficit really is significant.

The experiment was conducted using the Zeno R-50 robot, released in 2012.  The robot was selected for its childlike size, because mid-functioning children with ASD had responded positively to it in a previous study, and because facial expression is one of its defining features.  In addition to an expressive cartoon-like face, the robot comes equipped with an HD camera in each motorised eye, two microphones and an on-board computer with network connectivity.  It ships with both an animation studio for managing the different emotional expression scenes and an API to allow for the development of control software.

The authors animated facial expressions from a standard list of “facial action units” (AUs) and hired actors to record arm gestures to support these facial expressions for six basic emotions (as well as a neutral state).  These recorded arm gestures and AUs were combined into three animation sets for each of the six emotions, giving each emotion a 30%, 60% and 100% intensity variant.  Control software was then created to let a human operator play back an emotion, listen via the microphones, watch the video feed and perform other tasks with the Zeno (such as making it talk or dance).  No lower-body kinematics were programmed, so the robot remained seated throughout the experiment.

A protocol was devised with the parents and a human controller in one room and the child in another room with the robot.  The child would be introduced to the robot, given three rounds of emotion identification challenges in the form of a game with breaks in between, and the session would end with the opportunity to dance with the robot.  A total of 22 children took part (11 ASD, 11 TD).  The ASD group were all classified as high-functioning autistic, and the TD group had to have no diagnosis of any developmental or social deficit disorder, nor any family members who had been diagnosed.

The results supported the authors’ original hypothesis: the ASD group and the TD group were very similar when it came to emotional recognition.  Combining gestures with facial expressions, versus using facial expressions alone, did lead to some small differences, but the only statistically significant one was the ASD group’s reduced recognition of fear when gesturing was added.  This supports a previous paper that identified a problem recognising fear among ASD groups.
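The group comparisons here rest on significance testing with only 11 children per group.  As a rough illustration of how such a small-sample comparison can be tested without assuming normally distributed scores, here is a minimal permutation test in Python.  The per-child recognition scores below are invented for the sketch; they are not the paper’s data, and the paper may well have used a different test entirely.

```python
import random

# Hypothetical per-child fear-recognition scores (proportion correct) for
# two groups of 11.  These numbers are invented for illustration only.
asd = [0.3, 0.4, 0.2, 0.5, 0.3, 0.4, 0.3, 0.2, 0.5, 0.4, 0.3]
td = [0.6, 0.5, 0.7, 0.6, 0.5, 0.8, 0.6, 0.7, 0.5, 0.6, 0.7]

def permutation_test(a, b, trials=10_000, seed=0):
    """Two-sided permutation test on the difference of group means.

    Repeatedly shuffles the pooled scores into two pseudo-groups and counts
    how often the shuffled mean difference is at least as extreme as the
    observed one.  The returned fraction is the estimated p-value.
    """
    rng = random.Random(seed)
    observed = abs(sum(a) / len(a) - sum(b) / len(b))
    pooled = a + b
    extreme = 0
    for _ in range(trials):
        rng.shuffle(pooled)
        diff = abs(sum(pooled[:len(a)]) / len(a)
                   - sum(pooled[len(a):]) / len(b))
        if diff >= observed:
            extreme += 1
    return extreme / trials

p = permutation_test(asd, td)
print(f"estimated p-value: {p:.4f}")
```

With so few children per group, a resampling test like this makes the small-sample caveat concrete: the p-value is estimated directly from the 22 scores rather than from a distributional assumption.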

I enjoyed this paper and found it an interesting look at how even a simple robot could be used to help those with developmental conditions; however, I did find some issues with how the experiment was executed and with the authors’ interpretation of the results.  The experiment included a sample of only 11 children from each group, far too small to draw a definitive conclusion.  Two children in the ASD group also refused to participate without their parents in the room; that is 18% of the ASD group having a different setup from the TD group, which should be considered when analysing the results.  There were also limitations with the equipment: the robot could not model certain facial expressions, lower-body kinematics had not been modelled, and the actors (whose body gestures were captured) were given leading instructions that potentially tainted some of the emotional expressions.  Finally, I believe some aspects of the analysis, such as treating significant changes in the detection of disgust as positive or negative depending on the context, could have been less biased.

Despite these issues I still find the paper relevant.  It backs up existing research in a new way and identifies many of its own flaws in the discussion and conclusion sections; furthermore, it openly sets out to prove a point from the beginning and does not try to hide its intentions.  The experiment is useful not just from a psychiatric perspective for those on the autistic spectrum but also for anyone wanting to use the Zeno R-50, as this is the first published paper to discuss both its limitations and use-cases.  Stylistically I found the paper highly accessible to anyone with a basic understanding of robotics and psychology.

