
How NAO robots can help children understand emotion

Autistic children can experience difficulties in relating to others and understanding emotions. Now, help is at hand, thanks to a mask-wearing, kid-friendly robot.

This is the work of London-based Thoughtworker Lina Alagrami, who explored the topic at XConf EU.

Having worked with NAO robots, the commercially available teaching robots from SoftBank, for three years after initially experimenting with them at Queen Mary University of London, Lina began a project to create an interactive application for an NAO robot that could work with autistic children to help them interpret different emotions.

According to studies by the UK’s NHS, autistic children can struggle to relate to other kids, as they find it difficult to understand the feelings and emotions displayed by others. This makes it hard for them to start or join in conversations. By minimizing the human element and introducing games with NAO robots into a learning programme, children are less likely to concentrate on physical interaction and instead focus on the learning and education that the robot provides.

Lina’s project, The Mask of eNAOtion, centres on a method for detecting how different emotions are expressed. Her aim was to help young children understand what the different emotions are and how to express them, to aid their interaction with other children.

In a ‘Simon Says’ style game, the NAO robot asks the child to copy a specified emotion (the prototype version of the game includes four emotions: happy, sad, angry and shocked), then checks whether the child has copied it successfully. If not, the robot asks the child to try again until the emotion is copied correctly. When the child succeeds, the NAO robot congratulates them and moves on to the next emotion.
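
To make that flow concrete, here is a minimal sketch of the ‘Simon Says’ loop in Python. The robot object and its say, show_emotion and detect_emotion calls are hypothetical placeholders standing in for the robot's text-to-speech, mask control and camera-based emotion detection; the actual APIs and detection thresholds used in the project aren't described in the article.

```python
import random

EMOTIONS = ["happy", "sad", "angry", "shocked"]
MAX_ATTEMPTS = 3  # move on gracefully after a few tries

def play_round(robot, emotion):
    """One 'Simon Says' round: show an emotion, then check the child's copy."""
    robot.say(f"Simon says: make a {emotion} face!")
    robot.show_emotion(emotion)            # light up the mask LEDs (hypothetical call)
    for _ in range(MAX_ATTEMPTS):
        detected = robot.detect_emotion()  # camera + classifier (hypothetical call)
        if detected == emotion:
            robot.say("Well done! That looks just right.")
            return True
        robot.say("Not quite. Have another try!")
    return False

def play_game(robot):
    """Run through all four emotions in a random order."""
    for emotion in random.sample(EMOTIONS, len(EMOTIONS)):
        play_round(robot, emotion)
```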

Lina’s method for enabling emotion detection was to create an LED mask (The Mask of eNAOtion) fitted to the front of the robot. The mask is made up of a matrix of LEDs spread across three PCBs. An integrated circuit controls the 40 LEDs that make up the ‘face’ on the robot’s mask. The integrated circuit can be driven by a microcontroller (such as an Arduino or a Raspberry Pi) or by a USB-to-Serial module, which was the preferred method for this project. The USB-to-Serial module allows the NAO robot to send commands to the integrated circuit, telling it which LEDs to turn on or off to create the happy, sad, angry and shocked faces.
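
As a rough illustration of that control path, the sketch below uses the pyserial library to push one frame of LED states to the driver IC over a USB-to-serial link. The port name, baud rate, framing bytes and the LED index patterns for each face are assumptions made for illustration only; the article doesn't specify the real command format or wiring.

```python
import serial  # pyserial: pip install pyserial

# Hypothetical mapping from emotion to the LED indices (0-39) that form each face.
# The real layout depends on how the 40 LEDs are wired across the three PCBs.
FACE_PATTERNS = {
    "happy":   [0, 4, 10, 14, 21, 27, 32, 33, 34, 35, 36],
    "sad":     [0, 4, 10, 14, 31, 32, 38, 39],
    "angry":   [1, 3, 11, 13, 31, 33, 35, 37],
    "shocked": [0, 4, 10, 14, 22, 26, 27, 31, 33],
}

def pattern_to_bytes(led_indices):
    """Pack 40 on/off LED states into 5 bytes, one bit per LED."""
    bits = 0
    for index in led_indices:
        bits |= 1 << index
    return bits.to_bytes(5, byteorder="little")

def show_emotion(port, emotion):
    """Send one frame to the LED driver IC over the USB-to-serial module."""
    # Start/end framing bytes are assumed; the real protocol isn't documented here.
    frame = b"\x02" + pattern_to_bytes(FACE_PATTERNS[emotion]) + b"\x03"
    with serial.Serial(port, baudrate=9600, timeout=1) as link:
        link.write(frame)

# Example usage: show_emotion("/dev/ttyUSB0", "happy")
```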

Q&A with Lina Alagrami

What was the reaction when you tried out The Mask of eNAOtion game with children for the first time?

As the mask is a prototype, it's not really that friendly looking. I tried it with my younger siblings, aged three and ten (neither is autistic).

At first, the three-year-old was confused as to why the robot was wearing the mask, but after I explained that it's used to display different emotions, since the robot doesn't really have a face, she moved past it and continued to play the game.

When displaying the different emotions, she found it easy to get happy, angry and shocked, but struggled with sad. For her, sad meant crying, not the frown the robot was displaying. I had to explain that she needed to copy what the robot was showing on the mask. The ten-year-old had no problem playing the game, but with his family around him while he displayed these emotions, he'd laugh, so the robot would ask him to try again. Once again, sad was the tricky one.

If you were doing this project again, what would you do differently?

The first thing I would do differently is the mask design. The NAO robot has two cameras (top and bottom); for this game, the bottom one had to be disabled as it's covered by the mouth PCB. Using both cameras would give better emotion detection, so I'd like to explore different mask designs that allow the bottom camera to stay enabled. I'd also look into wireless communication between the robot and the LEDs, perhaps using Wi-Fi or sound recognition (from the robot) to control the mask.

What was the biggest learning?

I specifically chose this project because it combines both hardware and software, so I'd say the whole thing was full of learning, from really digging into the different USB-to-Serial devices and protocols, all the way to emotion detection and finding the right thresholds for different emotions. But if I had to narrow it down to just one learning, it would definitely be: don't trust hardware. One loose wire can ruin everything!
 

Disclaimer: The statements and opinions expressed in this article are those of the author(s) and do not necessarily reflect the positions of Thoughtworks.
