The Personal Robots Group at MIT has developed a socially assistive robot that can interpret the emotional response of the student it is working with and, based on those cues, create a personalized motivational strategy.

Testing the setup in a preschool classroom, the researchers showed that the robot could learn and improve in response to the unique characteristics of each student it worked with. It proved more effective than a non-personalized robot assistant at increasing students' positive attitudes toward both the robot and the activity.

A furry, brightly colored robot, Tega was developed specifically to enable long-term interactions with children. It uses an Android device to handle movement, perception and cognition, and can respond appropriately to children's behaviors.

A child plays an interactive language-learning game with Tega, a socially assistive robot. Image credit: Personal Robots Group/MIT Media Lab.

Unlike previous iterations, Tega is equipped with a second Android phone running custom software developed by Affectiva Inc. that can interpret the emotional content of facial expressions, a method known as "affective computing."

The researchers piloted the system with 38 students, aged three to five, in a Boston-area school last year. Each student worked individually with Tega for 15 minutes per session over the course of eight weeks.

The students in the trial learned Spanish vocabulary from a tablet computer loaded with a custom-made learning game. Tega served not as a teacher but as a peer learner, encouraging students, providing hints when necessary and even sharing in students' annoyance or boredom when appropriate.

The robot began by mirroring the emotional response of students—becoming excited when they were excited, and distracted when the students lost focus—which educational theory suggests is a successful approach. However, it went further and tracked the impact of each of these cues on the student.

Over time, it learned how the cues influenced a student's engagement, happiness and learning successes. As the sessions continued, it ceased to simply mirror the child's mood and began to personalize its responses in ways that would optimize each student's experience and achievement.
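The learning loop described here, trying motivational cues and tracking their effect on each student's engagement, resembles a classic reinforcement-learning problem. A minimal sketch of the idea, assuming a simple epsilon-greedy bandit over a hypothetical set of cues and a simulated engagement signal (the real system's action set and learning algorithm are not detailed in this article):

```python
import random

# Hypothetical motivational cues; illustrative only.
CUES = ["excited", "calm", "encouraging", "mirroring"]

class CuePersonalizer:
    """Epsilon-greedy bandit: learns which cue best boosts engagement."""

    def __init__(self, cues, epsilon=0.1):
        self.epsilon = epsilon
        self.counts = {c: 0 for c in cues}
        self.values = {c: 0.0 for c in cues}  # running mean reward per cue

    def choose(self):
        if random.random() < self.epsilon:
            return random.choice(list(self.counts))   # explore a random cue
        return max(self.values, key=self.values.get)  # exploit the best so far

    def update(self, cue, engagement):
        # Incremental mean of observed engagement (reward in [0, 1]).
        self.counts[cue] += 1
        n = self.counts[cue]
        self.values[cue] += (engagement - self.values[cue]) / n

# Simulated student: one cue is (unknown to the agent) most engaging.
random.seed(0)
true_effect = {"excited": 0.6, "calm": 0.4, "encouraging": 0.8, "mirroring": 0.5}
agent = CuePersonalizer(CUES)
for _ in range(2000):
    cue = agent.choose()
    reward = 1.0 if random.random() < true_effect[cue] else 0.0
    agent.update(cue, reward)

print(max(agent.values, key=agent.values.get))
```

Over many sessions, the agent's estimated values converge toward each cue's true effect, so it increasingly favors the cue that works best for that particular student, echoing how Tega moved from simple mirroring to personalized responses.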

Over the eight weeks, the personalization continued to increase. Compared with a control group that received only the mirroring reaction, students with the personalized response were more engaged by the activity, the researchers concluded.

"We know that learning from peers is an important way that children learn not only skills and knowledge, but also attitudes and approaches to learning such as curiosity and resilience to challenge," says Cynthia Breazeal, director of the Personal Robots Group at the MIT Media Laboratory. "What is so fascinating is that children appear to interact with Tega as a peer-like companion in a way that opens up new opportunities to develop next-generation learning technologies that not only address the cognitive aspects of learning, like learning vocabulary, but the social and affective aspects of learning as well."

The experiment served as a proof of concept for the idea of personalized educational assistive robots and also for the feasibility of using such robots in actual classrooms. The system, which is almost entirely wireless and easy to set up and operate behind a classroom divider, caused very little disruption and was thoroughly embraced by both the student participants and teachers.

The researchers plan to improve upon and test the system in a variety of settings, including with students with learning disabilities, for whom one-on-one interaction and assistance are particularly critical and hard to come by.

"A child who is more curious is able to persevere through frustration, can learn with others and will be a more successful lifelong learner," Breazeal says.

To contact the author of this article, email engineering360editors@globalspec.com