
News Release 16-026

Robot learning companion offers custom-tailored tutoring

New social robot from MIT helps students learn through personalized interactions


A child plays an interactive language learning game with Tega, a socially assistive robot.


March 14, 2016


Parents want the best for their children's education and often complain about large class sizes and the lack of individual attention.

Goren Gordon, an artificial intelligence researcher from Tel Aviv University who runs the Curiosity Lab there, is no different.

He and his wife spend as much time as they can with their children, but there are still times when their kids are alone or unsupervised. At those times, they'd like their children to have a companion to learn and play with, Gordon says.

That's the case, even if that companion is a robot.

Working in the Personal Robots Group at MIT, led by Cynthia Breazeal, Gordon was part of a team that developed a socially assistive robot called Tega that is designed to serve as a one-on-one peer learner in or outside of the classroom.

Socially assistive robots for education aren't new, but Tega is unique in its ability to interpret the emotional response of the student it is working with and, based on those cues, create a personalized motivational strategy.

Testing the setup in a preschool classroom, the researchers showed that the system can learn and improve itself in response to the unique characteristics of the students it works with. It proved more effective than a non-personalized robot assistant at increasing students' positive attitudes toward the robot and the activity.

The team reported its results at the 30th Association for the Advancement of Artificial Intelligence (AAAI) Conference in Phoenix, Arizona, in February.

Tega is the latest in a line of smartphone-based, socially assistive robots developed in the MIT Media Lab. The work is supported by a five-year, $10 million Expeditions in Computing award from the National Science Foundation (NSF), which supports long-term, multi-institutional research in areas with the potential for disruptive impact.

The classroom pilot

The researchers piloted the system with 38 students aged three to five in a Boston-area school last year. Each student worked individually with Tega for 15 minutes per session over the course of eight weeks.

A furry, brightly colored robot, Tega was developed specifically to enable long-term interactions with children. It uses an Android device to process movement, perception and thinking and can respond appropriately to children's behaviors.

Unlike previous iterations, Tega is equipped with a second Android phone containing custom software developed by Affectiva Inc. -- an NSF-supported spin-off company co-founded by MIT's Rosalind Picard -- that can interpret the emotional content of facial expressions, an approach known as "affective computing."

The students in the trial learned Spanish vocabulary from a tablet computer loaded with a custom-made learning game. Tega served not as a teacher but as a peer learner, encouraging students, providing hints when necessary and even sharing in students' annoyance or boredom when appropriate.

The system began by mirroring the emotional response of students -- getting excited when they were excited, and distracted when the students lost focus -- which educational theory suggests is a successful approach. However, it went further and tracked the impact of each of these cues on the student.

Over time, it learned how the cues influenced a student's engagement, happiness and learning successes. As the sessions continued, it ceased to simply mirror the child's mood and began to personalize its responses in a way that would optimize each student's experience and achievement.
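In spirit, this kind of personalization resembles a multi-armed bandit problem: the robot tries different expressive cues, observes how each one affects the child, and gradually favors the cues that work best. The sketch below is only an illustrative approximation of that idea, not the team's actual algorithm; the cue names, the engagement-based reward and the epsilon-greedy strategy are assumptions made for the example.

```python
import random

# Illustrative sketch only, not the team's actual algorithm: an epsilon-greedy
# bandit that picks an expressive cue for the robot and updates its estimate of
# each cue's value from the observed change in the child's engagement.
# The cue names, reward definition and simulated sensor values are assumptions.

CUES = ["excited", "encouraging", "curious", "calm", "mirror_child"]


class CuePersonalizer:
    def __init__(self, epsilon=0.2):
        self.epsilon = epsilon
        self.value = {c: 0.0 for c in CUES}   # running estimate of each cue's effect
        self.count = {c: 0 for c in CUES}

    def choose_cue(self):
        # Mostly exploit the cue with the best estimated effect so far,
        # occasionally explore a different one.
        if random.random() < self.epsilon:
            return random.choice(CUES)
        return max(CUES, key=lambda c: self.value[c])

    def update(self, cue, engagement_before, engagement_after):
        # Reward is the immediate change in measured engagement; a fuller model
        # could also fold in valence or learning gains.
        reward = engagement_after - engagement_before
        self.count[cue] += 1
        self.value[cue] += (reward - self.value[cue]) / self.count[cue]  # incremental mean


# Example session loop with simulated engagement readings standing in for the
# affective-computing pipeline's output.
personalizer = CuePersonalizer()
for turn in range(20):
    before = random.uniform(0.0, 1.0)
    cue = personalizer.choose_cue()
    after = min(1.0, max(0.0, before + random.gauss(0.05, 0.1)))
    personalizer.update(cue, before, after)

print(sorted(personalizer.value.items(), key=lambda kv: -kv[1]))
```

In the real system, the reward signal would come from the robot's affective-computing pipeline rather than the simulated values used here.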

"We started with a very high-quality approach, and what is amazing is that we were able to show that we could do even better," Gordon says.

Over the eight weeks, the personalization continued to increase. Compared with a control group that received only the mirroring reaction, students with the personalized response were more engaged by the activity, the researchers found.

In addition to tracking long-term impacts of the personalization, they also studied immediate changes that a response from Tega elicited from the student. From these before-and-after responses, they learned that some reactions, like a yawn or a sad face, had the effect of lowering the engagement or happiness of the student -- something they had suspected but that had never been studied.
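A minimal version of that before-and-after analysis amounts to averaging, for each type of robot reaction, the change in a measured engagement score. The short sketch below illustrates the idea; the log entries, reaction names and score values are invented placeholders, not the study's data.

```python
from collections import defaultdict

# Illustrative sketch of the before-and-after analysis: average, per robot
# reaction, the change in a measured engagement score. The log entries and
# score values below are invented placeholders, not the study's data.

log = [
    # (robot_reaction, engagement_before, engagement_after)
    ("yawn",     0.72, 0.55),
    ("sad_face", 0.68, 0.57),
    ("excited",  0.60, 0.74),
    ("excited",  0.55, 0.66),
    ("yawn",     0.80, 0.70),
]

deltas = defaultdict(list)
for reaction, before, after in log:
    deltas[reaction].append(after - before)

for reaction, changes in sorted(deltas.items()):
    print(f"{reaction}: mean engagement change = {sum(changes) / len(changes):+.2f}")
```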

"We know that learning from peers is an important way that children learn not only skills and knowledge, but also attitudes and approaches to learning such as curiosity and resilience to challenge," says Breazeal, associate professor of Media Arts and director of the Personal Robots Group at the MIT Media Laboratory. "What is so fascinating is that children appear to interact with Tega as a peer-like companion in a way that opens up new opportunities to develop next-generation learning technologies that not only address the cognitive aspects of learning, like learning vocabulary, but the social and affective aspects of learning as well."

The experiment served as a proof of concept for the idea of personalized educational assistive robots and also for the feasibility of using such robots in a real classroom. The system, which is almost entirely wireless and easy to set up and operate behind a divider in an active classroom, caused very little disruption and was thoroughly embraced by the student participants and by teachers.

"It was amazing to see," Gordon reports. "After a while the students started hugging it, touching it, making the expression it was making and playing independently with almost no intervention or encouragement."

Although the experiment ran for a full eight weeks, the personalization process was still improving at the study's end, suggesting more time would be needed to arrive at an optimal interaction style.

The researchers plan to improve upon and test the system in a variety of settings, including with students with learning disabilities, for whom one-on-one interaction and assistance is particularly critical and hard to come by.

"A child who is more curious is able to persevere through frustration, can learn with others and will be a more successful lifelong learner," Breazeal says. "The development of next-generation learning technologies that can support the cognitive, social and emotive aspects of learning in a highly personalized way is thrilling."

-NSF-

Media Contacts
Aaron Dubrow, NSF, 703-292-4489, email: adubrow@nsf.gov

Principal Investigators
Cynthia Breazeal, MIT, email: cynthiab@media.mit.edu

Co-Investigators
Goren Gordon, Tel Aviv University, email: goren@gorengordon.com

