NSF is taking the lead with three other federal government agencies to support the administration's National Robotics Initiative. The initiative supports the development and use of robots that work beside or cooperatively with people and that enhance individual human capabilities, performance and safety. Find out more in this news release.
Clifford I. Nass of Stanford University and Robin Murphy of Texas A&M University are exploring ways to make rescue robots more user-friendly by incorporating lessons learned from studies of how humans interact with technology. Rescue robots serve as a trapped disaster victim's lifeline to the outside world. But they are worthless if the victim finds them scary, bossy, out of control or just plain creepy. Read more in this Discovery.
Credit: Texas A&M University
A Drexel University-led research team has unveiled the newest, most central member of its collaboration with a team of Korean researchers: Jaemi, a humanoid (HUBO). Jaemi HUBO embodies efforts to advance humanoid development and enhance the concept of human-robotic interaction. Find out more in this news release.
Credit: Lisa-Joy Zgorski, National Science Foundation
The Division of Information and Intelligent Systems (IIS) of the Computer and Information Science and Engineering Directorate studies the inter-related roles of people, computers and information. IIS supports research and education activities that develop new knowledge about the role of people in the design and use of information technology; increase our capability to create, manage and understand data and information in circumstances ranging from personal computers to globally distributed systems; and advance our understanding of how computational systems can exhibit the hallmarks of intelligence.
Preliminary studies by professor Maja Mataric and doctoral student David Feil-Seifer of the University of Southern California Interaction Laboratory confirm that children with Autism Spectrum Disorders interact more easily with mechanical devices than with humans.
Robots are emerging from industrial settings to help humans perform surgery, catch criminals and even fend off the effects of aging. With new capacities for mobility, sensing and intelligence, robots are augmenting human capabilities in completely new ways. Find out more in the NSF special report, Engineers of the New Millennium.
March 5, 2012
Developing Robots That Can Teach Humans
Researchers are programming robot teachers to gaze and gesture like humans
When it comes to communication, sometimes our body language says the most, and especially our eyes.
"It turns out that gaze tells us all sorts of things about attention, about mental states, about roles in conversations," says Bilge Mutlu, a computer scientist at the University of Wisconsin-Madison.
Mutlu knows a thing or two about the psychology of body language. He bills himself as a human-computer interaction specialist. Support from the National Science Foundation (NSF) is helping Mutlu and his fellow computer scientist, Michael Gleicher, take gaze behavior in humans and create algorithms to reproduce it in robots and animated characters.
"These are behaviors that can be modeled and then designed into robots so that they (the behaviors) can be used on demand by a robot whenever it needs to refer to something and make sure that people understand what it's referring to," explains Mutlu.
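The idea of an on-demand referential gaze behavior can be illustrated with a toy sketch. This is not the researchers' actual system; the class and method names (`GazeController`, `referential_gaze`) and the 2-D workspace geometry are illustrative assumptions, showing only the basic step of glancing toward an object when the robot's speech refers to it.

```python
import math

class GazeController:
    """Hypothetical sketch of a referential-gaze routine, assuming the
    robot's head and the objects share a known 2-D workspace frame."""

    def __init__(self, head_x, head_y):
        self.head_x = head_x
        self.head_y = head_y

    def look_at(self, obj_x, obj_y):
        """Return the head yaw (radians) that points the gaze at an object."""
        return math.atan2(obj_y - self.head_y, obj_x - self.head_x)

    def referential_gaze(self, utterance, objects):
        """If the utterance mentions a known object, glance toward it;
        otherwise keep gaze on the listener (yaw 0)."""
        for name, (x, y) in objects.items():
            if name in utterance:
                return self.look_at(x, y)
        return 0.0

# Usage: the robot says it wants the red block sorted, so its gaze
# shifts toward that object's position as it speaks.
controller = GazeController(0.0, 0.0)
objects = {"red block": (1.0, 1.0), "blue cup": (-1.0, 1.0)}
yaw = controller.referential_gaze("put the red block in the box", objects)
```

In a real system the trigger would come from speech synthesis timing rather than a substring match, and the head motion would be smoothed, but the core mapping from "object being referred to" to "gaze target" is the behavior the quote describes.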
Both Mutlu and Gleicher are betting that there will be significant benefits to making robots and animated characters "look" more like humans. "We can build animated agents and robots that can communicate more effectively by using the very subtle cues that people use," says Gleicher.
Mutlu sets up experiments to study the effect of a robot's gaze on humans. "We are interested in seeing how referential gaze cues might facilitate collaborative work such that if a robot is giving instructions to people about a task that needs to be completed, how does that gaze facilitate that instruction task and people's understanding of the instruction and the execution of that task," says Mutlu.
To demonstrate, a three-foot-tall, yellow robot in the computer sciences lab greets subjects, saying: "Hi, I'm Wakamaru, nice to meet you. I have a task for you to categorize these objects on the table into boxes."
In one case, the robot very naturally glances toward the objects it "wants" sorted as it speaks. In another case, the robot just stares at the person. Mutlu says the results are pretty clear. "When the robot uses humanlike gaze cues, people are much faster in locating the objects that they have to move."
Another experiment run by Mutlu and Gleicher's team explores how an animated character's eyes affect human learning. A character projected on a screen says to the viewer, "Today, I'll be telling you a story that comes straight from ancient China." Behind the animated character is a map of China that he'll be referring to in the lecture that runs several minutes.
"The goal of the experiment is to see if we could achieve a high-level outcome, like learning, by controlling an animated character's gaze," says Gleicher. "What we found was when the lecturer looked at the map at appropriate times to indicate to the participant that now I'm talking about something on the map, the participant ended up learning more about spatial locations."
The team hopes their work will transform how humanoid robots and animated characters interface with people, especially in classrooms. "We can design technology that really benefits people in learning, in health and in well-being, and in collaborative work," notes Mutlu.
Any opinions, findings, conclusions or recommendations presented in this material are only those of the presenter grantee/researcher, author, or agency employee; and do not necessarily reflect the views of the National Science Foundation.