December 16, 2013
New CoBots are "help on wheels"
These collaborative robots learn as they go and don't need a chaperone
Meet CoBot--short for "Collaborative Robot." You might call it "help on wheels." With support from the National Science Foundation (NSF), computer scientist Manuela Veloso and her team at Carnegie Mellon University (CMU) are developing CoBots, autonomous indoor service robots that interact with people and provide help "on the go."
Getting on a CoBot's dance card is simple: log on to a website, select a task, book a time slot--and CoBot is on the job. If one CoBot is too busy, another will carry out the request. CoBots can transport objects, deliver messages, escort people and travel to requested places, continuously executing these tasks over multiple weeks in a multi-floor building. Robust localization and navigation have allowed the mobile robots to travel unaccompanied for hundreds of kilometers inside the building.
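The booking-and-dispatch workflow described above can be pictured as a simple scheduling loop: each request names a task and a time slot, and the first CoBot free at that slot takes the job. The Python sketch below is a minimal illustration of that idea; the `CoBot` class and `dispatch` function are hypothetical names for illustration, not part of the team's actual system.

```python
from dataclasses import dataclass, field

@dataclass
class CoBot:
    name: str
    booked_slots: set = field(default_factory=set)  # time slots already taken

    def is_free(self, slot):
        return slot not in self.booked_slots

    def assign(self, task, slot):
        self.booked_slots.add(slot)
        print(f"{self.name} will '{task}' at {slot}")

def dispatch(fleet, task, slot):
    """Give the task to the first CoBot free at the requested slot.

    Mirrors the behavior described above: if one CoBot is too busy,
    another carries out the request.
    """
    for robot in fleet:
        if robot.is_free(slot):
            robot.assign(task, slot)
            return robot
    raise RuntimeError("No CoBot available at that time slot")

fleet = [CoBot("CoBot-1"), CoBot("CoBot-2")]
dispatch(fleet, "deliver message to office 7412", "Mon 10:00")
dispatch(fleet, "escort visitor to the lab", "Mon 10:00")  # falls to CoBot-2
```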
CoBots plan their paths and navigate smoothly and autonomously. They monitor the walls, compute planar surfaces, and plot window and door locations--all while avoiding dynamic obstacles and even noting details such as where carpet meets hardwood floor. Aware of their limitations, CoBots also proactively ask for help, from the web or from nearby humans, with information they lack and with tasks they cannot do themselves, such as pressing elevator buttons and picking up the objects they carry. And when a person requests a task involving a named location, e.g., "the CORAL lab," CoBot learns to associate that phrase with the corresponding room number, e.g., 7412.
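Two behaviors from the paragraph above lend themselves to a compact illustration: symbiotic autonomy, in which the robot asks a human for help with actions it cannot perform, and learning which room number a spoken location name refers to. The sketch below is speculative; `ROBOT_CAPABILITIES`, `resolve_location` and the other names are assumptions made for this example, not CoBot's real interfaces.

```python
# Capabilities the robot actually has; everything else needs human help.
ROBOT_CAPABILITIES = {"navigate", "speak", "carry"}

# Learned mapping from natural-language location names to room numbers,
# built up through dialog (e.g. "the CORAL lab" -> "7412").
location_names = {}

def ask_human(question):
    """Stand-in for CoBot's dialog with a person; the answer is hard-coded."""
    print(f"CoBot asks: {question}")
    return "7412"

def resolve_location(phrase):
    """Return the room number for a phrase, asking for help if unknown."""
    if phrase not in location_names:
        location_names[phrase] = ask_human(f"Which room is '{phrase}'?")
    return location_names[phrase]  # remembered for every later request

def execute(action, target):
    if action in ROBOT_CAPABILITIES:
        print(f"Executing: {action} -> {target}")
    else:
        # Symbiotic autonomy: the robot knows its limits and enlists a human.
        print(f"Could you please '{action}' ({target}) for me?")

room = resolve_location("the CORAL lab")     # association learned on first use
execute("navigate", room)
execute("press elevator button", "floor 7")  # delegated to a passerby
```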
Besides Veloso, the CoBots team members are: Rodrigo Ventura, professor, Instituto Superior Técnico in Lisbon; Joydeep Biswas, doctoral student, Robotics Institute; Brian Coltin, doctoral student, Robotics Institute; Tom Kollar, postdoctoral fellow, Computer Science Department; Vittorio Perera, doctoral student, Computer Science Department; Mehdi Samadi, doctoral student, Computer Science Department; and Stephanie Rosenthal, doctoral student, Computer Science Department--all at CMU--and visiting doctoral students in CMU's Computer Science Department Robin Soetens, from Eindhoven University of Technology, the Netherlands; and Yichao Sun, from Zhejiang University, China.
The team received advice from Illah Nourbakhsh, professor, CMU's Robotics Institute; Reid Simmons, professor, CMU's Robotics Institute; Alex Rudnicky, professor, CMU's Computer Science Department; Aaron Steinfeld, professor, CMU's Robotics Institute; and Daniele Nardi, visiting professor from Sapienza University of Rome, Italy.
The research in this episode was supported by NSF award #1218932, Robust Intelligence (RI): Small: Natural Language-Based Human Instruction for Task Embedded Robots, and award #1012733, Human Centered Computing (HCC): Large: SSCI-MISR: Symbiotic, Spatial, Coordinated Human-Robot Interaction for Multiple Indoor Service Robots.
Any opinions, findings, conclusions or recommendations presented in this material are only those of the presenter grantee/researcher, author, or agency employee; and do not necessarily reflect the views of the National Science Foundation.