Volume 1, Issue 5
Navigating a 3D World
Robots can help people navigate a 3D environment. Think of situations where a person might have difficulty navigating their environment: the environment may be dangerous (such as a disaster zone) or remote (such as Mars), or the person may have a physical limitation (such as a sports injury or a visual impairment). With recent advancements, robots can be designed to provide assistance in those situations. Not only can robots get into and out of difficult spaces, but they can become, in essence, personal fetch-and-carry robots.
Check out this feature on Professor Howard to learn what ignited her interest in robots.
How it's done! Computer engineers and scientists at the Georgia Institute of Technology have created a system in which specialized cameras capture images of the environment. The system then constructs a 3D model of the environment that can be presented and controlled in multiple ways: visually, audibly, or haptically. Haptic techniques provide touch-based sensing and control of the environment; visual images of the world are converted into haptic representations so that a user can feel an environment without being there.
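The visual-to-haptic conversion can be pictured with a small sketch. Here, each pixel of a depth image is mapped to a stiffness value, so that nearer surfaces render as "harder" regions under a haptic device. The linear mapping and all constants below are illustrative assumptions, not the Georgia Tech system's actual algorithm.

```python
# Illustrative sketch only: map a depth image (meters) to per-pixel
# stiffness values, so nearer surfaces feel harder to the user.
# The linear law and constants are assumptions for illustration.

K_MIN = 0.0      # stiffness at or beyond maximum sensor range (assumed)
K_MAX = 500.0    # stiffness for a surface right at the camera (assumed)
MAX_RANGE = 5.0  # sensor range in meters (assumed)

def depth_to_stiffness(depth_m):
    """Nearer obstacle -> higher stiffness, linearly interpolated."""
    d = min(max(depth_m, 0.0), MAX_RANGE)          # clamp to sensor range
    return K_MAX - (K_MAX - K_MIN) * (d / MAX_RANGE)

def haptic_map(depth_image):
    """Convert a 2D depth image (a list of rows) into a stiffness map."""
    return [[depth_to_stiffness(d) for d in row] for row in depth_image]
```

With these assumed constants, a surface 2.5 m away maps to a stiffness halfway between the two extremes, and anything at or beyond 5 m feels like free space.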
The mobile manipulation robotic system in a real-world environment.
The Haptic Exploration with Mobile Manipulator (HEMM) system in a simulated environment.
The system includes a robot that navigates the real-world environment, gathers spatial characteristics of that environment with its sensors, and conveys the information to the user. The human operator perceives the environment through feedback forces transferred through a stylus while controlling the robot; for example, the stylus stiffens as the robot approaches a barrier. The operator can steer the robot to avoid collisions with obstacles, or simply explore and feel the environment. The combination of the 3D map and the haptic representations enables people to manipulate the robot in a smoother, more fluid fashion. Further, people with visual impairments can use the haptic feedback to build mental maps that apply in the real world, increasing their familiarity with environments before venturing out on their own.
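The stiffening stylus described above can be modeled as a virtual spring: once the robot comes within some threshold distance of an obstacle, a resistive force proportional to the remaining gap is rendered to the operator's hand. This is a minimal sketch under assumed constants, not the HEMM system's actual control law.

```python
# Minimal sketch of proximity-based force feedback: inside a threshold
# distance the stylus pushes back like a spring whose force grows as the
# gap closes. The linear law and constants are assumptions for illustration.

THRESHOLD_M = 1.0  # distance at which feedback begins, meters (assumed)
SPRING_K = 40.0    # virtual spring constant, N per meter of encroachment (assumed)

def feedback_force(obstacle_distance_m):
    """Return the resistive force (N) rendered to the stylus."""
    if obstacle_distance_m >= THRESHOLD_M:
        return 0.0                 # free space: no resistance felt
    gap = THRESHOLD_M - obstacle_distance_m
    return SPRING_K * gap          # stiffens as the robot closes in
```

Under these assumptions, the operator feels nothing until the robot is within 1 m of an obstacle, then an increasing push-back that peaks at contact.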
Professor Ayanna Howard with her invention the SnoMote.
Who thinks of this stuff? Professor Ayanna Howard founded the Human-Automation Systems (HumAnS) Laboratory at the Georgia Institute of Technology. She grew up in California, attended Brown University, and received her doctorate from the University of Southern California. Dr. Howard’s research spans the field of robotics, from using robots to help people with disabilities to bringing robots to glaciers to help detect climate change. Much of her research and outreach is funded by the National Science Foundation. In her spare time, Dr. Howard enjoys watching science-fiction movies, teaching aerobic exercise classes, and traveling with her family.
In celebration of Black History Month, please take time to learn about other inspirational African American computer scientists! The Coalition to Diversify Computing (CDC) created a brochure about African American Female Computer Scientists available at: http://www.cdc-computing.org/programs/current-programs/womenofcolor/
To learn more about Professor Howard, visit:
To learn more about virtual maps for the blind, go to:
To learn more about physical activity and games for visually impaired persons, visit: