An NSF-funded research team at Princeton University has explored how adversaries could cause AI systems to produce unintended, possibly dangerous outcomes.
Credit: National Science Foundation/Karson Productions
Ahead of the game.
I'm Bob Karson with the Discovery Files, from the National Science Foundation.
Machine learning and artificial intelligence (AI) are helping to shape a world of (Sound effect: car drives by) autonomous vehicles, (Sound effect: heartbeat) superior medical diagnostics -- a whole assortment of emerging technologies. (Sound effect: robotic arm sound) They may be smart machines, but they can still fall victim to smart attacks.
Engineers at Princeton have found all that learning ability leaves systems vulnerable to hackers in unexpected ways. They've done some experimental hacking of their own -- invading, evading, doing a little data poisoning.
The team was able to cause a car's AI to perceive a speed limit sign as a stop sign. (Sound effect: tire screech) They did the same with a fake restaurant sign, through tiny modifications people might not even notice.
Seems attacks could cause AI systems to produce unintended and even dangerous outcomes by corrupting the learning process.
The research found that hackers could attack by inserting bogus information into the stream of data a system uses to learn, by manipulating the inputs the system receives once it starts applying its learning to real-world decisions, or by connecting the dots to identify private personal details.
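The first of those attacks -- slipping bogus information into the training stream -- can be sketched in a few lines. This is a toy illustration only, not the Princeton team's code: the nearest-centroid classifier, the sign labels, and all the numbers are invented for demonstration. It shows how label-flipped training examples drag a model's learned averages, so a point the clean model classifies correctly gets misread after poisoning:

```python
# Toy sketch of data poisoning: an attacker corrupts the training
# stream with mislabeled examples, shifting what the model learns.
# Stdlib only; the 1-D "sign feature" and labels are made up.
import random

def train_centroids(data):
    """Learn the mean feature value for each label."""
    sums, counts = {}, {}
    for x, label in data:
        sums[label] = sums.get(label, 0.0) + x
        counts[label] = counts.get(label, 0) + 1
    return {label: sums[label] / counts[label] for label in sums}

def predict(centroids, x):
    """Classify x by its nearest learned centroid."""
    return min(centroids, key=lambda label: abs(x - centroids[label]))

random.seed(0)
# Clean training data: "slow" signs cluster near 2, "stop" signs near 8.
clean = [(random.gauss(2, 0.5), "slow") for _ in range(50)] + \
        [(random.gauss(8, 0.5), "stop") for _ in range(50)]

# Poisoned stream: the attacker injects "slow"-looking examples
# falsely labeled "stop", dragging the "stop" centroid toward 2.
poisoned = clean + [(random.gauss(2, 0.5), "stop") for _ in range(150)]

clean_model = train_centroids(clean)
bad_model = train_centroids(poisoned)

# A slightly unusual "slow" sign at 3.0:
print(predict(clean_model, 3.0))  # -> "slow" (correct)
print(predict(bad_model, 3.0))    # -> "stop" (corrupted learning)
```

The clean model's "stop" centroid sits near 8, so 3.0 is safely "slow"; the poisoned examples pull that centroid to roughly 3.5, close enough to capture the same point.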
The researchers say we're just at the starting point for securing machine learning.
Artificial intelligence -- real vulnerabilities.
"The Discovery Files" covers projects funded by the government's National Science Foundation. Federally sponsored research -- brought to you, by you! Learn more at nsf.gov or on our podcast.