Industry news

Apparently Self-Driving Cars Can Be Hacked By Stickers

This news sounds a lot scarier than it actually is, especially since the article never specifies which cars the researchers used for testing. Still, it goes to show that there are some surprisingly simple ways to fool the sensors of an autonomous driving system.

A number of researchers from the University of Washington, University of Michigan, Stony Brook University, and UC Berkeley have figured out a way to hack self-driving cars by putting stickers in a variety of patterns on street signs.

I want to point out again that there is no mention of which car was used, so don't assume this test was done on a Tesla Model S or anything like that. Chances are it was a researcher-built setup created for the purpose of this experiment, so bear in mind that actual cars on the road with autonomous technology have more sophisticated systems. If anything, this experiment is simply meant to raise a point.

In a research paper titled “Robust Physical-World Attacks on Machine Learning Models,” researchers demonstrated four ways to disrupt an autonomous car’s sensors using nothing more than a color printer and a camera.

A number of methods were used to fool the system, including covering the stop sign completely with a full-sized printed overlay. This caused the classifier to read the stop sign as a speed limit sign 100% of the time.

Another method involved putting up a number of stickers to spell out words like "love" and "hate," which caused the classifier to read the stop sign as a speed limit sign two-thirds of the time, and once as a yield sign. A third method placed just a few small, strategically positioned stickers in an "abstract art" pattern, which ended up having the same effect as the full poster cover-up.

In order for these "attacks" to work, potential hackers must know the algorithm the car's vision system uses to recognize road signs, and again, many self-driving cars today rely on numerous sensors and would not be tricked so easily.
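The underlying idea, small input changes chosen using the classifier's own gradients, is easiest to see in a standard white-box adversarial example. The sketch below uses the Fast Gradient Sign Method rather than the researchers' actual physical-sticker approach, and the classifier, photo, and class index are placeholders.

```python
# A minimal sketch of a white-box adversarial perturbation using the Fast
# Gradient Sign Method (FGSM). This is NOT the paper's physical-sticker
# algorithm; the model, image, and class index are placeholders.
import torch
import torch.nn.functional as F

def fgsm_perturb(model, image, true_label, epsilon=0.03):
    """Nudge `image` so that `model` is pushed away from `true_label`.

    model:      classifier mapping a (1, C, H, W) image tensor to class logits
    true_label: index of the correct class (e.g. the "stop sign" class)
    epsilon:    perturbation size; larger values are more visible to humans
    """
    image = image.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(image), torch.tensor([true_label]))
    loss.backward()
    # Step each pixel in the direction that increases the classifier's loss.
    perturbed = image + epsilon * image.grad.sign()
    return perturbed.clamp(0.0, 1.0).detach()

# Hypothetical usage: computing the gradient requires access to the model
# itself, which is the "must know the algorithm" caveat mentioned above.
# adversarial = fgsm_perturb(sign_classifier, stop_sign_photo, STOP_SIGN_CLASS)
# print(sign_classifier(adversarial).argmax())  # may no longer be the stop sign
```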

Tarek El-Gaaly, senior research scientist at autonomous driving startup Voyage, says that there are a number of solutions for these hacks, including incorporating some sort of context system so a car knows not to travel at highway speeds in a residential area if a fake sign is deployed.

“In addition, many self-driving vehicles today are equipped with multiple sensors, so failsafes can be built in using multiple cameras and lidar sensors,” El-Gaaly adds.
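What such a context check might look like is sketched below. The function, thresholds, and map data source are illustrative assumptions, not a description of Voyage's actual system.

```python
# A rough sketch of a context-based sanity check: a camera-detected speed
# limit is only trusted if it agrees with mapped knowledge of the road.
# Function name, thresholds, and data sources are hypothetical.
def plausible_speed_limit(detected_mph: int, map_limit_mph: int, road_type: str) -> bool:
    """Return True only if the detected sign is consistent with the map."""
    # A residential street should never suddenly read as a highway speed.
    if road_type == "residential" and detected_mph > 40:
        return False
    # Treat a large disagreement with the mapped limit as suspect.
    return abs(detected_mph - map_limit_mph) <= 10

# Hypothetical usage: a spoofed "Speed Limit 65" sign on a mapped 25 mph
# residential street is rejected, and the car keeps the mapped limit.
# if not plausible_speed_limit(65, 25, "residential"):
#     target_speed_mph = 25
```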

Tony Hsieh

Cars, the Buffalo Bills, video games, comics, sandwiches, jelly beans, and the shooting star press; these are the things that Tony loves (in addition to his family, of course). When he's not spending his time writing tech reviews for theslanted.com, Tony puts his lifetime love of muscle cars to use on his 2015 Mustang GT. Tony's top three favorite cars are the 1973 Mustang Mach 1, Ferrari 458, and Aston Martin DBS.
