Self-driving cars couldn’t stand up to these hackers
The thing about cars relying totally on computers is that computers can be hacked, overridden and switched off. That is an obviously dangerous problem: if a self-driving car is hacked, the driver's life is in danger, and sadly a fatal Autopilot crash has already happened in America. Mashable reports on how hackers are testing these systems before they go live…
A driver of a Tesla Model S died in early May when both he and his car’s semi-autonomous Autopilot system failed to see a tractor trailer crossing in front of the car on a highway in Florida.
While the crash seemed like a perfect storm of real-world chance versus automotive high tech, it turns out it doesn't take an unlikely coincidence for Tesla's Autopilot to miss an obstacle.
Intriguingly, the researchers turned white-hat hackers didn't actually have to hack the car. They simply jammed the stationary test car's front-mounted radar, ultrasonic sensors and cameras by exposing them to machines that emitted light, radio and sound.
In the researchers' demonstration video, Autopilot suddenly and without warning loses track of the car ahead when the radio interferer is activated.
What's more, the researchers created the same blinding effect on the ultrasonic sensors by draping a car in acoustic foam, which is far cheaper than devices with five- and six-figure price tags.
Granted, the machines employed by the researchers are expensive, and wrapping a car in acoustic foam would be obvious, so it's unlikely these methods would be used by would-be hackers to impair a single car. However, the tests prove that there are more issues with Tesla's Autopilot system than were initially anticipated.
So there is still plenty of work to be done. Vehicle mechanics isn't what it used to be, and the computing side of the trade is really coming to the fore. To be on board when it takes over, check out our Motor Vehicle courses here today!