- What Are the Potential Dangers of Self-Driving Cars?
- Vulnerable to Being Hacked
- Various Accidents Due to the Combination of Self-Driving and Human-Driven Cars
- The Looming Threat of Software Glitches
What Are the Potential Dangers of Self-Driving Cars?
Self-driving cars are not yet fully realized, but it's only a matter of time before they become the norm. Yet, while the benefits that autonomous vehicles can offer have been explained ad nauseam, their potential dangers have not.
Of course, as self-driving technology is developed and continually improved, many of these potential dangers will lessen in severity, but as of now, here are three of the biggest dangers of self-driving vehicles:
Vulnerable to Being Hacked
A driverless or semi-driverless car relies heavily on sophisticated onboard computer software to operate. Because internet connectivity and GPS are major components of driverless technology, the vehicle is left vulnerable to hacking. If a hacker gains remote access to the vehicle's software, this could lead to:
- Access to personal data being compromised, including home addresses and credit card info if the car is connected via Bluetooth to a smartphone;
- Intentional sabotage of the vehicle, including manipulating the brakes and speed to cause an accident;
- Locking the owner out or shutting off security functions so that thieves can break in;
- Manipulating multiple autonomous or semi-autonomous vehicles to create a traffic jam or a mass traffic incident;
- Spreading software viruses.
These are just some of the things that hackers and cyber-criminals may attempt to do.
Various Accidents Due to the Combination of Self-Driving and Human-Driven Cars
Driverless vehicles are designed, in theory, to avoid accidents. In fact, one of their touted advantages is that they're supposed to lead to an overall decrease in road fatalities and avoidable collisions. But if and when driverless vehicles enter the roadways in large numbers, there will essentially be two very different types of drivers on the road: humans and AI.
There will be growing pains and the possibility of accidents that otherwise wouldn't occur. One example is the case in which a 2015 Tesla Model S crashed into a tractor-trailer on a highway in Florida while using its Autopilot system.
While the driver had relied on Autopilot for prolonged stretches, Tesla claimed that the car's camera was unable to distinguish the white side of the tractor-trailer from the bright sky, so the car failed to brake when the trailer crossed in front of it. Government regulators, however, concluded that the driver failed to follow proper Autopilot protocol.
Nonetheless, issues like these have to be ironed out. One solution is to require that everyone holding a valid license, as well as anyone applying for one in the future, complete a driver's education course covering self-driving technology.
The Looming Threat of Software Glitches
It's rare for any form of technology to be free of the occasional hiccup. Fully autonomous vehicles are nowhere near ready yet, which is why regulators won't allow them to operate without a driver or without the driver retaining some control. Some of the software issues and glitches that have to be worked out include:
- Being able to operate under all weather conditions, including extreme weather that might hinder the various sensors and cameras on the vehicle;
- Detecting and stopping for pedestrians who jaywalk;
- Dealing with unpredictable behavior of other motorists;
- Not freezing up or shutting down while in full autopilot mode.
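To illustrate the last point, safety-critical systems typically guard against a frozen control loop with a watchdog: if the autopilot stops reporting in on time, the system falls back to a safe state rather than silently coasting. The sketch below is a minimal, hypothetical illustration of that pattern (the class name, timeout value, and heartbeat scheme are assumptions for this example, not any vendor's actual design):

```python
import time

class AutopilotWatchdog:
    """Illustrative fail-safe: if the autopilot control loop stops
    sending heartbeats, flag it as stalled so a fallback (e.g. a
    controlled stop or handover to the driver) can take over."""

    def __init__(self, timeout_s=0.5):
        self.timeout_s = timeout_s
        self.last_heartbeat = time.monotonic()

    def heartbeat(self):
        # Called by the autopilot loop on every successful control cycle.
        self.last_heartbeat = time.monotonic()

    def is_stalled(self):
        # True if the autopilot has missed its deadline (i.e. frozen).
        return time.monotonic() - self.last_heartbeat > self.timeout_s


watchdog = AutopilotWatchdog(timeout_s=0.5)
watchdog.heartbeat()              # autopilot loop reports it is alive
assert not watchdog.is_stalled()  # within the deadline, all is well
```

In a real vehicle this supervision runs on independent hardware precisely so that a software freeze in the main computer cannot also freeze the safety net.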
In time, these issues will most likely be worked out, but for now, caution should be exercised.