Are Self-Driving Cars Safe?
Self-driving cars are relatively new to the market, but some estimates show there could be as many as 10 million self-driving vehicles on the road by 2020.
Though these vehicles are marketed as being safer than the vehicles we’ve driven for years, there have already been two deaths linked to Tesla’s self-driving technology.
While these cars may reduce some specific risks associated with driving, removing the human component means there is nobody operating the vehicle who has the ability to reason.
Many companies, including Tesla, Ford, Google, and Uber, are working to develop self-driving technology. There are already more than 100 self-driving cars in the test market in California.
Companies are testing fully autonomous cars that make it possible for a vehicle to operate without any human intervention at all. These vehicles will eventually lack pedals and steering wheels and will offer no “override” feature to allow passengers in the vehicles to take over for the self-driving technology.
According to manufacturers, this is to avoid “human error.”
Unfortunately for self-driving vehicle enthusiasts, it is this human component that actually makes driving safer.
Human drivers have the ability to reason when they are driving. Self-driving cars are able to identify objects around the vehicle but don’t always have the ability to react in an appropriate manner based on the circumstances.
There are also ethical concerns that come into play when a vehicle is self-driving.
For instance, engineers are looking at the question of whether a self-driving car should be programmed to prioritize a child pedestrian’s safety over that of those in the vehicle.
We make choices all the time when driving. Self-driving cars would be making these choices for us.
Tesla Already Facing Safety Challenges
Tesla is arguably at the forefront of self-driving technology. And as a result of its aggressive approach, it has already encountered a number of issues.
In 2016, a self-driving Tesla failed to identify and react to a semi-truck due to glare and the truck’s height. Tesla noted it’s entirely possible for the technology to miss a significant object because of the angle and the material of which it’s made. In addition to missing large objects, there’s also concern self-driving cars could overreact to small objects.
Imagine the damage that could be done if a self-driving car slammed on its brakes because it misread a small pebble in the road.
Two Fatal Accidents Linked to Self-Driving Vehicles
There is also concern that existing Tesla technology has led to accidents. While the company works to improve the technology, some Tesla owners believe their vehicle’s autopilot feature gave them a false sense of safety while driving. Essentially, drivers let their attention lapse because they feel they can.
This has potentially led to two fatal accidents so far.
Tesla’s Model S autopilot is suspected to have played a role in an accident in China at the beginning of 2016, though there is no official word yet on whether the autopilot feature was engaged at the time of the crash.
Later in the year, a second fatal accident occurred when a Tesla crashed into an 18-wheeler. In that instance, the autopilot feature was active but did not detect the truck because of glare and the height of the vehicle.
Other, non-fatal accidents have also been linked to Tesla’s technology. A Tesla Model X in California crashed into the back of a semi-truck when its driver, relying on the engaged autopilot, failed to notice the truck swerve into his lane.
Self-driving cars might seem like the next wave for a safer driving experience, but not everyone believes this is an accurate portrayal of the technology.