Tesla’s Autopilot is an advanced driver assistance system designed to reduce the overall workload of the driver. Tesla says that every car it builds today ships with the hardware for advanced Autopilot features and will gain full self-driving capability in the future through a series of software updates. The company boldly claims that every one of its cars will eventually be able to drive itself in almost all circumstances, at a safety level at least twice that of the average human driver.
Risks involved in Tesla’s Autopilot
However, Tesla’s Autopilot has drawn criticism and has been under federal investigation due to a string of accidents that may have been caused by the system. At present, the Autopilot function can keep a moving car in its lane and match the speed of surrounding vehicles. Tesla has said that Autopilot must be used only under certain conditions, but some safety experts say the company does not do enough to educate drivers about these limitations or to ensure that drivers do not become overly reliant on the system and, as a result, distracted.
Accidents involving Tesla’s Autopilot
Autopilot is likely to blame for a 2018 crash in California in which the driver died. Investigators found that the system failed to keep the car in its lane and failed to detect a highway barrier while the car was traveling at 71 miles per hour in a 65 mph zone; the driver was playing a game on his phone at the time. The first known fatal crash with Autopilot in use occurred in May 2016 in Florida, when a Tesla failed to stop for a truck that was turning in front of it on the highway. The vehicle hit the trailer, continued traveling underneath it, and veered off the road.
What’s the need of the hour?
It is clear that Tesla needs to develop a system that senses the driver’s level of engagement and issues warnings when the driver’s hands are off the wheel. The computer security company McAfee released findings that a Tesla using the intelligent cruise control feature could be tricked into speeding by a small strip of electrical tape placed on a speed limit sign. Research has also shown that placing stickers on road markings could coax cars into dangerously switching lanes while Autopilot is engaged.
Drivers misunderstand what Tesla’s Autopilot can do, and the Tesla website currently paints a confusing picture of its cars’ capabilities. In 2018, Elon Musk was widely criticized for taking his hands off the steering wheel while demonstrating the self-driving capabilities of a Tesla Model 3, something the vehicle owner’s manual instructs drivers using Autopilot never to do. By calling its system ‘Autopilot’ and using terms like ‘full self-driving,’ Tesla is misleading consumers about what the technology can actually do. By contrast, Super Cruise, a driver-assistance system offered by General Motors, works only on certain highways and tracks drivers’ heads to make sure they are paying attention to the road.
What do you do if you’re involved in an accident?
A driver who suffers injuries in a crash involving a self-driving Tesla may be owed compensation for medical bills, pain and suffering, and, in fatal cases, funeral expenses. A car accident attorney can help. Car injury law firms have the resources to take on a large company like Tesla and can make difficult legal matters much easier for their clients. The number of self-driving Teslas on the road in New York is growing every day, as it is one of the biggest markets in the US.
The best car injury attorneys at The Law Office of Siler and Ingber are just a phone call away and are experienced in handling cases involving self-driving cars in New York City.
If you or someone you know has been injured in an accident involving Tesla’s Autopilot and needs a winning law firm, contact Siler & Ingber today. Call us at 1-877-529-4343 or complete the online form on this page to schedule a case evaluation with one of our experienced car accident attorneys. Our consultation is free, and we do not charge a fee unless we win your case.