
Tesla Autopilot: Still Ironing Out the Wrinkles

Written By: Lindsay Charles and Brock Turville, Summer Student

Tesla, a California-based car company, is known for its electric vehicles and innovation in the automobile industry. The company also appears to be one of the leaders in the race to implement a fully autonomous autopilot system in its cars. However, Tesla’s current software is only semi-autonomous, requiring the driver to monitor the vehicle at all times. Collisions involving Tesla’s autopilot system have occurred in recent months, and reliability and safety remain major concerns.

Tesla’s system uses cameras, sensors, and radar to detect nearby cars and objects, allowing the car to automatically steer, change lanes, brake, and park. Apple software engineer Wei Huang was using this system when he died in March 2018 after his Tesla collided with a highway median in California. Data recovered from the system indicated that Huang’s hands were off the steering wheel for six seconds before the collision.[1]

On May 29, 2018, Tesla’s autopilot system was engaged when a 65-year-old man crashed into a police SUV in Laguna Beach, California. Fortunately, the man suffered only minor injuries, and no officers were in the police vehicle at the time of the collision.

In Greece, Tesla’s autopilot system allegedly caused one man’s car to swerve off course at a fork in the road, sending it into a median; fortunately, the driver was unharmed. The motorist voiced his concerns after the incident: “The vigilance required to use the software, such as keeping both hands on the wheel and constantly monitoring the system for malfunctions or abnormal behaviour, arguably requires significantly more attention than just driving the vehicle normally”.[2] Consumer groups in the United States have criticized Tesla’s semi-autonomous system as misleading to consumers: the name “autopilot” suggests full autonomy, yet the system still requires the driver to be engaged and alert.

As a result of these incidents, Tesla has reminded users that they must maintain control of the vehicle and keep their hands on the wheel. In a statement responding to these collisions, a Tesla spokesperson clarified that the system does not make the car “impervious to all accidents”. Tesla further explained that before using the system, the user must accept a dialogue box stating that “autopilot is designed for use on highways that have a center divider and clear lane markings”.[3]

The jury is out on autonomous vehicles. Stay tuned for our next update!

[1] “Tesla car that crashed and killed driver was running on Autopilot, firm says.” The Guardian. March 31, 2018. https://www.theguardian.com/technology/2018/mar/31/tesla-car-crash-autopilot-mountain-view.

[2] “Tesla hit parked police car ‘while using Autopilot’.” BBC News. May 30, 2018. https://www.bbc.com/news/technology-44300952.

[3] Solon, Olivia. “Tesla that crashed into police car was in ‘autopilot’ mode, California official says.” The Guardian. May 29, 2018. https://www.theguardian.com/technology/2018/may/29/tesla-crash-autopilot-california-police-car.
