Torts of the future: Self-driving cars
Most new cars have components that are considered “self-driving” because they regulate the way the vehicle travels and help avoid accidents, but the technology is not yet perfect, and American law has yet to catch up.
Who is to blame if a self-driving vehicle is involved in an accident? Several test cases are working their way through the courts, and the results could have a significant impact on tort law, the area of law that deals with civil fault.
Among the test cases in the courts:
- A woman walking a bicycle across a street at dusk in Tempe, Arizona, was killed by a self-driving Uber vehicle that did not stop before colliding with her. The National Transportation Safety Board’s investigation found that the vehicle had detected the pedestrian, Elaine Herzberg, several seconds before the crash, but that the automatic emergency braking system had been disabled to reduce jerkiness. In March 2019 local courts found no criminal liability for the accident, but that does not preclude an appeal or a suit for civil damages, the province of tort law.
- A Tesla Model 3 crashed on a highway in south Florida when its autonomous driving feature failed to detect and avoid a truck in its path. The feature had been engaged only seconds before impact.
- In a similar Florida crash, a car in self-driving mode collided with a truck that was turning across its travel lane, killing the Tesla driver. In both Tesla-related accidents, investigators observed that the driver might have avoided the crash if the vehicle had not been in self-driving mode and the driver had been alert to the trucks’ movements ahead.
- In three separate accidents, Tesla vehicles traveling in self-driving mode slammed into parked fire trucks. Rather than being a matter of the fire trucks’ visibility, the accidents reveal a glitch in the self-driving mode: a stationary vehicle in the travel lane often cannot be detected, particularly if the self-driving car is following a moving vehicle that leaves the lane to avoid the parked vehicle.
In all of these accidents, the investigations are of particular importance because their findings determine whether a suit can be filed against the manufacturer for civil damages. One case, in the Dominican Republic in 2015, occurred when a Volvo owner demonstrating his car’s self-parking feature drove it into a crowd of onlookers. It turned out that the car was not equipped with pedestrian-avoidance technology because the owner had opted not to pay extra for the feature.
Manufacturer vs. user error
Tort law is based on finding fault and awarding damages accordingly. But the self-driving vehicle is a new arena for tort law. Can drivers be found at fault for trusting manufacturers’ promises about vehicle safety features?
In the past, there were many instances of manufacturer negligence in vehicle accidents. Defective airbags exploded, sending shrapnel into drivers’ and passengers’ faces; some vehicles were plagued by unexplained sudden acceleration; and some, like the Ford Pinto, were simply manufactured in such a way that they were hazardous, as low-speed rear-end collisions could cause the gas tank to explode.
Self-driving vehicles, however, rely on the interface between an attentive driver and highly technical components that read oncoming obstacles and rate them for crash avoidance. These vehicles are capable of slowing the car in heavy traffic to maintain a safe following distance and of adjusting for conditions like rain and slick roads.
One issue that has surfaced repeatedly in accidents involving self-driving vehicles is that the operator was warned ahead of time, by audible alerts and flashing signals, to be more attentive (the car knows if the driver’s hands are not on the wheel, poised to take over). In the case of the pedestrian killed by the Uber vehicle, the car was monitored by a human operator, and the braking system may have been disabled because it was inconvenient. People simply do not internalize the need to act quickly and carefully when warned; we are complacent about our role in accident liability, and that complacency is exacerbated by new self-driving features that do much of the thinking for us.
Many compromises are made on the road to changing the way we drive. One part of the issue is the tipping point for vehicle manufacturers: the level of financial loss from manufacturing negligence at which profits begin to suffer. Many are willing to overlook a defect for a time, despite the loss of lives, because repairing it would cost more than the associated lawsuits; the autopilot technology that can detect and respond to moving objects near the car but not to parked fire trucks is one example. Unfortunately, those figures also represent human lives.
Further clouding the legal landscape, the city of Tempe, Arizona was hit with a $10 million lawsuit for designing the area where Elaine Herzberg was struck in a way that invited pedestrians to cross without a crosswalk, one the self-driving Uber vehicle might have flagged. Indeed, the state governor invited the autonomous vehicle industry to use Arizona’s streets as its test kitchen, forever changing the long-established dynamic of human-to-human judgment between pedestrians and drivers.
Self-driving and autonomous vehicles promise to be life-savers in the years ahead, eliminating many accidents rooted in driver error and negligence, such as driving under the influence and distracted driving. Approximately 37,000 Americans die in auto accidents each year, and lowering that figure by even 10 percent would be a substantial gain, even if a few lives are sacrificed in refining the technology that achieves it.