The National Highway Traffic Safety Administration (NHTSA) is investigating a fatal May crash involving a Tesla Model S with its Autopilot system engaged. The occupant’s death is the first known fatality involving a self-driving car. The debate over who is responsible for the accident could shape the future of both the technology and the practice of plaintiff’s attorneys.

An estimated 35,200 people were killed in motor vehicle accidents in 2015, so auto accidents are, unfortunately, a well-litigated area of the law. But the driver-assistance technology installed in the Tesla makes this a case of first impression that many personal injury lawyers will watch closely. The family of the victim has hired an attorney, so over the next few years we will see whether the case is resolved or a court must grapple with the issues introduced by driverless cars.

The impact of this new technology on the law has already been debated briefly in the media, as it is an entirely foreseeable problem. Over a year ago (April 3, 2015, in fact), U.S. News & World Report asked, “Will Tesla’s drivers be accountable for autonomous car crashes – or will Tesla?” Indeed, Volvo, Mercedes-Benz and Google have reportedly answered this question: they will accept full liability for collisions caused by their self-driving cars while in autonomous mode.

But the impact may extend to the legal profession as well. Self-driving cars are expected to eliminate millions of American jobs, and lawyers may be among the professions affected. Some believe the cars will decrease the number of accidents by eliminating human error and distracted driving. Others disagree, calling the technology “fertile ground for lawyers,” according to an article in Bloomberg, because it exposes more parties to potential liability for accidents than just the driver and the insurance company.

It wasn’t long ago (February 2016) that we were discussing the first accident caused by a self-driving car, in which a Google car changing lanes collided with a bus, an incident captured on video. At that time, widespread use of the technology seemed much further off, with every system still in testing. Now, autonomous driving technology is still in testing, but it is on the road in the hands of consumers. There has been a fatality, and the law will begin to apportion fault.

The Accident

The accident at issue involved an Ohio resident operating a Tesla in Florida. Although the federal government and many states have yet to weigh in on driverless cars, Florida passed a law permitting individuals with a valid driver’s license to operate autonomous vehicles on public roads. It is one of a handful of states to have already enacted laws regulating or permitting the operation of these types of motor vehicles.

The Tesla was operating as expected on a divided highway. A white tractor-trailer traveling in the opposite direction made a left turn in front of the Tesla. As the truck crossed the highway, the Tesla’s Autopilot system failed to detect it and did not slow down or swerve to avoid the collision. The operator did not take control of the vehicle, and the car hit the truck.

Some of the Legal Issues

1. Insurance

The operator carried normal vehicle coverage and did not have any special insurance to operate a driverless vehicle. Insurance companies, like state and federal governments, are still confronting the question of how to operate in a world with driverless cars. Should there be special insurance to cover the risks of these cars and their new software? Will insurers deny claims if the operator was not actually driving? We expect insurance companies and regulators are now working on these issues with additional vigor.

2. Beta Testing

The Tesla Autopilot was in beta testing. Software companies use the “beta” label to designate technology that is still in development but ready for testing outside the company. Tesla started rolling out the beta test of its driver-assistance system back in August of 2015.

Google, for example, has taken a different approach. It has been developing its vehicles for seven years and has logged more than 1.5 million miles of testing. Yet it hasn’t released a car to consumers, because it believes drivers easily become distracted and a car must be fully autonomous in order to be safe.

Following the accident, there have been several articles in the media questioning the public beta concept.

3. Warnings

The Tesla reportedly warned drivers to remain alert and be prepared to take over control of the vehicle every time they engaged Autopilot. The manual for the Model S contained numerous warnings about relying on the technology. It warned about situations in which the Traffic-Aware Cruise Control would not detect a vehicle. It warned about bright light interfering with the camera’s view (the system apparently failed to detect the white trailer against the bright sky). It also indicated that, in such situations, the system may not slow down.

With warnings such as these, the question will be whether they are legally adequate to defeat a consumer product injury lawsuit. Did the driver truly appreciate that there might be insufficient time to react to a mistake by the autonomous system? In contrast to the warnings issued, Fortune criticized the automaker for sending mixed messages about the “hands on” nature of the technology after Elon Musk promoted operators using the car without paying attention.

4. Who is at fault?

A truck that turns in front of oncoming traffic would usually be considered at fault in an accident. But if the Tesla could have stopped or avoided the collision, is the driver of the other vehicle still responsible? Is the person who failed to take control of the automated system now to blame? What about the car manufacturer that released a beta system onto the nation’s roadways? We will see how these questions are answered now and in the future.

Safety Updates and Recalls

Tesla has the ability to push software updates wirelessly to its vehicles. Over the next few weeks, it is expected to release Autopilot 8.0 to car owners. The ability to fix consumer problems faster than the ordinary safety recall system could itself change the game on liability. Would Tesla be liable if it was aware of a problem in the system but had not yet released the fix to consumers because it was still testing the solution?

Which is Safer?

In regulatory discussions with the NHTSA, Google has argued that humans should not be permitted to override the computer system because doing so could be detrimental to safety. If that is true, one could argue it should lead to the absolution of the software system from liability, because the software is inherently less risky than the driver behind the wheel.

Other Questions: DUI, Multiple Vehicles

Will the government eventually relax laws on driving while intoxicated if the operator is no longer actually driving the vehicle? If a vehicle carrying an intoxicated operator gets into an accident, will that person be liable?

Will it even be possible to determine fault when two driverless cars collide? Aren’t both cars going to be built to avoid accidents, and shouldn’t both vehicles therefore be able to take measures to prevent the crash?

Federal Regulations

The NHTSA is expected to announce guidelines on self-driving cars later this month. The government is expected to facilitate their development because the cars are ultimately expected to make travel safer. Yet the transition from the current system to a future without drivers could be risky. And widespread adoption of the technology could affect many federal and state laws designed for a world in which humans control their vehicles.

Stay Tuned

There seems little doubt that this technology could change the world. Because of that, the systems will be under tremendous scrutiny as they are rolled out onto the nation’s roadways. And the rules, both the regulatory standards applied to cars and the system of compensating victims of car crashes, will likely be rewritten. As the first serious accident of its kind, this case could set precedent for years to come.

Our firm is following self-driving cars from both the personal injury and whistleblower angles. Because our trial attorneys handle personal injury lawsuits involving catastrophic car crashes, it’s an area that will ultimately impact (either positively or negatively) a portion of their practice.

It is also of interest to our whistleblower attorneys because of the new auto whistleblower law, which offers industry employees rewards for information about delayed safety recalls. If auto manufacturers put unsafe vehicles on the road and are aware of their dangers without recalling them, they could face monetary sanctions. Given the current regulatory uncertainty, this could be a fertile area for whistleblowing in the future.