Who's Responsible When a Self-Driving Car Breaks the Law?
In major cities, self-driving cars feel more mundane every day. You can be picked up by an Uber that greets you through the front screen, or use a food delivery service that never requires you to interact with a human being at all. But here is the question: if a self-driving car breaks a traffic rule, who is to blame? The passenger or the car's owner? Should either be fined?
It's a question that has surely come up among friends, or while driving alongside a self-driving car: driverless, autonomous vehicles are supposed to be the future. Most auto manufacturers, from Tesla to Toyota, have been working on their own version of the driverless car.
Open-road trials have now become the biggest and most recent step in testing the technology for widespread use.
Who's at fault?
What happens when a self-driving vehicle hits a pedestrian or another vehicle? Is the driver at fault, even if they were never in the car in the first place? Is it the artificial intelligence (AI) or automated-system developer who created the driving software? Is it the auto manufacturer who assembled and supplied the vehicle?
How should we handle insurance? Once you remove the act of driving and hand control to a computer or automated system, how do you determine what constitutes safe versus risky driving?
Furthermore, when is it legal to take your hands completely off the steering wheel and leave the vehicle in control? Should there be restrictions on what you can do in the car? Should you be allowed to browse social media or use your smartphone while the car navigates for you, for example?
If the vehicle's AI has to make a split-second decision between saving your life or the lives of passengers in another car nearby, how should it go about doing that?
You can easily see why many of these questions would give pause to even the technology's most enthusiastic proponents.
We have barely begun to think through the components and complications of this technology. In 2017, 33 states introduced driverless-vehicle legislation, up from 20 states the year before. Legislative activity is clearly ramping up.
Today, 29 states have laws dealing with autonomous vehicles and their systems. Federal law is another hurdle wherever it applies beyond local and state legislation. The National Highway Traffic Safety Administration released new guidelines for automated driving systems in September, and the Senate is also working on autonomous vehicle legislation.
How do lawyers approach this?
For practicing lawyers, this is uncharted territory: how do you apply driving law when no driver is involved? Established legal practice risks being upended as a result.
There is a lot to iron out before the autonomous technologies can operate unfettered and worry-free on our roadways. What is the insurance and product liability impact? How will privacy, security and customer data be handled? What happens when a vehicle experiences a data breach or cyberattack?
It is up to the legal world to assist with these policies, regulations and practices, which means step one is about understanding the technology, how it operates and what that means for anyone involved.
The truth is, while we know the technology is coming down the pipeline – and soon – we don't know what that means for the state of the market. How long will it take, for instance, before driverless vehicles make up the majority of cars on the road? When looking at a vehicle a human is driving versus one a computer is operating, how do you determine fault?
The only certainty, at this point, is that we still have a lot of questions to resolve before we can make this technology available to all.
The challenges listed by the panel include:
- Facilitating communication with a "responsible human being" who may be controlling the car remotely during a traffic stop or emergency;
- Intercepting autonomous cars used to smuggle drugs or people;
- Developing standards for checking the documentation of cars on the road when there is no human to display a driver's license or registration;
- Developing a way to stop and secure autonomous cars that are behaving erratically and endangering other vehicles or human drivers;
- Providing ways for a traffic police officer to communicate with the artificial intelligence system driving the vehicle; and
- Developing shared protocols between law enforcement and the artificial intelligence engineers driving this new segment of the automotive industry to ensure traffic safety and adherence to the rules of the road.
If the potential safety issues presented by driverless cars sharing the highway with traditional vehicles aren't addressed, there could be a long road of trouble ahead for the police, the panel said.
“As the technology rapidly advances in coming years, law enforcement agencies must accelerate their preparation for a radically altered reality on the roads,” the panel concluded.