Self-driving cars pose crucial question

Who to blame in a crash?
Image: Patrick T Fallon/Bloomberg

A debate over who to blame — or sue — when a self-driven car hits someone is holding up legislation the industry says it needs to advance.

“If another driver hits you, it’s clear who the driver is,” Sarah Rooney, senior director of federal and regulatory affairs for the American Association for Justice, said. “It’s the human being.”

Not so when a fully self-driving car hits another vehicle or a pedestrian. Then the fault may lie with the manufacturer and the software, or with the owner if updates have not been properly installed. And if the manufacturer is at fault, a victim may seek to sue under product liability standards, as with a conventional car.

The vehicles are still in the beta stage but the issues have held up legislation that would allow carmakers to test and sell tens of thousands of autonomous vehicles, something the industry says it needs to fully develop and eventually market the technology to consumers. A bill to do that sailed through the House several years ago but has been bogged down in the Senate over the liability question.

A move to merge the bill with must-pass legislation earlier this month faltered over an initiative by some manufacturers to include language that would prevent consumers from suing or forming class-action cases. Instead, the consumers would have to submit disputes to binding arbitration, something that is common with technology products but not automobiles.

That idea faced resistance from safety groups and trial lawyers, who are influential among Senate Democrats. The measure was pulled on the eve of a committee vote and supporters say they are still working to address the liability issue in the hopes of moving the legislation forward this year.

Uber crash

A handful of crashes involving Tesla Inc. vehicles with human drivers utilising the company’s Autopilot system, as well as the death of a pedestrian struck by an Uber Technologies Inc. self-driving test car in 2018, have focused attention on liability for vehicles now under development that have no steering wheels, gas pedals or other accommodation for human drivers.

Rooney, whose group represents trial lawyers who oppose limits on lawsuits, said liability issues will have to be worked out before any legislation authorizing the use of more automated vehicles on US roads should proceed.

A group of 15 consumer advocacy groups including Rooney’s association wrote a letter on May 17 to leaders on the House Energy and Commerce’s consumer protection subcommittee opposing mandatory arbitration. They expressed concern that automated vehicles may someday be operated by Uber or other companies and come with arbitration clauses in their terms of service.

“Unless legislation prevents manufacturers from doing so, they will insert extremely broad forced arbitration clauses into their contracts, blocking consumers from meaningful remedies if they are hurt or their privacy violated,” the letter read. “Even pedestrians’ claims could be kept out of court.”

South Dakota Republican Senator John Thune has proposed legislation calling for the National Highway Traffic Safety Administration to exempt as many as 15 000 self-driving vehicles per manufacturer from human-driving safety standards.

The number would rise to 80 000 within three years. Currently, an automaker can produce 2 500 of the vehicles for testing.

Thune’s bill

“Providing the automotive industry with the tools they need to safely test and deploy automated vehicles across the nation will create thousands of jobs and generate billions of dollars in investment,” Thune, who is a member of the Senate Commerce, Science and Transportation Committee, said in a statement.

The measure doesn’t address the arbitration issue, something Thune says should be considered separately. But opponents of the measure fear granting the exemptions before liability issues are worked out will give carmakers carte blanche to put thousands of self-driving cars on the road before the legal rules are set.

Jason Levine, executive director of the Center for Auto Safety, said the threat of litigation has served consumers as an important check on auto manufacturers for decades.

“Too often the most effective counterweight to vehicle defects and manufacturers prioritising profits over safety has not been the federal government, but instead has been the threat of litigation by crash victims,” he said.

The issue is even more important with the advent of self-driving car technology, Levine said.

Tech companies

“Consumers are right to wonder who will be held responsible for the defective computer code which operates a motor vehicle they are in when it kills or maims another human being,” he said. “Who will be held responsible for preventable tragedies, whether it is a passenger, or vehicle owner, or the manufacturer, or an individual software engineer, will be determined by choices made by Congress.”

Rooney said the friction over liability stems from the fact that self-driving cars are attracting interest from technology companies that do not have long histories in the auto industry. Traditional automakers know “if you manufacture a faulty ignition switch, you’re going to be held accountable,” she said.

“The tech companies that are interested in getting into the space have been allowed to use forced arbitration at will,” she said. “They do not want to get rid of it. If this was traditional auto manufacturers, this debate would probably be over and there would be a bill.”

Self-driving car supporters have argued that existing tort law contains principles for allocating fault and apportioning liability among parties.

“Decades of motor vehicle law have been applied to countless new technologies in the past and have already been applied to AVs,” Ariel Wolf, general counsel to the Self-Driving Coalition, which represents companies such as Ford Motor Co., Uber Technologies Inc., Lyft Inc. and Waymo LLC, said.

Missy Cummings, director of Duke University’s Humans and Autonomy Lab, said Congress should not rush to add new regulations for self-driving cars — or reduce them — while the technology is still in development.

“I think the experimental exemptions that we have are fine,” she said.

Many automakers have quietly backed off pronouncements made in the middle of the last decade that would have resulted in many more self-driving cars being on the road, she said.

“Without explicitly saying it, a lot of companies are realizing that self-driving cars are much further off than we initially realised,” she said.

Wolf said despite the intense debate over legal issues, self-driving cars have the potential to drastically reduce the number of car crashes and deaths on US roads.

“With an estimated 36 000 lives lost on US roads last year, autonomous vehicles offer a transformative opportunity to save lives, unlock new economic and mobility opportunities, and promote American leadership and innovation,” he said.

© 2021 Bloomberg

LOL — The quest for zero risk is a never-ending project, and if the Senate is going to try and wait, then the time will be immeasurable. The industries which develop these technologies will just give up on the US and move to countries that are not run by lawyers and lawyer wannabes.

I would like to add a further ethical problem: The car has AI which connects directly to its insurance company. In that split second of a potential crash, the “car” computes the following: “Will I crash into the car on the left, insured by the same company, or will I crash into the little old lady on the right, not insured by us?”

Self-driving cars are millennial cars – nothing is their fault

I raised exactly this question in a presentation I gave at the 2019 South African Road Safety Summit.

Traffic law relating to drivers pretty much all starts with “No person shall, on a public road…”. No mention of robots.

This is going to be problematic when the first autonomous vehicle crashes and kills a third party in SA, especially if it’s not a Level 5 device. Which “person” is the NPA going to charge? The manufacturer? A dev in Mumbai who wrote a few thousand lines of code on contract? The company that wrote the compiler or libraries? The chip manufacturer if it turns out to be a hardware design error? How will we separate driver error from machine error in court, since the two might appear near-identical?

Will we risk criminalising drivers when a vehicle does something unanticipated, even while they were hands-on? Imagine being a driver and having to defend yourself in a South African court in such a case when a US or European manufacturer refuses to release proprietary source code or engineering studies which could settle the issue. Even if the court compels them, would they? It certainly might be commercially more sensible to walk away from the SA market with its limited sales volumes than to comply.

The government does not inspire confidence that it could regulate vehicle autonomy effectively, nor even that anything at all is being done, despite a tsunami of driving assistance technology heading our way in the next half-decade.

After all, the AARTO Act was signed in 1998 and is still not in force a generation later.
