
Auto industry lawyers warn on automated driving hype

(Reuters) — Deadly crashes involving Tesla Inc. and Uber Technologies Inc. vehicles operating entirely or in part under automated systems have made a once-abstract problem very real for auto industry lawyers gathered at a recent conference.

It is crucial that companies accurately outline the limitations of automated driving systems and the circumstances in which those systems can and cannot take over steering, braking and lane-keeping, said attorneys for the U.S. units of carmakers including Hyundai Motor Co., Toyota Motor Corp. and Volkswagen A.G., as well as supplier Continental A.G.

“The OEMs (carmakers) right now are trying really hard to accurately describe what this equipment can do and can’t do,” said Tom Vanderford, associate general counsel at Hyundai, at a conference last Friday in Phoenix attended by Reuters.

The American Bar Association conference took place just a few miles from the scene of a fatal accident involving an Uber test vehicle in autonomous mode. The crash cranked up pressure on the self-driving vehicle industry to prove its software and sensors are safe.

On Thursday, Tesla came under more pressure from regulators and consumer groups for its response to a deadly crash in California of a Tesla vehicle operating on Autopilot, the name the company uses for an enhanced cruise control system.

Fully automated vehicles are not expected to become available to consumers for several years. However, increasingly sophisticated driver assistance features are already sold under various names, carrying disclaimers warning that drivers cannot rely on the systems to safely operate the car in all circumstances.

Such technologies include collision avoidance systems that steer the car away from pedestrians, automatic braking systems, and audio and visual alerts triggered when cameras, radar or other sensors detect obstacles.

David Cades, a human factors scientist at engineering consulting firm Exponent, said terminology matters in descriptions of these systems because people might misuse or misunderstand the technology.

Mr. Cades, who has testified in automotive cases as an expert witness, said automakers should not use the term “collision avoidance system.” Instead, he urged manufacturers to use terms such as “collision mitigation systems.”

“Even in naming and marketing these systems, care needs to be taken in how they are promoted,” Mr. Cades said.

John Gersch, managing counsel at Toyota and another conference participant, pointed to a promotional video shown at the conference, in which Tesla touts its Autopilot semi-automated system.

“The overreliance issue is probably the most serious issue with all these systems, so that goes with the Tesla [video] that was shown there,” Mr. Gersch said.

Even though its marketing materials feature the automated system, Tesla’s owner’s manual tells buyers that they are required to keep their hands on the wheel at all times when the system is activated.

The family of a driver killed last month in a Tesla car crash has hired law firm Minami Tamaki L.L.P. to explore legal options, the firm said on Wednesday, adding that the Autopilot feature was defective and probably caused his death. Tesla said in a statement that Walter Huang, the victim in the California crash, “was well aware that Autopilot was not perfect.” Tesla added that Mr. Huang told his family that Autopilot “was not reliable in that exact location” of the crash, and said he took his hands off the wheel “several times” before the crash.

Investigating a fatal 2016 crash, the National Transportation Safety Board said last year that Autopilot lacked safeguards, giving the driver too much leeway to divert attention. In a statement on Thursday, the NTSB urged Tesla to act on the safety recommendations in that report.

Following the 2016 accident, Tesla introduced more frequent warnings to drivers to keep their hands on the wheel. After three warnings, the software now blocks Autopilot until the driver stops and restarts the car.

“Tesla warns, but in products liability warnings don’t protect you against design defect claims,” said University of South Carolina law professor Bryant Walker Smith, who focuses on automated driving.

Car accident litigation usually turns on a driver’s alleged negligence. By contrast, a lawsuit involving automated technology could scrutinize whether the system had a design defect.

Mr. Smith said plaintiffs suing Tesla could point to alternative designs, such as General Motors Co.’s semi-autonomous Super Cruise system, which tracks eye movement to monitor whether the driver is paying attention.

GM says Super Cruise relies on a predefined map and allows hands-off driving only on designated highways, and only when the driver is paying attention to the road.

Any litigation arising from accidents involving fully or partially automated vehicles could also pit software suppliers against vehicle manufacturers.

Industry lawyers said carmakers increasingly indemnify smaller technology companies working on self-driving features.

But Tammy Fanning, deputy general counsel for parts maker Continental, which manufactures radar for blind spot detection systems, told the conference the general automotive liability chain will not change.

“From a product liability perspective, it will all stay the same: Suppliers will be paying for everything in the end, every recall, all replacements,” Ms. Fanning said.
