
In self-driving cars, drivers and standards are coming up short

Updated: 2018-04-09 09:58
A test driver removes his hands from the steering wheel of a Tesla Model S electric vehicle fitted with self-driving technology. [Photo/Agencies]

Autonomous cars should be required to improve their ability to detect potential hazards and better ways are needed to keep their human drivers ready to assume control, auto safety and technology experts said after fatal crashes involving Uber Technologies Inc and Tesla Inc vehicles in the United States.

"Humans don't have the ability to take over the vehicle as quickly as may be expected" in those situations, said self-driving expert and investor Evangelos Simoudis.

In the Uber crash last month, the ride services company was testing a fully driverless system intended for commercial use when the prototype vehicle struck and killed a woman walking across a road in Arizona, the United States.

Video of the crash, taken from inside the vehicle, shows the driver at the wheel, who appears to be looking down and not at the road. Just before the video stops, the driver looks upwards toward the road and suddenly looks shocked.

In the Tesla incident last month, which involved a car that any consumer can buy, a Model X vehicle was in semi-autonomous Autopilot mode when it crashed, killing its driver. The driver had received earlier warnings to put his hands on the wheel, Tesla said.

Some semi-automated cars, like the Tesla, employ technologies that help drivers stay in their lane or maintain a set distance from the vehicle ahead. Those systems rely on alerts - beeping noises or a vibrating steering wheel - to get drivers' attention.

Immature technology

Duke University mechanical engineering professor Missy Cummings said the recent Uber and Tesla crashes show the "technology they are using is immature."

Tesla says its technology is statistically proven to save lives through better driving. In a response to Reuters on Tuesday, Tesla said drivers have a "responsibility to maintain control of the car" whenever they enable Autopilot and need to be ready to respond to "audible and visual cues".

An Uber spokesperson said "safety is our primary concern every step of the way".

A consumer group, Advocates for Highway and Auto Safety, says a bill on self-driving cars now stalled in the US Senate is an opportunity to improve safety, a marked shift from the bill's original intent of quickly allowing self-driving cars without human controls to be tested on public roads.

The group has proposed amending the bill, the AV START Act, to set standards for those vehicles, for instance, requiring a "vision test" for automated vehicles to test what their different sensors actually see.

The group believes the bill should also cover semi-automated systems like Tesla's Autopilot - a lower level of technology than what is included in the current proposed legislation.

Other groups have also put forth proposals on self-driving cars, including requiring the vehicles, and even semi-automated systems, to meet performance targets; greater transparency and data-sharing from makers and operators of the vehicles; increased regulatory oversight; and better monitoring of and engagement with human drivers.

Others want to focus on the human driver. In November, Consumer Reports magazine called on automakers to use responsible labeling "to help consumers fully understand" their vehicles' autonomous functions.

Jake Fisher, Consumer Reports' head of automotive testing, said human drivers "are bad at paying attention to automation and this technology is not capable of reacting to all types of emergencies.

"It's like being a passenger with a toddler driving the car," he said.

The Massachusetts Institute of Technology is doing tests using semi-automated vehicles including models from Tesla, Volvo, Jaguar Land Rover and General Motors.

The aim is to see how drivers use semi-autonomous technology - some watch the road with their hands above the wheel, others do not - and which warnings get their attention.

"We just don't know enough about how drivers use any of these systems in the wild," said MIT research scientist Bryan Reimer.

Timothy Carone, an autonomous systems expert and professor at Notre Dame University's Mendoza College of Business, said autonomous technology's proponents must "find the right balance so the technology is tested right, but it isn't hampered or halted". "Because in the long run it will save lives," he said.

Reuters
