Questions are growing about the safety of Tesla’s “Full Self-Driving” system
William Stein, a technology analyst at Truist Securities, has accepted Elon Musk’s invitation three times in the past four months to test out the latest versions of Tesla’s vaunted “Full Self-Driving” system.

A Tesla equipped with the technology, the company says, can travel from point to point with little human intervention. Yet each time Stein drove one of the cars, he said, the vehicle made unsafe or illegal maneuvers. His most recent test drive earlier this month, Stein said, left his 16-year-old son, who was accompanying him, “terrified.”

Stein’s experiences, along with a crash in the Seattle area in April in which a Tesla using Full Self-Driving killed a motorcyclist, have drawn the attention of federal regulators, who have been investigating Tesla’s automated driving systems for more than two years after dozens of crashes raised safety concerns.

The problems have made those who keep an eye on autonomous vehicles more skeptical that Tesla’s automated system can ever operate safely on a large scale. Stein says he doubts Tesla will even come close to deploying a fleet of autonomous robotaxis next year, as Musk has predicted.

The latest incidents come at a critical time for Tesla. Musk has told investors that it’s possible that full self-driving could operate more safely than human drivers by the end of this year, or perhaps next year.

And in less than two months, the company will unveil a vehicle purpose-built as a robotaxi. To get Tesla robotaxis on the road, Musk has said the company will have to show regulators that the system can drive more safely than humans. Under federal rules, the Teslas would have to meet national vehicle safety standards.

Musk has released data showing how many miles were driven per crash, but only for Tesla’s less-advanced Autopilot system. Safety experts say the data is invalid because it only counts serious crashes with airbag deployment and doesn’t show how often human drivers had to take over to avoid a collision.

Full Self-Driving is used on public roads by about 500,000 Tesla owners — just over one in five Teslas now in use. Most of them paid $8,000 or more for the optional system.

The company has warned that cars equipped with the system cannot drive themselves and that drivers should be ready to intervene at all times if necessary. Tesla also says it is monitoring each driver’s behavior and will suspend their ability to use Full Self-Driving if they fail to properly monitor the system. The company recently began calling the system “Full Self-Driving” (Supervised).

Musk has acknowledged that his previous predictions about the use of autonomous driving have been overly optimistic. In 2019, he promised a fleet of autonomous vehicles by the end of 2020. Five years later, many who follow the technology doubt whether it can work across the U.S. as promised.

“It’s not a given and it’s not going to happen next year,” said Michael Brooks, executive director of the Center for Auto Safety.

The car Stein was driving was a Tesla Model 3, which he picked up at a Tesla showroom in Westchester County, north of New York City. The car, Tesla’s least expensive vehicle, was equipped with the latest Full Self-Driving software. Musk says the software now uses artificial intelligence to help control the steering and pedals.

During his drive, Stein said, the Tesla felt smooth and more human than previous versions. But during a drive of less than 10 miles, he said, the car made a left turn from a through lane while running a red light.

“That was amazing,” Stein said.

He said he didn’t take control of the car because there was little traffic and the maneuver didn’t seem dangerous at the time. Later, however, the car drove down the middle of a parkway, straddling two lanes of traffic moving in the same direction. This time, Stein said, he intervened.

The latest version of Full Self-Driving, Stein wrote to investors, “does not solve for autonomy” as Musk predicted. It also “doesn’t seem to come close to robotaxi capabilities.” During two previous test drives he took, in April and July, Stein said Tesla vehicles also surprised him with unsafe maneuvers.

Tesla did not respond to requests for comment.

Stein said he thinks Tesla will eventually make money from its ride-hailing technology, but he doesn’t foresee a driverless robotaxi carrying a passenger in the back seat coming anytime soon. He predicted it would be significantly delayed or limited in where it can travel.

Stein stressed that there is often a big difference between what Musk says and what is likely to happen.

Many Tesla fans have posted videos on social media of their cars driving themselves with no one taking control. But the videos don’t show how the system performs over time, and others have posted videos showing dangerous behavior.

Alain Kornhauser, head of autonomous vehicle research at Princeton University, said he drove a Tesla borrowed from a friend for two weeks and found the car consistently noticed pedestrians and other drivers.

But while it performs well most of the time, Kornhauser said he had to take control when the Tesla made moves that scared him. He cautions that Full Self-Driving isn’t ready to be left unattended in all locations.

“This thing,” he said, “isn’t at a point where it can go anywhere.”

Kornhauser said he thinks the system could operate autonomously in smaller areas of a city where detailed maps help guide the vehicles. He wonders why Musk doesn’t start offering rides on a smaller scale.

“People could really use the mobility that this provides,” he said.

For years, experts have warned that Tesla’s system of cameras and computers can’t always spot objects and figure out what they are. Cameras can’t always see in bad weather and darkness. Most other autonomous robotaxi companies, such as Alphabet Inc.’s Waymo and General Motors’ Cruise, combine cameras with radar and laser sensors.

“If you can’t see the world correctly, you can’t plan, move and execute the world correctly,” said Missy Cummings, a professor of engineering and computing at George Mason University. “Cars can’t do it with vision alone,” she said.

Even cars with lasers and radar can’t always drive reliably, Cummings said, raising safety concerns about Waymo and Cruise. (Representatives for Waymo and Cruise declined to comment.)

Phil Koopman, a Carnegie Mellon University professor who researches self-driving car safety, says it will be many years before self-driving cars powered solely by artificial intelligence can handle all real-world situations.

“Machine learning has no common sense and learns narrowly from a large number of examples,” Koopman said. “If the computer driver gets into a situation it has not been trained for, it is likely to crash.”

In April, in Snohomish County, Washington, near Seattle, a motorcyclist was killed by a Tesla using Full Self-Driving, authorities said. The Tesla driver, who has not yet been charged, told authorities he was using Full Self-Driving while looking at his phone when the car struck the motorcyclist from behind. The motorcyclist was pronounced dead at the scene, authorities said.

The National Highway Traffic Safety Administration said it is reviewing information on the fatal crash from Tesla and from law enforcement officials. The agency also said it is aware of Stein’s experience with Full Self-Driving.

NHTSA also noted that it is investigating whether a Tesla recall earlier this year, intended to strengthen the system that monitors drivers, was actually successful. The agency also urged Tesla to recall Full Self-Driving in 2023 because, in “certain rare circumstances,” it said, the system could violate traffic laws, increasing the risk of a crash. (The agency declined to say whether it has completed its review or whether the recall accomplished its goal.)

As Tesla’s electric vehicle sales have slowed in recent months despite price cuts, Musk has told investors to view the company more as a robotics and artificial intelligence company than a car company. But Tesla has been working on Full Self-Driving since 2015.

“I would urge anyone who doesn’t believe Tesla is going to find the solution for autonomous vehicles not to buy Tesla stock,” he said during an earnings conference call last month.

However, Stein told investors they would have to decide for themselves whether Full Self-Driving, Tesla’s artificial intelligence project “with the most history, that is currently generating revenue and is already being used in the real world, actually works.”
