Tesla’s latest Full Self-Driving beta shows the technology is still very bad

Published on September 17th, 2021





One concern with self-driving technology is how it interacts with the cars and pedestrians around it. Human drivers know the unspoken rules of the road and behave predictably, so that everyone else can act in concert: drivers move through an intersection in the order they arrived, for instance, and keep their distance from pedestrians and cyclists. But Tesla’s Full Self-Driving software, still in beta, absolutely does not know how to behave properly.

Recently, a driver named Kim Paquette shared a video on YouTube of her Tesla driving through the streets of Providence, Rhode Island, with Full Self-Driving (FSD) enabled. The software promises to eventually drive the car entirely on its own, but Tesla has been using a limited number of drivers as guinea pigs to test the technology and provide data as it continues improving FSD. Paquette’s video shows it still has a way to go.

Scary AF — As you watch Paquette’s car try to navigate the narrow, busy streets, it’s harrowing how many times she nearly crashes into another car before disengaging FSD and taking over. Each time she does, the car emits a “bong” tone, and that tone sounds constantly throughout her 18-minute recording as the car jerks around unpredictably, pulling out into intersections and nearly getting T-boned by cars it failed to see around corners.

The software just does weird things. For instance, it stops far back from a stop sign, then starts to pull out into the intersection before it can see whether a car is coming, forcing Paquette to quickly hit the brake. A normal human would inch out to make sure no cars are about to pass through; the Tesla does not. It also gets way too close to parked cars on the street. At one point in the video, the car gives space to cyclists on the side of the road before suddenly jerking toward them, forcing Paquette to intervene again. Later it tries navigating around traffic... on a single-lane street, something she says the car attempts often because it doesn’t always understand when a street has only one lane.

“[The car] seems very confused today,” she comments. “It’s very jerky and unsure.”

Danger to society — The whole video is anxiety-inducing, and it makes it even harder to believe CEO Elon Musk when he proclaims that FSD is “amazing” and can nearly drive the car on its own already. Maybe on wide, lazy streets in suburban California, but Paquette’s video shows that in a busy city the software isn’t just bad — it’s actually pretty dangerous.

Proponents will point out that all this data feeds back into Tesla’s development of autonomous tech. Each time a driver corrects the car, in theory, that information goes back into the company’s “neural network,” a machine learning system that uses data from across the fleet to learn and get smarter. But this video should make people uncomfortable about simply being near a Tesla. Put another way: pedestrians walking around a city become unwitting test subjects whenever wealthy Tesla owners hand control to their vehicles to test out and improve FSD.
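To make that feedback loop concrete, here is a minimal, purely illustrative sketch of how a driver correction might be captured as a labeled training example. Every name in it (DisengagementEvent, FleetDataCollector, training_batch) is hypothetical; nothing below describes Tesla’s actual software.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical sketch of a fleet-learning feedback loop: when a driver
# overrides the system, the moment of the override is saved as a labeled
# example for later retraining. All names are illustrative, not Tesla's.

@dataclass
class DisengagementEvent:
    vehicle_id: str
    timestamp: datetime
    location: tuple[float, float]   # (latitude, longitude)
    planned_action: str             # what the software was about to do
    driver_action: str              # what the human did instead

@dataclass
class FleetDataCollector:
    events: list[DisengagementEvent] = field(default_factory=list)

    def record(self, event: DisengagementEvent) -> None:
        """Queue a correction so it can later serve as a training example."""
        self.events.append(event)

    def training_batch(self) -> list[dict]:
        """Turn raw events into (situation, correct_action) pairs."""
        return [
            {"situation": (e.location, e.planned_action),
             "label": e.driver_action}
            for e in self.events
        ]

collector = FleetDataCollector()
collector.record(DisengagementEvent(
    vehicle_id="demo-001",
    timestamp=datetime.now(timezone.utc),
    location=(41.8240, -71.4128),   # Providence, RI
    planned_action="pull into intersection",
    driver_action="brake and wait for cross traffic",
))
print(collector.training_batch())
```

In a real fleet the “situation” would be camera and sensor data rather than a pair of strings, but the shape of the loop is the same: human corrections become the labels the system trains against.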

When Musk first introduced Autopilot, an advanced driver-assistance technology, back in 2015, he said that full autonomy would arrive within three years. That timeline has kept slipping, as it has across the rest of the industry. At least the rest of the industry has been testing its technology in much more controlled environments. FSD is likely years away from being usable — maybe longer if regulators crack down.
