Published on September 17th, 2021
Tesla's latest Full Self-Driving beta shows the technology is still very bad
One concern with self-driving technology is how it interacts with the cars and pedestrians around it. Human drivers know the unspoken rules of the road about how to interact with others in a way that's predictable, so that other drivers know how to act in concert. Drivers move through an intersection in the order they arrived there, for instance, and maintain space from pedestrians and bicyclists. But Tesla's fully autonomous software, still in beta, absolutely does not know how to properly behave.
Recently, a driver named Kim Paquette shared a video on YouTube of her Tesla driving through the streets of Providence, Rhode Island, with Full Self-Driving (FSD) enabled. The software promises to eventually drive the car entirely on its own, but Tesla has been using a limited number of drivers as guinea pigs to test the technology and provide data as it continues improving FSD. Paquette's video shows it has a way to go.
Scary AF – As you watch Paquette's car try to navigate the narrow, busy streets, it's harrowing how many times she nearly crashes into another car before disengaging FSD and taking over control. Each time she does so, a "bong" tone sounds, and that tone sounds constantly throughout her 18-minute recording as the car jerks around unpredictably, pulling out into intersections and nearly getting T-boned by cars it failed to see around corners.
The software just does weird things. For instance, it stops far back from a stop sign, then starts to pull out into the intersection before it can see whether a car is coming, forcing Paquette to quickly hit the brake. A human driver would inch out to make sure no cars are about to pass through – the Tesla does not. It also gets far too close to parked cars on the street. At one point in the video, the car gives space to cyclists on the side of the road before suddenly jerking toward them, and Paquette again has to intervene. Later it tries navigating around traffic... on a single-lane street, something she says the car attempts often, as it doesn't always understand when a street has only one lane.
"[The car] seems very confused today," she comments. "It's very jerky and unsure."
Danger to society – The whole video is anxiety-inducing, and it makes it even harder to believe CEO Elon Musk when he proclaims that FSD is "amazing" and can nearly drive itself already. Maybe on wide, lazy streets in suburban California, but Paquette's video shows it's not only bad in a busy city – it's actually pretty dangerous.
Proponents will point out that all this data feeds back into Tesla's development of autonomous tech. Each time a driver corrects the car, in theory, that information goes back into the company's "neural network," a machine learning system that uses data from all the cars to learn and get smarter. But this video should make people uncomfortable being around a Tesla. Another way to think about it: anyone walking around a city becomes a human test subject whenever wealthy Tesla owners hand their vehicles control to test out and improve FSD.
When Musk first introduced Autopilot, an advanced driver-assistance technology, back in 2015, he said full autonomy would arrive within three years. That timeline has been pushed back repeatedly, along with the rest of the industry's – though at least the rest of the industry has been testing its technology in far more controlled environments. FSD is likely years away from being usable – maybe longer if regulators crack down.