
Published on April 23rd, 2022


The problem with self-driving cars is not technology; the problem is people



The prospect of autonomous cars aiding, even replacing, human drivers is exciting. Advertised benefits include reduced commuter stress and improved traffic flow. The prospect is also alarming. The growing number of accidents involving self-driving technology tests the risk appetites of even the most enthusiastic adopters. The challenges are real. Uber, an early adopter of self-driving car technology, recently abandoned its ambitions of full autonomy. The recent $2.5 billion fine against Boeing over the 737 Max disaster exposes the underlying vulnerabilities that accompany the introduction of new technology.

There has been ample review of the underlying technology, but there are far too few discussions about the role of people. What happens when we replace human judgment with technology, a situation that psychologists call "cognitive offloading"? Cognitive offloading has become more common with the introduction of new technologies. Do you rely on your phone to store phone numbers you once memorized? Do you use GPS navigation instead of memorizing your driving routes? Then you know the benefits of cognitive offloading. Cognitive offloading transfers routine tasks to algorithms and robots and frees up your busy mind to deal with more important activities.

In an upcoming edition of the peer-reviewed journal Human Performance in Extreme Environments, I review the unintended consequences of cognitive offloading in industries like aviation and aerospace. Despite its many benefits, cognitive offloading also introduces a new set of problems. When we offload activities, we also offload learning and judgment. In one study, researchers asked a group of subjects to navigate the streets of London using their own judgment. A second group relied on GPS technology as their guide. The GPS group showed significantly less activity in brain regions associated with learning and judgment. In the case of self-driving cars, drivers may see their driving skills degrade over time.

Two primary deficits can accompany cognitive offloading. First, cognitive offloading can lead to forgetfulness or failure to learn even basic operating procedures. The problem becomes acute when equipment fails, when the weather is harsh, and when unexpected situations arise. In aviation, even carefully selected and highly trained pilots can experience these deficits. Pilots failed to perform basic tasks in the Air France 447 disaster. An airspeed sensor failed, and the autopilot disengaged. The pilots were now in control of the plane but had never learned, or had forgotten, how to regain control of the aircraft as it quickly descended into disaster.

Second, cognitive offloading also leads people to overestimate the value of offloading, and this can lead to overconfidence. People may fail to grasp how offloading degrades their abilities or how it encourages them to apply new technologies in unintended ways. The consequences can be severe. The Boeing 737 Max incidents were attributed, in part, to overconfidence in the technology. One pilot even celebrated that the new technology was so advanced, he could learn to master the newly equipped aircraft by training on a tablet computer. But the technology and engineering proved far more complicated to operate. The same type of overconfidence has led to accidents in self-driving cars. Some drivers of self-driving cars have slept at the wheel, and others have left their seat completely, despite warnings that the driver should always be aware and engaged while in self-driving mode.

"When we offload activities, we also offload learning and judgment."





Commercial aviation offers lessons for ways to address these deficits. Technological innovation has fueled remarkable gains in safety. The fatality rate in commercial airlines has been cut in half over the last decade. Importantly, implementation of new technology goes hand in hand with extensive training in human factors. The study of human factors considers the limitations of human decision making, motor skills, and attention. The safe implementation of new technologies requires extensive training and constant updating that helps pilots understand the limits of the technology.

Proposed solutions to the human factor problem in self-driving cars are promising but have yet to reach an acceptable level of transparency. Tesla's Safety Score Beta, for example, monitors the driving habits of Tesla owners and only activates the self-driving feature for drivers who meet its criteria on five factors: number of forward collision warnings, hard braking, aggressive turning, unsafe following, and forced autopilot disengagement. But much of the data lacks transparency, there is no ongoing training, and there is growing discontent among drivers who fail to make the safety cut after shelling out nearly $10,000 for the self-driving feature.

The widespread adoption of self-driving cars will require more than just technology. Extensive human support systems, including oversight and reporting, training, and attention to human limitations, must also be put in place. The ultimate success of self-driving cars will depend on improving the technology, but also on educating the drivers behind the wheel.




