Mental ‘blind spot’ may have affected pilots
Confirmation bias could be to blame for jet nearly landing on busy taxiway
Could the same brain phenomenon identified as contributing to today’s polarizing political climate have played a role in an Air Canada flight crew coming within seconds of landing on a row of jets awaiting takeoff at SFO? Absolutely, experts say.
The condition, known as confirmation bias, occurs when people accept or seek out evidence that confirms their expectations and ignore or avoid facts that don’t align with those expectations — just like when a Donald Trump or Hillary Clinton supporter tunes out an opposing viewpoint or contrary facts.
The same mental blind spot likely affected the Air Canada flight crew on July 7, when it nearly triggered the worst aviation disaster in history by almost landing on four fully loaded planes on a San Francisco International Airport taxiway, says Dr. Andrew Gilbey, a senior lecturer in aviation at Massey University in New Zealand. Gilbey and colleagues have published a number of studies on confirmation bias in aviation in psychology journals.
Federal officials are continuing to investigate the close call by the Air Canada jet, which came within 50 feet of aircraft on the ground, according to flight data analyzed by this news organization and FlightAware.
The pilot, who was heading directly toward the taxiway rather than the runway, thought he was in the right place. When he asked about lights he saw on the ground, according to air traffic audio, the tower assured him the runway was clear for him to land, and the pilot continued on his misguided course. As we all know by now, the pilot was not where he thought he was — so why didn’t he make his own assessment and correction?
“Unfortunately, people often ignore … (contrary) evidence which could show their prior belief is wrong, and favor confirmatory evidence, which generally cannot show definitively that they are correct,” Gilbey said.
Errors of judgment
Here's how confirmation bias likely played a role during Air Canada flight 759's final approach to the San Francisco airport, according to Gilbey, who has reviewed the audio and flight data analysis for this news organization:
In general, during high-stress situations, like a night landing at a busy airport, people are more likely to commit errors of judgment. In this case, the pilot assumed he was lined up on the approved Runway 28-Right when he was really aimed at Taxiway C. And when he told the tower he saw lights on the “runway,” the air traffic controller told him the runway was clear, so the pilot assumed his current flight path had been approved.
“When they were told by a controller that Runway 28-Right was clear, this would have probably provided them with … confirmatory evidence that everything was going as planned,” Gilbey said. “However, there must have been at least three major pieces of disconfirmatory evidence confronting the aircrew, and had they utilized this evidence, they should have realized much earlier that they were lined up on a taxiway as it would have definitely indicated their prior belief … was dangerously wrong.”
As contrary evidence, the pilot should have noticed the different color lights that mark the taxiway. He should have noticed that the very distinct lighting that marks a runway was missing. And given how clear the night was, he should have seen the lights of the aircraft queued on the taxiway — lights that wouldn't have been there if he were headed for the runway, Gilbey said.
But it took a pilot on the ground to warn of the impending collision — and the air traffic controller to order a go-around — before the Air Canada plane aborted the landing.
In 1980, Edie Fischer was a graduate student in the psychology department at San Jose State University, where she led a study for NASA Ames Research Center and the American Pilot Association to determine whether a new cockpit display would work in commercial airplanes. As part of the experiment, conducted in a flight simulator, experienced commercial airline pilots were cleared to land even though a wide-body aircraft had been placed on the runway.
Two of the eight pilots in the experiment never saw the aircraft on the runway, apparently because they had already convinced themselves nothing was in their path.
Fischer said that study provided insight into the Air Canada event, confirmation bias and how little we know about the human mind.
“If you want to use my study results at all, you have to raise the possibility that after all the physical reasons, such as fatigue, vision problems, drugs, etc. are eliminated, the pilot's actions may have resulted from a cognitive malfunction beyond his conscious control,” said Fischer, a retired research psychologist. “The only remedy I can think of is training pilots intensively to expect the unexpected.”
One pilot interviewed in Fischer's study who did not see the airplane on the simulated runway told researchers at the time: “If I didn't see it (the tape), I wouldn't believe it. I honestly didn't see anything on that runway.”
Fischer said pilots need rest and training — “There need to be some serious studies on the cognitive processes of the human being.”
The FAA has been aware of this human factor in aviation errors for years, and the agency's safety team issued a reminder in 2014: “A mental bias can lead to unconscious behavior, and it is difficult to prevent what you don't intend to do. So, work as a team. Two heads are better than one, and many better than two. Try to disprove the decision.”
Capt. Shem Malmquist, a 777 pilot and air safety and accident investigator, has written about aviation confirmation bias. Based on what he knows about the Air Canada incident from media reports only, Malmquist said confirmation bias “might have come into play.”
The “cognitive shortcut” becomes worse at night, and fatigue can exacerbate the situation, he said. The Air Canada flight was a red-eye.
“Once the brain has decided on a solution, it takes an awful lot of evidence to shake it. Our perception is actually influenced greatly by what we believe, where some cognitive scientists take it so far as to state that reality is a shared hallucination,” Malmquist said. “Confirmation bias is one of those factors that is very challenging to see or predict prospectively, but extremely obvious in hindsight.”