Mental ‘blind spot’ may have affected pilots

Confirmation bias could be to blame for jet nearly landing on busy taxiway

The Mercury News - Front Page - By Matthias Gafni, mgafni@bayareanewsgroup.com

Could the same brain phenomenon identified as contributing to today’s polarizing political climate have played a role in an Air Canada flight crew coming within seconds of landing on a row of jets awaiting takeoff at SFO? Absolutely, experts say.

The condition, known as confirmation bias, occurs when people accept or seek out evidence that confirms their expectations and ignore or avoid facts that don’t align with those expectations — just as when a Donald Trump or Hillary Clinton supporter tunes out an opposing viewpoint, or contrary facts.

The same mental blind spot likely affected the Air Canada flight crew on July 7, when it nearly triggered the worst aviation disaster in history by lining up to land on four fully loaded planes on the San Francisco International Airport taxiway, says Dr. Andrew Gilbey, a senior lecturer in aviation at Massey University in New Zealand. Gilbey and colleagues have published a number of studies in psychology journals on confirmation bias in aviation.

Federal officials are continuing to investigate the close call by the Air Canada jet, which came within 50 feet of aircraft on the ground, according to flight data analyzed by this news organization and FlightAware.

The pilot, who was heading directly toward the taxiway rather than the runway, thought he was in the right place. When he asked about lights he saw on the ground, according to air traffic audio, the tower assured him the runway was clear for him to land, and the pilot continued on his misguided course. As we all know by now, the pilot was not where he thought he was. But why didn’t he make his own assessment and correction?

“Unfortunately, people often ignore … (contrary) evidence which could show their prior belief is wrong, and favor confirmatory evidence, which generally cannot show definitively that they are correct,” Gilbey said.

Errors of judgment

Here’s how confirmation bias likely played a role during Air Canada Flight 759’s final approach to the San Francisco airport, according to Gilbey, who reviewed the audio and flight data analysis for this news organization:

In general, during high-stress situations, like a night landing at a busy airport, people are more likely to commit errors of judgment. In this case, the pilot assumed he was lined up on approved Runway 28-Right when he was really aimed at Taxiway C. And when he told the tower he saw lights on the “runway,” the air traffic controller told him the runway was clear, so the pilot assumed his current flight path had been approved.

“When they were told by a controller that Runway 28-Right was clear, this would have probably provided them with … confirmatory evidence that everything was going as planned,” Gilbey said. “However, there must have been at least three major pieces of disconfirmatory evidence confronting the aircrew, and had they utilized this evidence, they should have realized much earlier that they were lined up on a taxiway, as it would have definitely indicated their prior belief … was dangerously wrong.”

As contrary evidence, the pilot should have noticed the different color lights that mark the taxiway. He should have noticed that the very distinct lighting that marks a runway was missing. And given how clear the night was, he should have seen the lights of the aircraft queued on the taxiway — lights that wouldn’t have been there if he were headed for the runway, Gilbey said.

But it took a pilot on the ground to warn of the impending collision, and the air traffic controller to order a go-around, before the Air Canada plane aborted the landing.

‘Cognitive malfunction’

In 1980, Edie Fischer was a graduate student in the psychology department at San Jose State University, where she led a study for NASA Ames Research Center and the American Pilot Association to determine whether a new cockpit display would work in commercial airplanes. As part of the experiment, conducted in a flight simulator, experienced commercial airline pilots were cleared to land even though a wide-body aircraft had been placed on the runway.

Two of the eight pilots in the experiment never saw the aircraft on the runway, apparently because they had already convinced themselves nothing was in their path.

Fischer said that study provided insight into the Air Canada event, confirmation bias and how little we know about the human mind.

“If you want to use my study results at all, you have to raise the possibility that after all the physical reasons, such as fatigue, vision problems, drugs, etc., are eliminated, the pilot’s actions may have resulted from a cognitive malfunction beyond his conscious control,” said Fischer, a retired research psychologist. “The only remedy I can think of is training pilots intensively to expect the unexpected.”

One pilot interviewed in Fischer’s study who did not see the airplane on the simulated runway told researchers at the time: “If I didn’t see it (the tape), I wouldn’t believe it. I honestly didn’t see anything on that runway.”

Fischer said pilots need rest and training: “There need to be some serious studies on the cognitive processes of the human being.”

The FAA has been aware of this human factor in aviation errors for years, and the agency’s safety team issued a reminder in 2014: “A mental bias can lead to unconscious behavior, and it is difficult to prevent what you don’t intend to do. So, work as a team. Two heads are better than one, and many better than two. Try to disprove the decision.”

Capt. Shem Malmquist, a 777 pilot and air safety and accident investigator, has written about confirmation bias in aviation. Based on what he knows about the Air Canada incident from media reports only, Malmquist said confirmation bias “might have come into play.”

The “cognitive short-cut” becomes worse at night, and fatigue can exacerbate the situation, he said. The Air Canada flight was a red-eye.

“Once the brain has decided on a solution, it takes an awful lot of evidence to shake it. Our perception is actually influenced greatly by what we believe, where some cognitive scientists take it so far as to state that reality is a shared hallucination,” Malmquist said. “Confirmation bias is one of those factors that is very challenging to see or predict prospectively, but extremely obvious in hindsight.”

JOSIE LEPE — STAFF PHOTOGRAPHER

In 1980, Edie Fischer led a study for NASA that examined confirmation bias.
