Unfreeze Your Brain

Can you use these items to attach the candle to the wall and light it? Step one is to jettison tired thinking patterns, which derail even the most intelligent among us.

Reader's Digest | By Leonard Mlodinow, from the book Elastic

In life, once on a path, we tend to follow it, for better or worse. What's sad is that even if it's the latter, we often accept it anyway because we are so accustomed to the way things are that we don't even recognize that they could be different.

This is a phenomenon psychologists call functional fixedness. This classic experiment will give you an idea of how it works, and a sense of whether you may have fallen into the same trap: People are given a box of tacks and some matches and asked to find a way to attach a candle to a wall so that it burns properly.

Typically, the subjects try tacking the candle to the wall or lighting it to affix it with melted wax. The psychologists had, of course, arranged it so that neither of these obvious approaches would work. The tacks are too short, and the paraffin doesn't bind to the wall. So how can you accomplish the task?

The successful technique is to use the tack box as a candleholder. You empty it, tack it to the wall, and stand the candle inside it.

To think of that, you have to look beyond the box's usual role as a receptacle just for tacks and reimagine it serving an entirely new purpose. That is difficult because we all suffer, to one degree or another, from functional fixedness.

The inability to think in new ways affects people in every corner of society. The political theorist Hannah Arendt coined the phrase "frozen thoughts" to describe deeply held ideas that we no longer question but should. In Arendt's eyes, the complacent reliance on such accepted "truths" also made people blind to ideas that didn't fit their worldview, even when there was ample evidence for them. Frozen thinking has nothing to do with intelligence, she said. "It can be found in highly intelligent people."

Arendt was particularly interested in the origins of evil, and she considered critical thinking to be a moral imperative: in its absence, a society could go the way of Nazi Germany.

Another context in which frozen thinking can turn truly dangerous is medicine. If you land in the hospital, it's natural to want to be treated by the most experienced physicians on staff. But according to a 2014 study in the Journal of the American Medical Association (JAMA), you'd be better off being treated by the relative novices.

The study examined nearly ten years of data involving tens of thousands of hospital admissions and found that the 30-day mortality rate among high-risk patients with acute heart conditions was a third lower when the top doctors were away at conferences.

The JAMA study didn't pinpoint the reasons for the decreased death rate, but the authors explained that most errors made by doctors are connected to a tendency to form opinions quickly, based on experience. In cases that are not routine, the expert doctors may miss important aspects of the problem that are not consistent with their initial analysis. As a result, although junior doctors may be slower and less confident in treating run-of-the-mill cases, they can be more open-minded with unusual cases.

Fortunately, psychologists have found that anyone can unfreeze his or her thinking. One of the most effective ways is to introduce a little discord to one's intellectual interactions.

Consider a study performed about half a century ago. The researcher showed two groups of female volunteers a sequence of blue slides. In both groups, he asked each individual to state the color of each slide. In the experimental group, he had planted some actors who called the color green rather than blue. Whom were they fooling? Nobody. The experimental subjects ignored the deviant responses. When their turns came, most of them answered blue, just as the control group had.

Then the subjects were asked to classify a series of paint chips as either green or blue, even though each chip's color lay between those two pure colors. Amazingly, the people who'd been in the experimental group identified many chips as green, while those from the control group called the same ones blue. Even though no one in the experimental group had been convinced by the actors before, their exposure to the earlier misidentification had shifted their judgment and made them more open to seeing a color as green.

Other experiments have shown that dissent can not only sway us with regard to the issue at hand; it can also thaw frozen thinking in general, even in contexts unrelated to the original discussion. What this all means is that, as difficult as it can sometimes be, talking to people who disagree with you is good for your brain. So if you hate conspiracy theories and run into someone who believes that we faked the moon landing, don't walk away. Have tea with him or her. It can broaden your thinking in countless ways.


Excerpted from Elastic: Flexible Thinking in a Time of Change by Leonard Mlodinow, copyright © 2018 by Leonard Mlodinow. Reprinted with permission from Pantheon Books, an imprint of Penguin Random House LLC.

Photographs by Matthew Cohen
