Don’t blame ideology: We’re all wired to misunderstand science

To seek a more scientifically informed view of the world, we must recognize that our current view is inadequate — and such epiphanies are all too rare.

The Washington Post Sunday, Outlook. Book review by Annie Murphy Paul

Science is taking it from all sides these days. On the right are those who question the reality of climate change and doubt the theory of evolution. On the left are those who inveigh against vaccines and fear genetically modified foods. Those who do accept the authority of science watch helplessly as funding for research is threatened, all the while bemoaning the warping influence of political ideology on the beliefs of their compatriots.

Into this sorry state of affairs arrive two new books, each of which draws on a different body of research to make the same surprising claim: that the misunderstanding and denial of science is not driven exclusively or even primarily by ideology. Rather, scientific ignorance stems from certain built-in features of the human mind — all of our minds.

Granted, this news is not exactly cheering. Research in cognitive science has revealed that “human capacity is not all that it seems, that most people are highly constrained in how they work and what they can achieve,” write Steven Sloman and Philip Fernbach, authors of “The Knowledge Illusion.” Sloman and Fernbach argue that “there are severe limits on how much information an individual can process” and that “people often lack skills that can seem basic.” Even more, the authors contend, it’s unclear that such skills “can ever be learned.”

And yet this book, along with another recent title, “Scienceblind,” offers readers a few crumbs of hope. If it’s not political ideology but rather cognitive errors that produce scientific illiteracy, then perhaps we can find ways to fix our mental glitches without wading into politics — that place where good intentions go to die.

The author of “Scienceblind” is Andrew Shtulman, an associate professor of psychology and cognitive science at Occidental College. His research focuses on intuitive theories, which he calls “our untutored explanations for how the world works.” Such explanations constitute “our best guess as to why we observe the events we do and how we can intervene on those events to change them,” Shtulman writes. Intuitive theories help us get by; they work well enough, according to the crude calculus of survival.

The problem is that our intuitive theories are, scientifically speaking, often wrong. We suspect that we came down with the sniffles because we were drenched by a cold rain. We surmise that the weather is hotter in the summer because the Earth is closer to the sun (when in fact the seasons arise from the tilt of Earth’s axis, not its distance from the sun). We embrace intuitive theories because, in Shtulman’s words, “we are built to perceive the environment in ways that are useful for daily living, but these ways do not map onto the true workings of nature.”

Many of our intuitive theories are formed early in life, before formal science instruction takes place. And because all children encounter the same physical world, interpreted through the use of the same limited biological equipment, they tend to formulate similar ideas about how that world works (creating a shared social reality that further entrenches intuitive theories).

Take, for instance, a typical child’s understanding of heat. Kids commonly conceive of warmth as a property of a particular object: a baked potato is hot, while an ice cube is not. Their visual and tactile senses give them no hint that heat is actually a general property of matter, produced by molecules in motion as they rub against one another. “We do, of course, teach children a molecular theory of matter, but not until they have reached middle school, and by that time, they have already constructed an intuitive theory of heat,” Shtulman observes. Starting science education earlier wouldn’t work, he continues, because “children lack the concepts needed to encode the scientific information we might teach them.” Moreover, studies conducted with very young infants “suggest that many of our expectations about motion and matter are innate.”

Even after years of education, our expectations about the physical world — those we’re born with and those we develop very early in our lives — continue to exert a tenacious hold. Indeed, researchers like Shtulman have come to believe that we never truly replace our intuitive theories so much as become progressively better at inhibiting them. This means that attempts to educate students and the public at large must “forge a path,” in the author’s phrase, between the novice’s intuitive theory and the expert’s scientific understanding.

This can be accomplished, he suggests, by employing “conceptually informed instruction” — that is, empirically supported teaching strategies, such as “bridging analogies.” Such analogies begin with scenarios that make intuitive sense (a spring exerts an upward force on a book resting on that spring) and gradually extend to include notions that, while scientifically unassailable, initially strike us as implausible (a table exerts an upward force on a book resting on that table).

Although such “conceptual change” is difficult to achieve and perhaps never complete, it’s worth attempting, Shtulman concludes, because scientific theories “furnish us with fundamentally more accurate conceptions of reality and thus fundamentally more powerful tools for predicting it and controlling it” than intuitive theories ever could. The better we understand thermal equilibrium, the more likely we are to optimize our home heating and cooling practices; the better we understand the biological mechanisms of cold and flu transmission, the more likely we are to take precautions against getting sick.

To seek out a more scientifically informed view of the world, however, we must recognize that our current view is inaccurate or inadequate — and such epiphanies, argue the authors of “The Knowledge Illusion,” are all too rare. Research in cognitive science demonstrates that “individual knowledge is remarkably shallow, only scratching the surface of the true complexity of the world, and yet we often don’t realize how little we understand,” write Sloman, a professor of cognitive, linguistic and psychological sciences at Brown University, and Fernbach, a cognitive scientist and professor of marketing at the University of Colorado’s Leeds School of Business.

Like intuitive theories, the “knowledge illusion” — the sense that we understand more than we do — grants humans a rough-and-ready way of dealing with an intricate universe. We should be glad that our ancestors did not pause in the face of a rock slide to contemplate how little they understood about gravity. In our modern world, however, false assurance can come with its own set of dangers.

The knowledge illusion — which academics refer to as “the illusion of explanatory depth” — was first revealed by studies of how well (or rather, how poorly) people grasped the workings of everyday objects such as staplers, speedometers, piano keys, door locks and flush toilets. Psychologists Frank Keil and Leon Rozenblit discovered that, although people confidently assumed they understood the operation of these items, in fact they had no clue. In attempting to explain how such objects work — to think through processes they had previously only skimmed over — Keil and Rozenblit’s subjects came to recognize that they really did not understand them after all.

Building on Keil and Rozenblit’s research, Sloman and Fernbach have found that this same misapprehension applies to more abstract phenomena as well, such as tax policy and foreign relations — and to politically charged topics such as climate change and genetically modified organisms. As with our intuitive theories, the development of deeper understanding is not a straightforward process of replacing or supplementing inaccurate or incomplete information. Unsophisticated efforts at education are often ineffective and can even backfire. Sloman and Fernbach describe a study in which parents who were shown images of children with measles, mumps or rubella, or who were given an emotional story about a child who contracted measles, actually became more convinced of the dangers of preventive vaccines.

Once again, research in cognitive science offers a savvier way around our mental blocks. Simply asking people to explain how vaccines work, or how a single-payer health-care system operates, or how a national flat tax would function, immediately renders them more cognizant of how little they understand these issues. It makes them more humble and more receptive to information that challenges the beliefs they previously expressed with such vehemence.

The authors of “The Knowledge Illusion” and “Scienceblind” both describe the research of Michael Ranney, a psychologist at the University of California at Berkeley. In Sloman and Fernbach’s account, Ranney “approached a couple of hundred people in parks in San Diego and asked a series of questions to gauge their understanding of the climate change mechanism. Only 12 percent of respondents were even partially correct, mentioning atmospheric gases trapping heat. Essentially no one could give a complete, accurate account of the mechanism.” Ranney then gave participants a short text (or, in later studies, showed them a video) explaining how climate change operates. This two-step process, Sloman and Fernbach report, “dramatically increased their understanding and their acceptance of human-caused climate change.”

In this experiment, Sloman and Fernbach see the knowledge illusion being exposed and corrected. Shtulman, meanwhile, sees our intuitive theories being addressed and reframed. Intuitively, “we don’t really think of the earth as something that needs protection,” he observes. “How can we harm an object that is seemingly eternal?” Ranney’s clear explanation of how we are, in fact, harming the Earth with the production of greenhouse gases prompts our intuitive impulses to shift in the direction of protecting an endangered planet.

Though they focus on different defects in the human operating system, the authors of these two books arrive at the same solution: To move away from ignorance and toward understanding, we need to address directly what Sloman and Fernbach call “the driving forces” behind our obtuseness. Surprisingly, for once, that obtuseness is produced not by our politics but by the evolutionary history of our brains. Who knew, as President Trump remarked recently of health care, that things “could be so complicated”?
