
Danger lurks in sketchy online medical tips

Bad actors make it hard to decipher the truth

- Karen Weintraub

One day last summer Antonia Prescott was scrolling the internet when she saw an article with a headline that intrigued her.

“Harvard professor names best exercise to burn fat and keep it off: Dr. Daniel E. Lieberman has explained what type of exercise and for how long a week people should be doing it for best results,” it said.

Curious, Prescott turned to her husband, who was doing the dishes nearby, and asked him what type of exercise people should do to burn fat and keep it off.

“That’s really complicated. I can’t answer that,” responded her husband, who happens to be Daniel E. Lieberman, a professor at Harvard who never provided such guidance to anyone.

Like much of the “information” available online, what she was reading wasn’t accurate, or at least it was so oversimplified as to be meaningless.

The internet is filled with questionable guidance on weight loss and nearly every other topic – but when it comes to health, such sketchy bits of content can be downright dangerous.

Most Americans encounter false information related to health online, according to a recent poll by the Kaiser Family Foundation, and most aren’t sure whether that information is true.

Some may be harmless – such as the best exercise for burning fat, which Lieberman, a paleoanthropologist, can’t answer simplistically from his data on human evolution.

But some of it, including outright lies, is often provided by bad actors who are trying to make money or gain power by manipulating the innocent, experts say.

These bad actors also take advantage of a flawed medical system that can leave people without access to professionals they trust to give them accurate, useful information.

Systemic changes are needed to help rebuild public trust, experts say.

At the individual level, people should learn the difference between accidental misinformation and intentional disinformation, said Lee McIntyre, a philosopher and author who has written extensively on the subject.

Mistakes, like natural disasters, will always happen. There’s not much to be done about them.

But disinformation, he said, is a lie against which people can fight back.

“I want people to train themselves,” McIntyre said, to ask where the information in question is coming from, what’s at stake, who’s behind it and what benefit it serves to get that information out to the public.

Building health literacy

By promoting fear, misinformation causes mental and physical fatigue, said David Novillo Ortiz, European regional adviser on data and digital health for the World Health Organization.

It has a direct impact on trust in government, government response and public health messaging, which then disempowers people and risks their health, he said.

“We have a challenge ahead of us in how we can rebuild this trust in government that has been damaged by misinformation,” said Novillo Ortiz, who is working to do that within Europe.

The world has changed enormously within the lifetime of most people alive today. Anyone over a certain age grew up in a world where they didn’t have to defend themselves against misinformation on social media.

There are more mobile devices than people in most countries, and only half the nations in Europe and Central Asia have policies to improve digital health, Novillo Ortiz said, so it’s become easy to spread false medical information.

“We are leaving people behind because we are not investing enough in digital health literacy.”

Everyone, from politicians to public sector employees to journalists to individuals, needs to play a role in fighting misinformation, Novillo Ortiz said.

“This is a problem for all of us,” he said.

Know who to trust

Even otherwise trustworthy sources sometimes screw up, said Dr. Richard Baron, president and CEO of the American Board of Internal Medicine, which certifies doctors.

There’s no question, for instance, that the Centers for Disease Control and Prevention made mistakes early in the pandemic. But that doesn’t mean everything the CDC says should be dismissed. “They got a couple of things wrong, but I wouldn’t throw the baby out with the bathwater,” he said.

If several typically reliable sources agree, such as the CDC, along with experts or websites from well-known hospitals and universities, they’re probably right, he said.

“When you start to see information converging from reliable sources, that is trustworthy information,” he said.

Baron noted we live in an increasingly specialized society where we can’t possibly know or understand everything, so we have to rely on experts.

His office, for instance, is on the 17th floor, so he has to take an elevator to work. He doesn’t really understand how the elevator works and he has no interest in fixing it when it doesn’t. He just wants to get to the 17th floor, so he trusts other people to get him there.

Similarly, he said, the public needs to be able – and willing – to trust people with medical expertise.

But that doesn’t mean everyone with an MD after their name is equally trustworthy, said John Robert Bautista, now a health misinformation researcher at the University of Missouri, Columbia.

Based on his research at the University of Texas at Austin, Bautista said doctors who post misinformation – including the Disinformation Dozen, who promoted false information about vaccines before the pandemic – are typically selling a product or themselves.

They play on people’s emotions to get followers, he said. “Once they get a certain number of followers, they can use that platform to sell stuff, or if they have plans to run for office they can use that social capital they have.”

Freedom of speech is a legitimate right for doctors, as for everyone else, Baron said. But accuracy and avoidance of harm are important too. Doctors don’t get to claim freedom of speech in malpractice cases, he noted.

Also, Baron said, it’s standard fare for people pitching disinformation to attribute bad motives to others. “It’s not that everybody always has pure motives,” he said. But ask yourself: Why would they have those motives? Would drug companies really sell more drugs if those drugs killed people?

Everyone likes to criticize Big Pharma for being greedy, for instance, but there’s plenty of money in the $50 billion U.S. market for dietary supplements, which are subject to far fewer government regulations than pharmaceuticals.

So, if you’re paying attention to a doctor or other self-proclaimed expert who is outside of the mainstream and you think that person – and by extension, you – are smarter than everyone else for doing so, you might want to reconsider, Baron and others said.

“(You have to be) skeptical about one’s skepticism,” he said. “You really can outsmart yourself.”

Institutional trust

Baron said institutions like his have taken the public’s trust for granted rather than trying to deliberately build it.

Doctors and academic scientists have long thought about “marketing” and communicating to patients as someone else’s job, said Dr. Geeta Nayyar, author of the new book “Dead Wrong: Diagnosing and Treating Healthcare’s Misinformation Illness.”

Every candy store has an Instagram account telling customers about its offerings and hours and giving them opportunities to interact, she said. But “health care is arguably the complete opposite. Once you leave, you have no idea how to interact with us.”

Many people today don’t even have a regular doctor, so when they show up truly in need of medical advice, they haven’t built up the kind of trust that used to define the doctor-patient relationship.

This also puts an added strain on doctors and nurses and may explain at least some of the caregiver burnout.

Nayyar said she’s had patients come in and ask her how much money she makes on COVID-19 vaccinations. (Answer: Nothing.) “To walk in so mistrusting is difficult for anyone to swallow.”

That lack of easy communication between provider and patient has left a gaping hole that people with other agendas have been only too happy to fill.

“Misinformation grows in the dark,” Nayyar said. “We left this space dark and people are seeing the profits they can make (by taking advantage of that information vacuum).”

How to inoculate yourself

To make sure you and your family are getting the best medical information online, look for content that’s posted to platforms that are broadly available and have editors, suggests Marzyeh Ghassemi, an assistant professor at the Massachusetts Institute of Technology who develops machine-learning algorithms to inform health care decisions.

Bots and social media accounts can post anything, but something that’s been vetted by many people and posted to an institutional website is likely to be more reliable, she said.

“You’re going to go for high efficiency if your goal is to spread misinformation,” she said, so if it’s very simple to get information onto a platform, there’s a higher risk it won’t be accurate.

People behave differently toward information when they are primed to evaluate it for accuracy, Ghassemi said.

Content warnings, like those the social media site X (formerly Twitter) used to include, were effective in making people question misinformation, she said.

“That is a very powerful intervention,” she said. “If we can’t control how (information) gets generated, we can at least control how it gets delivered.”

Another way to destroy the power of lies is through “prebunking,” or exposing a lie as fraudulent before it can become part of the popular imagination, said McIntyre, whose most recent books include “How to Talk to a Science Denier” and “On Disinformation: How to Fight for Truth and Protect Democracy.”

Too often, people choose the “do nothing” option when doing something is actually safer or makes more sense. That’s why people frequently skip routine medical checks that might help prevent serious medical problems.

“Taking too long to make a decision is in effect making a decision,” he said.

The people who want to take advantage of others know how to exploit people’s natural prejudices, McIntyre said. “The disinformers know what the cognitive biases are and what the existing divisions are and so where to plant it,” he said.

McIntyre said he doesn’t blame conspiracy theorists for being sensitive about being deceived. “It’s a very powerful human motivation not to be fooled,” he said.

But they are being led astray by someone different than they think. “You think you’re being duped by the CDC and the FDA, but you’re actually being duped by Alex Jones and Naomi Wolf and these other people on Twitter.”

In a way, falling for misinformation and not trusting “official” sources is a reflection of people not feeling heard, Ghassemi said.

Your doctor used to be someone within your community whom you knew and trusted.

“You were disproportionately likely to listen to advice they had. I don’t think that is true as much today,” she said. Electronic health records were supposed to improve things, but in some ways just baked in racial and other prejudices that were there before, she said.

“Many communities do not feel that their pain is being heard and acknowledged by power structures,” she said. “Some movements are weaponizing this collective feeling in a way that is very dangerous, and spreading misinformation can be part of normalizing behavior that comes from fear and anger.”
