USA TODAY US Edition

When talking trust, go with your gut

It plays a bigger role in first impressions

- Jeff Stibel

The last time I connected with a friend online, a funny thought occurred to me: It wasn’t really a friend at all. Instead, I was talking through and to a computer. This wasn’t a particularly profound insight. But when it comes to trust, how you communicate has become more important than who you are communicating with.

You may not realize it, but our guts play a bigger role in deciding whom to trust than our brains. Your brain decides on trust much more quickly than you think. When we encounter a stranger, our brains automatically decide whether we can trust him or her. And it’s done subconsciously and extremely quickly — in milliseconds, in fact — in the part of the brain called the amygdala.

From a survival perspective, it’s important to be able to identify who is trustworthy and who is not, and numerous experiments have shown that our first impressions about someone are usually right. Within milliseconds, we are good at determining whether people are friendly, intelligent, wealthy and numerous other traits. If your genes have made it this far — and all of ours have — it’s because you’re very good at picking out whom to trust.

Our brains are less well-equipped to decide whether a company is trustworthy, but there’s evidence to suggest that generally, we give companies the benefit of the doubt, which holds right up until they do something we don’t like. It is even worse with technology companies, as we tend to look past the company and base our decisions on the usefulness of the product.

Enter Facebook, which is in some hot water these days. We have been willingly trusting Facebook for years, sharing gobs of personal data, knowing Facebook uses that data to target us with ads. We have become pretty comfortable with the arrangement: keep up with friends and enjoy funny cat videos with no cost except having to view the occasional sponsored post.

But recently, we’ve discovered Facebook data has been used in less benign ways. Facebook’s partners are after your mind, and they’ve learned that if you’re angry and fearful, you’re more engaged, since negative messages have a bigger impact on the brain than positive ones. Remember the amygdala — it is also the response center for fear and anger.

Facebook has the same knowledge about you as your friends, so it knows exactly what makes you angry or anxious. That gives companies that buy data from Facebook the ability to target certain segments of the population with negative messages. Add bots and trolls to the mix, plus intentionally fake content created by hostile foreign powers, and it’s clear how dangerous this can be.

We may not mind our data being used to sell us stuff, but no one is OK with data being used in nefarious ways. Hence a pending congressional investigation and many loud calls for a mass Facebook boycott. Facebook has lost our collective trust, and what it does next will be critical in determining whether it can gain it back. But the problem is bigger than Facebook: You can swap Facebook out for any technology that collects user data, and the problem will persist.

❚ What’s the solution? There is a simple way forward: Create absolute transparency. Tech providers should label all posts, users or feeds that have received data about their users. It’s as easy as one sentence presented alongside any post that leverages user data, such as “this product is recommended based on your clicks on our site” or “this news post was sent to you because you are friends with Vladimir Putin.”

A recent study out of Harvard Business School suggests this approach may even be beneficial to advertisers. The experiments demonstrated that when companies clearly communicate how they are using customer data, customers show increased engagement and actually make more purchases. So, the user gets transparency and the company sees increased engagement — a win-win for all but those errant data violators.

It is unlikely we will be able to curb the sale of personal data, but we can control how it is used. By mandating disclosures, all posts from sources that received data would forever be tagged with a disclaimer. Facebook and other data providers would lose money from bad actors, but they would gain the trust of the good ones.

But until the majority of companies join the transparency bandwagon, it pays to be vigilant. Recognize that while your brain can instantly spot a trustworthy face, it takes a lot more work to determine whether to trust an ad, an article or a website. If something appears sensational, misplaced or like it fits a bit too perfectly into your worldview, it may be that the content was written solely to manipulate you. In this era of fake news and data misuse, your best bet is to approach all online content with a healthy dose of skepticism.

Jeff Stibel is vice chairman of Dun & Bradstreet, a partner of Bryant Stibel and an entrepreneur who also happens to be a brain scientist. He is the bestselling author of “Breakpoint” and “Wired for Thought.” Follow him on Twitter at @stibel.

The views and opinions expressed in this column are the author’s and do not necessarily reflect those of USA TODAY.


Photo: RICHARD DREW/AP
