East Bay Times

Tech writer Larry Magid on social media, extremism.

By Larry Magid, Digital Crossroads

I never thought I would have to say this, but political extremism is expected to have an impact on the upcoming election and, perhaps even scarier, on the potential aftermath of the election, should some people decide to take things into their own hands if it doesn't come out as they hope.

The president's response to moderator Chris Wallace's Tuesday night debate question about white supremacy didn't help: he advised the extremist group Proud Boys to "stand back and stand by."

Social media is likely partly to blame, but there are, of course, other reasons for any increase in white supremacy and extremism, which FBI Director Chris Wray called the biggest domestic terrorism threat in recent testimony before the House Homeland Security Committee.

The Netflix movie "The Social Dilemma" helps explain why. Algorithms designed to present us with content we find compelling, along with advertising that's likely to interest us, fill our newsfeeds with posts that appeal to our political leanings, world views and susceptibility to various theories and points of view, including conspiracy theories. And, as the movie points out, it's not that Mark Zuckerberg and other tech leaders set out to push us to political extremes. It's an unintended consequence of brilliantly written code designed to give us what the algorithms think we want to see.

The movie includes interviews with former employees of Facebook, Twitter and other tech companies, including some who are now very critical of their former employers. But, as others in the movie point out, these same services also do a lot of good, including helping disaster victims, raising funds for worthy causes and helping people organize social justice campaigns.
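To make that feedback loop concrete, here is a minimal sketch of engagement-based ranking, assuming a toy scoring rule; the class, function and sample posts are my own illustrative inventions, not any company's actual code.

```python
# A toy sketch of engagement-based feed ranking. All names, the
# scoring formula and the sample data are illustrative assumptions,
# not Facebook's or Twitter's actual systems.

from dataclasses import dataclass

@dataclass
class Post:
    text: str
    topics: set          # topics the post touches on
    engagement: float    # global clicks/likes per view

def rank_feed(posts, user_interests):
    """Order posts by a crude predicted-engagement score for one user."""
    def score(post):
        # Posts overlapping the user's recorded interests get boosted,
        # so whatever a user already engages with gets shown more.
        overlap = len(post.topics & user_interests)
        return post.engagement * (1 + overlap)
    return sorted(posts, key=score, reverse=True)

feed = rank_feed(
    [Post("Local news roundup", {"news"}, 0.4),
     Post("Outrage-bait political thread", {"politics", "conspiracy"}, 0.9),
     Post("Cat video", {"animals"}, 0.5)],
    user_interests={"politics", "conspiracy"},
)
print([p.text for p in feed])
# ['Outrage-bait political thread', 'Cat video', 'Local news roundup']
```

Because posts that match the recorded interests score higher, the feed keeps amplifying whatever a user already engages with, which is exactly the spiral the movie describes.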

The tendency of social media to nudge people into dark places is not entirely the fault of the technology. There is also the bubble you put yourself in based on the people you friend and interact with and the content you choose to view.

I've seen almost nothing on my social feeds that even comes close to white supremacy or other hate speech. I go out of my way to interact with people with differing points of view, but almost all are civil and polite, and I avoid people who are obvious bigots, so that, too, helps shape what I see. I also don't belong to very many groups, which can influence what you see even if it seemingly has nothing to do with the groups you're in.

You could, for example, join a group around a totally nonpolitical interest where there happens to be a connection because a significant number of people in that group are also in political groups, which may have nothing to do with the interests you share in common. You could also like something or click on links just because you're curious, and all of that is recorded and analyzed by the code that helps determine what you will see.
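As an illustration of how that kind of inference might work, here is a hypothetical sketch; the group names, counts and weighting are invented for this example, and real platforms draw on far more signals.

```python
# A toy sketch of interest inference from group co-membership and
# click history. All data and weights here are made up.

from collections import Counter

# For each group, how often its members also belong to other groups
# (hypothetical counts).
co_membership = {
    "vintage_cars": Counter({"political_group_x": 600, "cooking": 150}),
    "cooking": Counter({"gardening": 300, "political_group_x": 20}),
}

def infer_interests(user_groups, clicked_topics=(), top_n=3):
    """Blend group co-membership with click history into interest labels."""
    scores = Counter()
    for group in user_groups:
        scores.update(co_membership.get(group, Counter()))
    for topic in clicked_topics:
        scores[topic] += 100  # each recorded click adds weight
    return [label for label, _ in scores.most_common(top_n)]

# A user who only joined a car club and clicked one link out of
# curiosity still comes out labeled with a political interest.
print(infer_interests({"vintage_cars"}, clicked_topics=["conspiracy_video"]))
# ['political_group_x', 'cooking', 'conspiracy_video']
```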

Facebook has a tool that gives you some sense of what it thinks you're interested in. It's not complete, but it does show you some of the things it knows about you that it uses to display targeted ads. You can access it at tinyurl.com/fbinterests, or if you go to Settings and then Ads, you'll see a link for "Your Interests," broken down into categories including business and industry, news and entertainment, travel, and education. Click on the More link and one option is "Lifestyle and Culture." When I clicked on that, I found out that I was interested in both the Democratic and Republican parties, happiness, homelessness, Black Friday shopping, democracy and instant messaging. If you hover over each interest, it will vaguely tell you why it thinks it's relevant, typically because you may have liked one of their pages or posts or clicked on an ad related to their page.

Twitter has a page called Interests from Twitter (tinyurl.com/twitterinterests), which gives you some, but not much, insight into what the company thinks it knows about you. I'm apparently interested in both Alexandria Ocasio-Cortez and Donald Trump Jr., along with education, drinks, gaming, British football, animals and auto racing, even though some of those topics barely interest me. But the fact that Twitter thinks I'm interested in both AOC and Donald Trump Jr., along with his father and Barack Obama, helps explain why I see such diverse political content.

Facebook and Twitter have been working on this problem and have made progress in the past few months. I spoke with one expert on background who told me that it's now harder to stumble into extremist content, but that it's still there if you know where to look, and that it's not impossible to find extremist groups based on your connections, friends, likes and other activity.

I'm not exactly sure what Twitter and Facebook can do to rein in their algorithms, but I suspect there is a way they can help people avoid being pulled in directions they might not otherwise go. What they can definitely do is take down extremist content as soon as they're aware of it. They're doing more of that, which is a good thing, but a lot of content still shows up, and it helps feed political extremism. And when they do take down extremist content, they get pushback not only from the groups themselves but also from public officials who feel that the companies are censoring political speech.

In the meantime, there are things we can all do, including not sharing content that may not be truthful, not joining groups whose views we detest, and putting thought into what we like or otherwise react to.

Clearly, we should not believe everything we see on social media, and we should use independent means to verify anything that seems fishy. You'll find more advice in the Guide to Media Literacy and Fake News I co-wrote at ConnectSafely.org/Fakenews.

Disclosure: Larry Magid is CEO of ConnectSafely.org, which receives financial support from companies mentioned in this article.

