Toronto Star

Social media gets on the ban wagon

Sites are shutting down hate-filled platforms, but are their intentions pure?

- JOHN HERRMAN

In a typical year, a good way for a company to handle an announcement it hoped to bury is to schedule it before a nice long holiday weekend. This is one way to interpret the last week of June online, when several internet giants took decisive action against mostly far-right communities and users, many of whom had been sources of controversy for years.

Reddit banned The_Donald, the site’s main hub for Trump supporters, and a source of near-constant complaints from other users and communities about hate speech and harassment. The platform also banned more than 2,000 other groups as part of a revision of the site’s rules that prohibit “communities and people that incite violence or that promote hate based on identity or vulnerability.”

YouTube, which hardened its rules around hate speech in 2019, banned a group of popular accounts for, among other things, promoting “supremacist content.” Twitch, the livestreaming site, temporarily banned an account associated with President Donald Trump’s campaign for “hateful conduct.” And Facebook banned hundreds of accounts, groups and pages associated with the “boogaloo” movement for violating a prohibition against users and organizations “who proclaim a violent mission or are engaged in violence.”

Another way to understand these bans is as a calculated response to globe-spanning demonstrations against white supremacy, with which each of these platforms has been accused of being, at minimum, complicit. That the platforms acted in the same week suggests that some were waiting for others to act.

The move also suggests a desire to have it both ways: to get credit for the bans, most of which were late and represented small fractions of infringing users or groups, and to enact them when any backlash would have a hard time getting traction online.

The bans have been described as a “reckoning” for social media companies that have, as The Associated Press put it, “fuelled political polarization and hosted an explosion of hate speech” and that are now, public relations calculations aside, “upping their game against bigotry and threats of violence.”

“Reckoning,” however, is an odd word to describe a situation in which social media companies were finally asserting control after years of pretending not to have any. It looked, instead, like buying time. But for what?

The rules sometimes apply

What might real change look like for the social media giants? The week of bans suggests one specific vision. “When platforms tout the banning of these big names, it’s an important step in the right direction,” said Becca Lewis, a researcher at Stanford University who studies online extremism. “But it doesn’t address underlying issues and incentives that have led to the flourishing of white supremacist content in general.”

In some cases, social platforms have taken action against figures and groups with roots and power elsewhere, who have found audiences on YouTube or Reddit, in effect banishing them to where they came from. Often, though, these bans are more like corrections, shutting down accounts and groups that were conspicuously successful on the service’s own terms.

The content and behaviour of extremists may run afoul of particular YouTube rules, but those users are examples of success on the platform. They have cultivated large audiences, are easy to find in searches and seem to perform well in YouTube’s automated recommendation system.

They are practised in the formats, styles and subjects that YouTube seems to reward not just as a marketplace full of autonomous viewers, but as a complex and assertive system with its own explicit and implicit priorities.

Lewis, a doctoral candidate, said YouTube made early commitments to a relatively hands-off style of governance, and has gradually adjusted the “shape of its marketplace” over the years, often in response to controversy. Like many platforms of its era, it characterized its commitment to openness and “free speech” as a democratizing force, giving cover to the realities of living and coexisting within a strange and materially limited new space.

What is popular on YouTube is a reflection of what its users want to see, but also what YouTube wants them to see, what YouTube wants them to want to see and what advertisers want them to see.

YouTube is not so much the marketplace of ideas as a marketplace for some ideas, if those ideas work well in video format, in the context of a subscription-driven social environment consumed mostly on phones, in which compensation is determined by viewership, subject matter and potential for sponsorship.

The less abstract and idealized platforms are, the less complicated their decisions seem. If we understand early commitments to openness and loose moderation as stances rooted in a desire for growth and minimal labour expenditure, then the recent wave of bans is quite easy to grasp.

These areas of previously unfettered growth — in far-right content, in groups with a tendency to harass and intimidate other users and in certain political circles — are, finally, more trouble than they are worth. Maybe they alienate a platform’s own employees, making them uncomfortable or ashamed. Maybe they have attracted the wrong kind of attention from advertisers or even a critical mass of users.

Social platforms, in defence of moderation decisions, are afraid to state the most obvious truth, which is that they can host or not host whomever or whatever they want. Instead they hide behind rules and policies, or changes in enforcement. This is understandable. To say, simply, “we banned a bunch of white supremacists because we decided it was expedient in 2020” is to say, more or less, “we hosted and supported a bunch of these same white supremacists because we decided it was expedient in every year before 2020.”

Whose ‘community’ is this?

The gap between how social media companies talk about themselves and how they actually operate is contained within a single word they have leaned on for years, and have been using a lot lately: community.

Terms of service agreements refer to the “Facebook community” and the “YouTube community.” Reddit’s leadership speaks broadly about “community governance.”

Invocations of democratic language, or of legalistic concepts like “policies” or “governance” or “appeals,” distract from an uncomfortable truth about social media companies: Their rules are arbitrary and can be changed at any time. Anything that may feel like a right or an entitlement — the ability to share certain content, or to gather and act in certain ways, or to sell certain products, or to log on without being terrorized — is provided by and subject to the whims of a private company with no hard obligation to do so.

Governance-wise, social platforms are authoritarian spaces dressed up in borrowed democratic language. Their policies and rules are best understood as attempts at automation. Stylistically, they are laws. Practically, and legally, they are closer to software updates.

What polite fictions, then, do platforms that use the word “community” expect users to uphold?

During periods of intense activism and social change, social networks can provide space and amplification for pre-existing communities, as well as provide tools for the creation of new ones — in the case of Black Lives Matter in 2020, most visibly on Twitter and Instagram.

Hosting actual communities, and in particular providing spaces for activism, only sharpens the difference between how platforms use the word and what it actually means.

The platforms’ circular diversions about rules and policies smooth over the harsh but obvious reality of how commercial spaces deal with the people, content and groups they say they do not want around anymore, after years spent elevating and cultivating them. It is a way to avoid responsibility for the worst of what happens on their platforms. “Community” is an attempt to take credit for the best.

[Photo: Dreamstime] Reddit recently banned The_Donald, a hub for Trump supporters, as part of a revision of the site’s rules that prohibit “communities and people that incite violence or promote hate.”
