Malta Independent

Can online gaming ditch its sexist ways?

- Giovanni Luca Ciampaglia, Indiana University. This article was originally published on The Conversation. Read the original article here: http://theconversation.com/can-online-gaming-ditch-its-sexist-ways-74493.

A huge online community has developed around the increasingly diverse world of video games. Online streaming systems like Twitch let people watch others play video games in real time, attracting crowds comparable in size to those at traditional sporting events. And women are increasingly finding remarkable success as gamers and as Twitch streamers.

Computational social scientists like me – the band of scholars working at the intersection of society and computing – are attracted to online communities like Twitch because they help us study those social groups and society at large. Of particular interest are the community norms that develop. An internet-wide example is the understanding that typing in all-caps IS THE SAME AS SCREAMING. Individual communities develop their own peculiar jargon and unwritten standards of behavior.

Like other online communities, Twitch has its own culture and norms, of which sexism is sadly a feature. The site’s managers recently suspended the accounts of two well-known streamers after both streamed gender-biased material that violated Twitch’s rules. My own research, together with Supun Nakandala, Norman Su and Yong-Yeol Ahn, has examined the experience female gamers have on Twitch, including whether they are treated similarly to men or commonly identified as different – and even subjected to sexual objectification.

The paradox of online communities

Online, cultural norms present a paradox: The communities they apply to are open groups that anyone with an internet connection can join just by creating an account, which is usually free. But it takes time, effort and, especially, acceptance to become a true member. We sought to find out whether these norms involve gender stereotypes and sexism, excluding and mistreating women and girls.

In our research, we focused on Twitch chats, where viewers can comment on a broadcast while watching a video stream. Viewers can chat among themselves, and interact with the streamer. We wanted to see whether chat language involved more objectification when the streamer was a woman.

We analyzed several months’ worth of chat transcripts using a range of data science tools, including detecting frequent words and expressions, noting how often words were used together or in combination with other markers, and mapping the relations between words. We noticed major distinctions between the language commenters used on the top 100 most popular women-operated streams, and on streams of similar popularity operated by men.
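To make the comparison concrete, here is a minimal sketch of the kind of frequency analysis described above. It is not the study’s actual pipeline: the file names, the tokenizer and the simple smoothed relative-frequency score are all assumptions made for illustration.

```python
# Minimal sketch of a chat-language comparison (illustrative only, not the
# study's actual pipeline). Assumes two plain-text files with one chat
# message per line; the file names and scoring are hypothetical.
import re
from collections import Counter

def tokenize(message):
    """Lowercase a chat message and split it into word tokens."""
    return re.findall(r"[a-z']+", message.lower())

def term_counts(path):
    """Count how often each token appears across all messages in a file."""
    counts = Counter()
    with open(path, encoding="utf-8") as f:
        for line in f:
            counts.update(tokenize(line))
    return counts

def distinctive_terms(counts_a, counts_b, top_n=20, smoothing=1.0):
    """Rank tokens by how much more frequent they are in corpus A than in
    corpus B, using smoothed relative frequencies to avoid dividing by zero."""
    total_a = sum(counts_a.values())
    total_b = sum(counts_b.values())
    vocab = set(counts_a) | set(counts_b)
    def ratio(token):
        freq_a = (counts_a[token] + smoothing) / total_a
        freq_b = (counts_b[token] + smoothing) / total_b
        return freq_a / freq_b
    return sorted(vocab, key=ratio, reverse=True)[:top_n]

if __name__ == "__main__":
    women = term_counts("chats_women_streamers.txt")  # hypothetical file
    men = term_counts("chats_men_streamers.txt")      # hypothetical file
    print("Distinctive in women's streams:", distinctive_terms(women, men))
    print("Distinctive in men's streams:", distinctive_terms(men, women))
```

A ranking like this is only the first of the techniques the paragraph mentions; the study also examined co-occurrence and the relations between words, which this sketch omits.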

When watching a man stream, viewers typically talk about the game and try to engage with the streamer; game jargon (words like “points,” “winner” and “star”) and user nicknames are among the most important terms. But when watching a woman stream, the tone changes: Game jargon drops, and objectification language increases (words like “cute,” “fat” and “boobs”). The difference is particularly striking when the streamer is popular, and less so when looking at comments on less-popular streamers’ activity.

Toward more open online communities

The trends our research identified suggest that objectification and harassment may be a problem for female streamers who want to become part of the online gaming community. Site owners and managers must face the fact that anyone can join the community: As online gaming becomes a more mainstream activity, some of its social norms – especially those related to gender stereotyping – are being called into question.

Other online communities are also trying to open up to a broader set of participants, but for different reasons. For example, Wikipedia has struggled for several years to attract newcomers and transform them into active contributors, in part because of a male-dominated culture that includes harassment of women.

The big communities like Wikipedia and Twitch – with hundreds of thousands of active users (or more) – are looking for ways technology can help inform these social changes. Twitch recently released a tool called “AutoMod” that watches for specific keywords and lets streamers identify and filter out their viewers’ trolling, objectifying language and other forms of abuse. Similarly, Wikipedia has been developing machine learning models to detect instances of harassment and incivility in contributors’ discussions, marking them for humans to review for potential disciplinary action.
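As a rough illustration of how a keyword-based moderation tool works in principle, consider the sketch below. This is not Twitch’s AutoMod implementation, which is not public; the blocklist, the normalization step and the hold-for-review behavior are all assumptions.

```python
# Toy keyword-based chat moderator, in the spirit of tools like AutoMod.
# Illustrative only: the blocklist, normalization and review queue are
# assumptions, not Twitch's actual implementation.
import re

BLOCKED_TERMS = {"boobs", "fat"}  # hypothetical streamer-chosen blocklist

def normalize(message):
    """Lowercase and strip punctuation so simple disguises like
    'B.o.o.b.s' still match the blocklist."""
    return re.sub(r"[^a-z\s]", "", message.lower())

def moderate(message, review_queue):
    """Return True if the message may be shown immediately; otherwise hold
    it in the review queue for the streamer or a moderator to approve."""
    tokens = set(normalize(message).split())
    if tokens & BLOCKED_TERMS:
        review_queue.append(message)
        return False
    return True

queue = []
for msg in ["gg, nice points!", "lol she's fat"]:
    shown = moderate(msg, queue)
    print(f"{'shown' if shown else 'held '}: {msg}")
print("awaiting review:", queue)
```

A fixed blocklist like this is easy to evade with new spellings and phrasings, which is part of why, as noted above, both Twitch and Wikipedia are also experimenting with learned models rather than static keyword lists.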

While these tools look promising, it’s not yet clear how they will affect their communities’ overall well-being. Algorithms may introduce biases that could unfairly target certain groups or classes of users. And abuse detection systems often struggle to keep up with more sophisticated manipulation and abuse techniques.

Moreover, detecting abuse is only a part of the issue. To improve openness and engage new users, these sites also need to create spaces for newcomers to mix with veteran community members. One example is the Wikipedia Teahouse, which emphasizes the word “friendly” in its description of itself as a place for people to learn about “Wikipedia culture.”

The organizations that operate Twitch and Wikipedia are just beginning to wrestle with the results of the online-community paradox. As new users join other sites, those communities’ owners will also need to examine their cultural norms to drive out toxic standards that effectively silence entire groups.

