The Borneo Post

Predicting when online conversations turn toxic


ITHACA, New York: The internet offers the potential for constructive dialogue and cooperation, but online conversations too often degenerate into personal attacks.

In hopes that those attacks can be averted, Cornell researchers have created a model to predict which civil conversations might take a turn and derail.

After analysing hundreds of exchanges between Wikipedia editors, the researchers developed a computer program that scans for warning signs in the language used by participants at the start of a conversation – such as repeated, direct questioning or use of the word “you” – to predict which initially civil conversations would go awry.

Early exchanges that included greetings, expressions of gratitude, hedges such as “it seems,” and the words “I” and “we” were more likely to remain civil, the study found.
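The marker-based idea described above can be illustrated with a toy heuristic. This is only a sketch: the Cornell model is a trained classifier, not a fixed word list, and the marker lists and scoring below are illustrative assumptions.

```python
import re

# Assumed, illustrative marker lists based on the cues named in the article:
# greetings, gratitude, hedges, and "I"/"we" signal civility; direct "you"
# and repeated questioning signal risk. The real model learns such features.
CIVIL_MARKERS = {"hello", "hi", "thanks", "thank", "seems", "perhaps", "i", "we"}
RISK_MARKERS = {"you", "your"}

def derailment_score(opening_comment: str) -> float:
    """Return a crude risk score in [0, 1]; higher means more at-risk."""
    words = re.findall(r"[a-z']+", opening_comment.lower())
    if not words:
        return 0.0
    civil = sum(w in CIVIL_MARKERS for w in words)
    risky = sum(w in RISK_MARKERS for w in words)
    # Repeated direct questioning is another warning sign from the article.
    risky += opening_comment.count("?")
    total = civil + risky
    return risky / total if total else 0.0
```

For example, `derailment_score("Thanks, it seems we agree.")` yields 0.0, while `derailment_score("Why did you revert this? Why?")` yields 1.0, mirroring the contrast between hedged, grateful openings and repeated direct questioning.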

“There are millions of such discussions taking place every day, and you can’t possibly monitor all of them live. A system based on this finding might help human moderators better direct their attention,” said Cristian Danescu-Niculescu-Mizil, assistant professor of information science.

“We, as humans, have an intuition of whether a conversation is about to go awry, but it’s often just a suspicion. We can’t do it 100 per cent of the time. We wonder if we can build systems to replicate or even go beyond this intuition,” Danescu-Niculescu-Mizil said.

The computer model, which also considered Google’s Perspective, a machine-learning tool for evaluating “toxicity,” was correct around 65 per cent of the time. Humans guessed correctly 72 per cent of the time.

The researchers hope this model can be used to rescue at-risk conversations and improve online dialogue, rather than for banning specific users or censoring certain topics. Some online posters, such as non-native English speakers, may not realise they could be perceived as aggressive, and nudges from such a system could help them self-adjust.

“If I have tools that find personal attacks, it’s already too late, because the attack has already happened and people have already seen it,” said Jonathan Chang, a Cornell doctoral student who worked on the research.

“But if you understand this conversati­on is going in a bad direction and take action then, that might make the place a little more welcoming.” — Cornell News

[Photo caption] The program scans for warning signs that toxic language is on the way.
