Business World

Start-ups see opportunity in tackling fake news

By Hannah Kuchler in San Francisco

When Adam Hildreth started Dubit, an early social network for children, during the last tech boom, he never imagined that he would end up at the forefront of fighting fake news influencing elections.

But the teen entrepreneur of 1999 has grown up to become the chief executive of Crisp, trying to apply what he learnt moderating cyberbullying to stop the spread of terrorist content online, and is now experimenting with how to combat misinformation campaigns.

Crisp helps brands protect their reputation on social media, using an algorithm to crawl the internet, including the dark web, to understand who is distributing content online.

“It is the opposite of a search engine: it graphs all the places you don't want to visit, all the places advertisers don't want to be seen,” says Mr. Hildreth.

Crisp is one of a group of companies starting to see an opportunity in either helping the social platforms tackle the problem or working for the victims of fake news, be they companies or governments.

While large internet companies such as Facebook, Google and Twitter struggle to find ways to stop the spread of misinformation online without abandoning their algorithms or business models, smaller start-ups are looking to serve clients willing to pay for extra help fighting fake news.

Some, such as Crisp and New Knowledge, started out fighting terrorism. Others, such as Cisco and Digital Shadows, see parallels with cyber security and are using tactics developed to defend against hackers to battle fake news.

Crisp, based in Leeds, has 120 employees and 300 contractors who help train its technology. It has been working with social platforms to try to make moderation more efficient. Under political pressure to show they are taking misinformation campaigns seriously, both Facebook and Google have announced significant expansions of their moderation teams in recent months.

“The big challenge is that so much is uploaded every minute,” Mr. Hildreth says.

Crisp also helps brands look for anything that could damage their reputation, including real or fake news.

Jonathon Morgan, chief executive of New Knowledge, a start-up based in Austin, Texas, was a professional blogger who became an expert on Isis' use of social media. Now he is trying to help companies, political campaigns and social justice organizations understand how online communities can be manipulated.

New Knowledge has seen revenues double in the six months since it started focusing on misinformation. It uses machine learning technology to identify bots and to break down different topics of conversation, spotting where people are shifting the language used to discuss a topic, a sign that a community may be changing its beliefs.
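
As a rough illustration of the kind of signal being described (this is not New Knowledge's system; the posts, the threshold and the term_shift helper below are hypothetical), one could compare how often words appear in a discussion across two time windows and flag terms whose share of the conversation jumps sharply:

```python
# Rough, hypothetical sketch: compare word usage in two time windows of a
# discussion and flag terms whose share of the conversation jumps sharply,
# a crude proxy for a shift in the language used around a topic.
from collections import Counter

def term_shift(earlier_posts, later_posts, min_jump=0.1):
    """Return terms whose share of all words grew by more than min_jump."""
    def shares(posts):
        words = [w.lower() for post in posts for w in post.split()]
        total = len(words) or 1
        return {w: c / total for w, c in Counter(words).items()}

    before, after = shares(earlier_posts), shares(later_posts)
    risers = [w for w in after if after[w] - before.get(w, 0.0) > min_jump]
    # Biggest increases first.
    return sorted(risers, key=lambda w: before.get(w, 0.0) - after[w])

# Hypothetical posts about a public figure, before and after a push begins.
earlier = ["great new album out today", "loved the concert last night"]
later = ["is she secretly a spy", "spy rumours everywhere", "the spy story again"]
print(term_shift(earlier, later))  # ['spy']
```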

Mr. Morgan says that if organizations spot the misinformation early enough, they can take action.

“Let's say we detect early on that people are working together to push a narrative that Beyoncé is a Russian spy,” he says. “That's ridiculous. So if we see that early enough, before it is trending on Twitter or on InfoWars or Fox, we can come up with an alternative: Beyoncé is an American patriot.”

Cisco, the networking equipment company with a large cyber security arm, won the Fake News Challenge, a competition to design technologies that can help people detect the “stance” of a news article.

Researchers used natural language processing, whereby a computer is taught to understand the nuances of human language, to detect whether a headline is related to the body of the text, because many fake news stories copy the clickbait model to lure people to a website. The team won by combining machine learning techniques, including neural networks, which are inspired by biological processes, and decision trees, a predictive modeling approach.
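
As a very rough sketch of what headline-body stance scoring can look like (this is not Cisco's winning system; the toy data, labels and the single similarity feature are assumptions for illustration), one could feed a TF-IDF cosine-similarity feature to a decision tree classifier:

```python
# Hypothetical sketch of headline-body relatedness ("stance") scoring.
# Not the Fake News Challenge winner: one TF-IDF similarity feature fed
# to a decision tree, trained on a handful of made-up labelled pairs.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity
from sklearn.tree import DecisionTreeClassifier

pairs = [
    ("Company X recalls cars", "Company X said it would recall 10,000 vehicles.", "related"),
    ("Company X recalls cars", "A banana bread recipe that takes 20 minutes.", "unrelated"),
    ("Star spotted in Paris", "The singer was photographed leaving a Paris hotel.", "related"),
    ("Star spotted in Paris", "Quarterly results for the mining sector were flat.", "unrelated"),
]

vectorizer = TfidfVectorizer(stop_words="english")
vectorizer.fit([h for h, b, _ in pairs] + [b for h, b, _ in pairs])

def features(headline, body):
    # One feature: cosine similarity between headline and body TF-IDF vectors.
    vecs = vectorizer.transform([headline, body])
    return [cosine_similarity(vecs[0], vecs[1])[0, 0]]

clf = DecisionTreeClassifier(max_depth=2)
clf.fit([features(h, b) for h, b, _ in pairs], [label for _, _, label in pairs])

# A clickbait-style headline over an unrelated body should score as unrelated.
print(clf.predict([features("You won't believe this trick",
                            "The council approved new parking rules today.")]))
```

A production system would rely on far richer features and much more labelled data than this single similarity score.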

Digital Shadows, a San Francisco and London-based cyber security company, specializes in understanding what hackers are doing on the dark web. Companies often turn to it to monitor whether, for example, large databases of their customer data are for sale, evidence that they have suffered a security breach. It combines technology with threat intelligence analysts, some of whom have military backgrounds and many of whom speak foreign languages.

Alastair Paterson, chief executive of Digital Shadows, says the fake news spread during the US election used techniques similar to those of hacking groups.

“There's an interesting crossover between social media and cyber security right now, more than ever before,” he says. “Social networks have so far been very impotent in doing anything about it.”

Digital Shadows counts broadcasters among its clients. For some of the largest organizations, it has identified and issued takedown notices for fake websites and social media accounts in more than 100 separate incidents.

For other companies, it also finds fake domains and social media profiles. It once found an entire phoney arm of a Dutch company that had been set up online.

Distil Networks is another cyber security company with skills that could help solve the problems faced by social media. The company specializes in detecting bots, which are often used to amplify a message in the hope that it trends online.

Edward Roberts, director of product marketing at Distil Networks, says bots are becoming increasingly clever as they learn to evade detection by mimicking human behavior. “They are pausing on pages for random periods of time, they are clicking through at different rates, they are moving their mouse in less automated ways.”
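
A crude illustration of the behavioural signals he describes (the thresholds and features below are invented for the sketch, not Distil Networks' actual method) might flag sessions whose timing is suspiciously uniform:

```python
# Invented-for-illustration heuristic: humans pause and click at irregular
# intervals, while naive bots act with near-constant timing. Real systems
# combine many more signals (mouse movement, headers, IP reputation).
from statistics import pstdev

def looks_automated(page_dwell_times_s, click_intervals_s, min_spread=0.2):
    """Flag a session whose dwell and click timing barely varies."""
    return (pstdev(page_dwell_times_s) < min_spread
            and pstdev(click_intervals_s) < min_spread)

# A session pausing and clicking at almost perfectly regular intervals.
print(looks_automated([1.0, 1.1, 1.0, 1.05], [0.5, 0.5, 0.55, 0.5]))   # True
# A more human-looking session with irregular timing.
print(looks_automated([2.3, 9.8, 0.7, 31.0], [0.4, 3.2, 12.5, 1.1]))   # False
```

As Mr. Roberts notes, bots that randomize exactly these signals defeat such simple rules, which is why detection increasingly combines many features at once.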

But, he says, it is good that social media platforms have realized they have a problem, because they can find ways to identify bots in the same way rogue messages and e-mails are tagged as spam.

“Now today, we rarely see spam, it all goes to the spam folder,” he says. “It is probably not an existential threat they are dealing with.”
