Rise of the evil bot — be afraid. Very afraid
As an experiment, Tunde Olanrewaju messed around one day with the Wikipedia entry of his employer, McKinsey. He edited the page to say that he had founded the consultancy firm. A friend took a screenshot to preserve the revised record.
Within minutes, Olanrewaju received an e-mail from Wikipedia saying that his edit had been rejected and that the true founder’s name had been restored. Almost certainly, one of Wikipedia’s computer bots that police the site’s 40 million articles had spotted and corrected his entry.
It is reassuring to know that an army of such clever algorithms is patrolling the frontline of truthfulness — and can outsmart a senior partner in McKinsey’s digital practice. In 2014, bots made about 15% of all edits on Wikipedia.
But algorithms can be used for offence. And sometimes they interact with each other in unintended ways.
The need to understand such interactions is becoming urgent as algorithms become central in areas as varied as social media, financial markets, cybersecurity, autonomous weapons systems and networks of self-driving cars.
A study published last month in the research journal PLOS ONE, analysing the use of bots on Wikipedia over a decade, found that even those designed for benign purposes could spend years duelling with each other.
In one such battle, Xqbot and Darknessbot disputed 3,629 entries, undoing and correcting each other's edits on subjects ranging from Alexander the Great to Aston Villa football club.
The authors, from the Oxford Internet Institute and the Alan Turing Institute, concluded that “we know very little about the life and evolution of our digital minions”.
Wikipedia’s bot ecosystem is gated and monitored. But in many other reaches of the internet, malevolent bots, often working in collaborative botnets, can run wild.
The authors highlighted the dangers of such bots mimicking humans on social media to “spread political propaganda or influence public discourse”, and European experts have questioned whether democracy can survive the era of big data and artificial intelligence.
Is truth, in some senses, being electronically determined?
Are we becoming the “digital slaves” of our one-time “digital minions”? The scale, speed and efficiency of some of these algorithmic interactions are beyond human comprehension.
Psychologist Susan Blackmore has argued that, by creating such computer algorithms, we may have unleashed a “third replicator”, which she originally called a teme, later modified to treme.
The first replicators were genes that determined our biological evolution.
The second were human memes, such as language, writing and money, that accelerated cultural evolution. But now, she believes, our memes are being superseded by nonhuman tremes, which fit her definition of a replicator as being “information that can be copied with variation and selection”.
“Humans are being transformed by new technologies,” she said in a recent lecture. “We have let loose the most phenomenal power.”
We can be manipulated as individuals and groups
For the moment, Blackmore’s theory remains on the fringes of academic debate.
Tremes may be an interesting concept, says Stephen Roberts, professor of machine learning at the University of Oxford, but he does not think we have lost control. “There would be a lot of negative consequences of artificial intelligence algos getting out of hand,” he says. “But we are a long way from that right now.”
The more immediate concern is that political and commercial interests have learnt to “hack society”, as he puts it. “Falsehoods can be replicated as easily as truth. We can be manipulated as individuals and groups.”
His solution? To establish the knowledge equivalent of the Millennium Seed Bank, which aims to preserve plant life at risk of extinction. “As we de-speciate the world, we are trying to preserve these species’ DNA. As truth becomes endangered, we have the same obligation to record facts.”
But, as we have seen with Wikipedia, that is not always such a simple task. — © The Financial Times Ltd
Samantha Enslin-Payne is away