
“The accumulation of numbers through the information society has led to the creation of a new topology.”


- Adrian Lobe

Every day, Google processes 3.5 billion search queries. Users google everything: Resumes, diseases, sexual preferences, criminal plans. And in doing so, they reveal a lot about themselves; more so, probably, than they would like.

From the aggregated data, conclusions can be drawn in real time about the emotional balance of society. What’s the general mood like? How’s the buying mood? Which product is in demand in which region at this second? Where is credit often sought? Search queries are an economic indicator. Little wonder, then, that central banks have been relying on Google data to feed their macroeconomic models and thus predict consumer behaviour.
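
What such nowcasting might look like in its crudest form (a sketch with invented query categories, weights and numbers, not any central bank’s actual model): normalise current search volumes against their own history and fold them into a single mood index.

```python
from statistics import mean, stdev

# Hypothetical weekly query counts per category (invented numbers).
history = {
    "car loan":      [100, 110, 90, 130, 170],
    "unemployment":  [300, 320, 310, 280, 260],  # worry counts against the mood
    "new furniture": [50, 55, 60, 80, 95],
}
signs = {"car loan": +1, "unemployment": -1, "new furniture": +1}

def zscore(series, value):
    # How unusual is this week's volume relative to the category's own history?
    return (value - mean(series)) / stdev(series)

# Latest week's query counts, scored against each series' history.
latest = {"car loan": 180, "unemployment": 250, "new furniture": 100}
index = mean(signs[k] * zscore(history[k], latest[k]) for k in history)
print(f"buying-mood index: {index:+.2f}")  # positive = consumers in a buying mood
```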

The search engine is not only a seismograph that records the twitches and movements of the digital society, but also a tool that generates preferences. And if you change your route based on a Google Maps traffic jam forecast, for example, you change not only your own behaviour, but also that of other road users by changing the parameters of the simulation with your own data.

Using the accelerometers built into smartphones, Google can tell if someone is cycling, driving or walking. If you click on the algorithmically generated search prediction Google proposes when you type “Merkel”, for instance, the probability increases that the autocomplete mechanism will also display this suggestion to other users. The mathematical models produce a new reality. The behaviour of millions of users is conditioned in a continuous feedback loop. Continuous, and controlled.
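
A toy simulation can make this feedback loop concrete (an assumed rich-get-richer model, not Google’s actual ranking code): suggestions that get clicked rise in the ranking, a higher rank attracts more clicks, and an early lead becomes self-reinforcing.

```python
import random

random.seed(42)

# Three competing autocomplete suggestions, each starting with one click.
clicks = {"merkel news": 1, "merkel speech": 1, "merkel scandal": 1}

for _ in range(10_000):
    # Suggestions are shown ranked by past clicks; higher slots draw more attention.
    ranked = sorted(clicks, key=clicks.get, reverse=True)
    shown_weights = [3, 2, 1]          # top slot attracts the most clicks
    clicked = random.choices(ranked, weights=shown_weights)[0]
    clicks[clicked] += 1               # the click feeds straight back into the ranking

print(clicks)  # one suggestion ends up dominating, purely through feedback
```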

The Italian philosopher and media theorist Matteo Pasquinelli, who teaches at the Karlsruhe University of Arts and Design, has put forward the hypothesis that this explosion of data exploitation makes a new form of control possible: A “metadata society”. Metadata, he argues, could become the basis of new biopolitical techniques of mass and behavioural control, drawing on sources such as online activity in social media channels or passenger flows in public transport.

“Data,” Pasquinelli writes, “are not numbers, but diagrams of surfaces, new landscapes of knowledge that inaugurated a vertiginous perspective over the world and society as a whole: The eye of the algorithm, or algorithmic vision.”

The accumulation of figures and numbers through the information society has reached a point where they become a space and create a new topology. The metadata society can be understood as an extension of the cybernetic control society, writes Pasquinelli: “Today it is no longer a matter of determining the position of an individual (the data), but of recognising the general trend of the mass (the metadata).”

Deadly deductions

Pasquinelli doesn’t see the problem in the fact that individuals are under tight surveillance (as they were in East Germany under the Stasi), but rather in the fact that they are measured, and that society as a whole becomes calculable, predictable and controllable. As an example, he cites America’s National Security Agency’s (NSA) mass surveillance program SKYNET, in which terrorists were identified using mobile phone data in the border region between Afghanistan and Pakistan. The program analysed and put together the daily routines of 55 million mobile phone users like pieces of a giant jigsaw puzzle: Who travels with whom? Who shares contacts? Who’s staying over at a friend’s house for the night? A classification algorithm analysed the metadata and calculated a terror score for each user.

“We kill people based on metadata,” former NSA and CIA chief Michael Hayden boasted.

The cold-blooded contempt for humanity expressed in this sentence makes one shiver. The military target is no longer a human being, but the sum of their metadata. The “algorithmic eye” doesn’t see a terrorist, just a suspicious connection in the haze of data clouds. The brutal consequence is that whoever produces suspicious links or patterns is liquidated.

Thousands of people were killed in drone attacks ordered on the basis of SKYNET’s findings. It is unclear how many innocent civilians were killed in the process. The methodology is controversial because the machine-learning algorithm only learnt from already identified terrorists and blindly reproduced these results. What this means is that whoever had the same trajectories — that is, metadata — as a terrorist was suddenly considered one himself. The question is how sharply the algorithmic vision is focused. “What would it lead to if Google Trends’ algorithm was applied to social issues, political rallies, strikes or the turmoil in the periphery of Europe’s big cities?” asks Pasquinelli.
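
A minimal sketch of that circularity (invented data and a simple overlap score, not the NSA’s actual classifier): each person is scored by how closely their movement metadata matches a seed set of known suspects, so anyone who travels the same routes, a local journalist for instance, inherits the label.

```python
# Toy illustration of the circularity critics describe in SKYNET
# (invented data and scoring, not the real system).

def jaccard(a: set, b: set) -> float:
    """Overlap between two sets of visited cell towers."""
    return len(a & b) / len(a | b)

# Seed set: movement metadata of already-identified suspects.
known_suspects = [
    {"tower_12", "tower_47", "tower_03"},
    {"tower_12", "tower_47", "tower_99"},
]

def terror_score(trajectory: set) -> float:
    # A person's score is their closest match to any known suspect.
    return max(jaccard(trajectory, s) for s in known_suspects)

# A journalist reporting from the same region passes the same towers ...
journalist = {"tower_12", "tower_47", "tower_03"}
print(terror_score(journalist))  # 1.0 -> flagged, although the overlap proves nothing
```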

The data gurus have an obsession with predicting human interactions like the weather. Adepts of the “Social Physics” school of thought, founded by data scientist Alex Pentland, look at the world as if through a high-performance microscope: Society consists of atoms, with individuals orbiting their nuclei like electrons in fixed orbits. Facebook founder Mark Zuckerberg, for his part, once said he believed there was “a fundamental mathematical law underlying human social relationships”. Love? Job? Crime? Everything is determined, everything is predictable! As if society were a linear system of equations in which the variables can simply be eliminated.

Control and predictabi­lity

In Isaac Asimov’s science fiction series Foundation, mathematician Hari Seldon develops the fictional science of Psychohistory, a grand theory that combines elements of psychology, mathematics and statistics. Psychohistory models society according to physical chemistry. It assumes that the individual behaves like a gas molecule: the sometimes chaotic movements of a single individual cannot be calculated, but the general course and “state of aggregation” of society can be computed with the help of statistical laws.
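
The statistical intuition can be reproduced in a few lines (a toy biased random walk of my own construction, not Asimov’s mathematics): one individual’s path is noise, but the average over a million individuals reveals the underlying trend almost exactly.

```python
import random

random.seed(0)

def individual_step() -> int:
    # Each individual's move is a near-fair coin flip, nudged by a faint
    # societal trend: 51% up, 49% down.
    return 1 if random.random() < 0.51 else -1

# One individual over 1,000 steps: an erratic, unpredictable number.
one_person = sum(individual_step() for _ in range(1000))

# A million individuals, one step each: the faint 2% bias dominates.
society = sum(individual_step() for _ in range(1_000_000)) / 1_000_000

print(one_person)         # varies wildly from run to run
print(f"{society:+.3f}")  # reliably close to +0.020, the hidden trend
```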

In one of the novels, Emperor Cleon I tells his mathematician: “You don’t need to predict the future. Just choose a future — a good future, a useful future — and make the kind of prediction that will alter human emotions and reactions in such a way that the future you predicted will come to fruition.” Even if Seldon rejects this plan as “impossible” and “impractical”, it excellently describes the technique of social engineering, in which reality (and sociality) are constructed and individuals are reduced to their physical characteristics.

This manifests a new power technique: The crowd is no longer controlled, but predicted.

And that is the dialectical point: In its predictability, the crowd becomes completely controllable. If you know where society is heading, you can steer groups in the desired direction with manipulation techniques such as nudging, which exploit their psychological weaknesses.

Recently, an internal Google video was leaked in which the behavioural concept of a “Selfish Ledger” was presented — a kind of central register on which all user data is stored: Surfing behaviour, weight, health condition. Based on the data, Google suggests individualised options for action: Eat healthier, protect the environment, or support local business.

Analogous to DNA sequencing, the system could carry out a “behavioural sequencing” and identify behaviour patterns. Just as DNA can be changed, behaviour can also be modified. The end result of this evolution would be a perfectly programmed human being, controlled by artificial intelligence systems.

What is threatening about this algorithmic regulation is not only the subtlety of control that takes place somewhere in the opaque machine rooms of private corporations, but that a techno-authoritarian political mode could be installed, in which the masses would be a politico-physical quantity. Only what has a mass of data has weight in the political discourse.

The visionaries of technology approach politics from the standpoint of cybernetics: The aim is to avoid “disturbances” and keep the system in balance. The Chinese search engine giant Baidu has developed an algorithm that can use search inputs to predict, up to three hours in advance, where a crowd of people (“a critical mass”) will form.
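
In its simplest conceivable form, such a forecast is just a regression of later crowd counts on earlier query volumes (invented numbers below; Baidu’s actual model is not public):

```python
# Toy version of the idea: predict crowd size from map-search queries
# made three hours earlier (invented data; Baidu's real model is unknown).

queries        = [120, 150, 300, 800, 2000, 2600]   # hourly searches for a venue
crowd_3h_later = [80, 95, 210, 560, 1400, 1850]     # people counted 3 hours on

# Ordinary least squares by hand: crowd = a * queries + b.
n = len(queries)
mean_q = sum(queries) / n
mean_c = sum(crowd_3h_later) / n
a = (sum((q - mean_q) * (c - mean_c) for q, c in zip(queries, crowd_3h_later))
     / sum((q - mean_q) ** 2 for q in queries))
b = mean_c - a * mean_q

latest_queries = 3500
print(f"expected crowd in three hours: {a * latest_queries + b:.0f} people")
```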

Here the program code becomes a pre-emptive prevention policy. The promise of politics is that it is open to the future and flexible. But when the behaviour of individuals, groups and society becomes predictable, political decision-making becomes superfluous. Where everything is determined, nothing can be changed anymore.


Illustration: Hugo A. Sanchez/©Gulf News
