Khaleej Times

Why Big Data is both political and personal

- Mark MacCarthy

In the workplace, algorithms can track employees’ conversations, where they eat lunch, and how much time they spend on the computer, telephone, or in meetings

Around 1200 BC, the Shang Dynasty in China developed a factory system to build thousands of huge bronze vessels for use in everyday life and ritual ceremonies. In this early example of mass production, the process of bronze casting required intricate planning and the coordination of large groups of workers, each performing a separate task in precisely the right order. A similarly complex process went into fashioning the famous army of terracotta warriors that Qin Shi Huang, China’s first emperor, unveiled one thousand years later. According to the Asian Art Museum in San Francisco, the statues “were created using an assembly production system that paved the way for advances in mass production and commerce.”

Some scholars have speculated that these early forms of prescriptive-work technologies played a large role in shaping Chinese society. Among other things, they seem to have predisposed people to accept bureaucratic structures, a social philosophy emphasizing hierarchy, and a belief that there is a single right way of doing things.

When industrial factories were introduced in Europe in the nineteenth century, even staunch critics of capitalism such as Friedrich Engels acknowledged that mass production necessitated centralized authority, regardless of whether the economic system was capitalist or socialist. In the twentieth century, theorists such as Langdon Winner extended this line of thinking to other technologies. He thought that the atom bomb, for example, should be considered an “inherently political artifact,” because its “lethal properties demand that it be controlled by a centralized, rigidly hierarchical chain of command.”

Today, we can take that thinking even further. Consider machine-learning algorithms, the most important general-purpose technology in use today. Using real-world examples to mimic human cognitive capacities, these algorithms are already becoming ubiquitous in the workplace. But, to capitalize fully on these technologies, organizations must redefine human tasks as prediction tasks, which are more suited to these algorithms’ strengths.

A key feature of machine-learning algorithms is that their performance improves with more data. As a result, the use of these algorithms creates a technological momentum to treat information about people as recordable, accessible data. Like the system of mass production, they are “inherently political,” because their core functionality demands certain social practices and discourages others. In particular, machine-learning algorithms run directly counter to individuals’ desire for personal privacy.

A system based on the public availability of information about individual community members might seem amenable to communitarians such as the sociologist Amitai Etzioni, for whom limitations on privacy are a means to enforce social norms. But, unlike communitarians, algorithms are indifferent to social norms. Their only concern is to make better predictions, by transforming more and more areas of human life into data sets that can be mined.

Moreover, while the force of a technological imperative turns individualist Westerners into accidental communitarians, it also makes them more beholden to a culture of meritocracy based on algorithmic evaluations. Whether it is at work, in school, or even on dating apps, we have already become accustomed to having our eligibility assessed by impersonal tools, which then assign us positions in a hierarchy.

To be sure, algorithmic assessment is not new. A generation ago, scholars such as Oscar H. Gandy warned that we were turning into a scored-and-ranked society, and demanded more accountability and redress for technology-driven mistakes. But, unlike modern machine-learning algorithms, older assessment tools were reasonably well understood. They made decisions on the basis of relevant normative and empirical factors. For example, it was no secret that accumulating a lot of credit card debt could hurt one’s creditworthiness. By contrast, new machine-learning technologies plumb the depths of large data sets to find correlations that are predictive but poorly understood. In the workplace, algorithms can track employees’ conversations, where they eat lunch, and how much time they spend on the computer, telephone, or in meetings. And with that data, the algorithm develops sophisticated models of productivity that far surpass our commonsense intuitions. In an algorithmic meritocracy, whatever the models demand becomes the new standard of excellence.

Still, technology is not destiny. We shape it before it shapes us. Business leaders and policymakers can develop and deploy the technologies they want, according to their institutional needs. It is within our power to cast privacy nets around sensitive areas of human life, to protect people from the harmful uses of data, and to require that algorithms balance predictive accuracy against other values such as fairness, accountability, and transparency.

Mark MacCarthy is a member of the faculty at Georgetown University. - Project Syndicate

