Cape Argus

The life cycle of norms

Africa could be testing ground for tech-enabled social engineering

- KAREN ALLEN | Allen is a Senior Research Adviser, Emerging Threats in Africa, Institute for Security Studies

WITHIN a year, much of the world has adopted the norm of wearing masks to protect against the Covid-19 pandemic.

Notwithstanding the political jostling that such face coverings have come to represent, it has become a social norm driven by circumstance.

Scholars have undertaken extensive work on the life cycle of norms to demonstrate how they cascade into society and eventually become internalised.

But to what extent does technology have a normative function – the power to shape human behaviour and deliver real-world consequences?

In the absence of robust safeguards, and in states with fragile democracies, could Africa become a testing ground for tech-enabled social engineering: shaping norms and beliefs, governing how we vote and who we love, and stirring up existing ethnic or religious cleavages?

Information disorders expert Eleonore Pauwels argues that the convergence of artificial intelligence and data-capture capability threatens to undermine institutions that form the bedrock of democracies.

The rapid emergence of artificial intelligence tech tools across Africa, coupled with powerful social media platforms such as Facebook, Reddit and Twitter, has made data a commodity.

Some commentators describe it as the new oil. These tech tools include biometric databases for tracking population movements at borders, registering voters before elections or documenting key life events (births, marriages and deaths).


Besides capturing human behaviour, likes and preferences, technology potentially has the power to shape them, Pauwels argued at a webinar on surveillance and information disorder in Africa last month.

Artificial intelligence and data-capture technologies together form a powerful alliance that enables micro-targeting and precision messaging, she says.

Institute for Security Studies (ISS) research shows that the “digital exhaust” we leave behind on the internet – and the personal biometric information captured on CCTV cameras in shops or from centralised databases when we register to vote or apply for a driving licence – provides the raw material for data manipulation in Africa.

According to Pauwels, human beings are rapidly becoming “data points” or “digital bodies and minds” whose exact location and biometric features can be matched in real time. This can have profound implications for personal privacy and security.

She says that unless checked, machine learning technologies have the potential to override or shape human judgement and political agency. This is especially true in settings where democratic checks and balances are still fragile.

For this reason, numerous African countries including Zimbabwe and Kenya have been the focus of her work.

The purpose of analysing our “digital bodies and minds” is, among other things, to manipulate group conversations and behaviours for political or commercial gain. This can create chaos or assert control, particularly during elections or periods of national emergency such as a war or pandemic.


The ISS has demonstrated how potent algorithms can help amplify xenophobic narratives.

The South African case study shows how messages can reach far beyond what might be expected in the “real” (rather than virtual) world. Pauwels’s research builds on this idea, highlighting the use of botnets by those wishing to control a message for viral propagation and to game search engines and algorithmic content curation.


Such tools can also generate fake intelligence scenarios, paving the way to what some scholars have described as digital dictatorships and providing a pretext for social control and for securitising legislation aimed at curbing its use.

Such fake scenarios can enable illiberal states to silence dissent. Shutdowns of social media platforms have already been observed in Uganda and Ethiopia in recent months, justified on the grounds that national security is under threat.

And the very existence of such data monitoring, using equipment provided by foreign entities such as China, can impose new surveillance norms on populations that host the latest technology.

This “cyber nationalism” potentially normalises pervasive digital surveillance, and there’s scope for much research to be done on the role of foreign actors in this sphere.

For all these reasons, policymakers need to consider the blind spots of such mass-capture technologies. Although new data laws are coming on stream, setting out strict rules on how data is captured and stored and limiting its reuse, the enforcement of these new regulations will be severely tested.

In the rush to develop centralised biometric databases, algorithms need to be open to inspection, and a new culture of ethical technology (possibly with incentives and sanctions) must be developed.

