Gulf News

We are conditioned to view data as a threat, but it can be the opposite if all parties understand the deal into which they are entering. We ought to knowingly contribute our information to something bigger than ourselves.

- Kevin Keith

“Raise your hands if you trust Facebook, if you trust Google, if you trust government.” It was spring 2017, and I was leading a debate with young people in Canberra. “Has anybody heard of Cambridge Analytica?” Heads shook.

I explained behavioural communication: how Cambridge Analytica built “psychographics” from Facebook data to measure personality and motivation — what you choose and why you choose it — with the intention of influencing how people vote. The room fell silent; people looked alarmed.

It is only when spiders are seen that they scare us. Despite half the world being online, with three billion people using social media, there has been little conversation in popular culture about the collection, availability and use of personal data. Tick-box-click-accept conditions are no more than packaging torn off to get to the product; informed consent is confined to the bygone days of people on street corners with clipboards.

The extraordinary exposé in the Guardian helps, but this issue is bigger than Facebook and Cambridge Analytica. It is about more than the present, as we approach a data-reliant future of biometric identification, artificial intelligence and the internet of things. It is about us, our relationship with technology, and our expectations for society.

Firstly, technology is not the problem. It is a means to an end. A tool that enables us to progress or decline, to empower or to exploit. To give due consideration to future social, economic, political and environmental ramifications, or not.

If only due consideration had been applied prior to the mass production of non-biodegradable plastics — or to civil liberties prior to enabling the aggregation of vast quantities of personal data. Yet we drive towards our future with eyes fixed firmly on the rear-view mirror.

Some commentators have declared social media a “new phenomenon” with regulators struggling to keep pace. Nonsense. YouTube is 13 years old. Facebook is 14 years old. Google is 20 years old.

In 1895 there were 14 cars in England. By 1910 there were 100,000 cars. Surely such extraordinary disruption would have delayed appropriate legislation? Hardly: the Motor Car Act was introduced in 1903.

So why the inaction? In part it is because many view data as intangible and esoteric. I struggle to accept this. Stocks and shares close every news bulletin, yet we cannot hold them in our hand. Enough people bought Bitcoin to make the digital currency the fastest-growing asset in the world last year. Apparently, “a digital currency in which encryption techniques are used to regulate the generation of units of currency and verify the transfer of funds” is somehow less esoteric, less intangible.

The real problem is two-fold.

Despite its importance, there has been no public information campaign on data. Even the former Australian attorney general struggled to explain metadata when making the case for greater access to it. There has been no campaign on the different types of data as defined by the Open Data Institute: Closed, shared and open. Not all data is the same.

Nor is there a notion of it being co-created. The debate on data ownership will only improve when we turn to each other and realise multiple parties are involved in its creation.

Try and make it open

Instead, we are conditioned to view data as a threat. It is the breach, the cyber-attack, the tool of the thought police. It is something we give to government for a driving licence, which they then use for surveillance without pausing to ask us if this is OK. So society’s response to data is to put up walls, to control, to close, to #deletefacebook. Yet paradoxically, data can be viewed as the opposite: An asset, an opportunity, co-created, with walls removed, datasets connected, analysis undertaken and lives improved. The starting point for data must first be to try and make it open.

Secondly, our system of governance is slowly evolving, with power being distributed from the hands of the few to the hands of the many — vertical to horizontal. The trust that once sustained traditional hierarchies is gradually being transferred to one another, and the tools to publish and promote our views are literally in our hands. We have the power to leave the European Union. We have the power to elect Donald Trump. Yet we do not have the governance in place to ensure these tools are fit for purpose and free from manipulation.

As wearable wristbands and smartphones become more common, it will be individuals who hold health data, which collectively could improve our health system beyond recognition; it will be individuals who hold mobility data, to help plan our cities in real time. Yet without structures and related obligations in place, it is likely this data will be siloed for commercial or other reasons, and not used for a broader societal good. This is why there must be complete societal reform and why we need a new social contract.

Plato, Socrates, Rousseau, Hobbes, Locke — the great philosophers understood the moral and political obligations, the written and unwritten rules, that form a society in the real world. Yet there are obligations in the digital world too. Hobbes in particular hypothesised about an anarchic life prior to social order, called a “state of nature”, where the freedom to take meant others could take what you had, leading to fear and distrust. Today we are living through a “state of data”.

A lack of moral obligations surrounding the use of personal data is emphasised by Cambridge Analytica showcasing its work at conferences and by world leaders retrospectively scrambling to distance themselves from the firm.

Yet it’s not too late to build a new social contract for the digital age to cover rights, responsibilities and expectations: explicit rules seeking to protect and empower all, like the European Union’s General Data Protection Regulation coming into force in May, and implicit rules, based on better understanding, that would enable us to respect each other’s digital space in much the same way we tacitly respect personal space; rules under which all parties fully understand the value of data and the deal into which they are entering.

We need a social contract that enables us to knowingly contribute our data to something bigger than ourselves, where political, industry and civic leaders maintain the debate or support a campaign on data awareness long after the dust has settled on Facebook and Cambridge Analytica.

In much the same way as we look left and right before crossing the road, we should pause before handing over our data unless we have at least a tacit understanding of how it will be used.

This is the conversation the world has to have, as the value of data is intrinsically linked to the value we place in each other. Only a new social contract that encompasses the digital world will enable us to fulfil data’s potential, expand the definition of “us”, strengthen democracy and ultimately improve lives.


■ Kevin Keith is a writer, speaker, urbanist, and company director of GovHack, Australia’s largest open government and open data hackathon.
