The Herald (South Africa)

The challenge of ethics in artificial intelligence

- Bev Hancock is managing director at the Kamva Leadership Institute

I watched a Sandra Bullock movie in which her identity was completely re-engineered by sinister forces. That re-engineering of names, histories and criminal records has since become a reality.

With the growing capability and presence of artificial intelligence-driven systems and the increasing use of data analytics to target you, who is responsible for the ethics, accountability and risk implications at a personal, community, national and global level?

The problem is that government and legislation are reactive.

Facebook is only now dealing with cyber-bullying and offensive language in comments; legislation and company policies are only now catching up with inappropriate social media behaviour; and, apparently, you have to get an import licence if you order more than three times from an online platform.

The King III report has emphasised the importance of IT governance, but does it go far enough, and are we asking the right questions?

The Berkman Klein Center at Harvard is asking: “How do we ensure that AI systems serve the public good rather than exacerbate existing inequalities and biases?”

Is it time for a global code of ethics, similar to environmental and other treaties?

The three megatrends driving AI are physical, digital and biological.

The development of robotics, 3D printing, autonomous cars, drones and virtual-reality equipment is heralded as disruptive technology.

However, these technologies have serious implications for the future of labour, safety and ethical decision-making, for the crossing of geopolitical boundaries, and for the social bias of navigating unmapped informal communities.

The use of sensors, DNA mapping and brain-scanning in the growing field of neuroscience, along with implanted chips, CCTV cameras and GPS tracking systems, is crossing the physical and psychological boundaries of personal space and privacy.

These technologies collect vast amounts of personal data, which are inevitably owned by the collector.

If you dislike a medical-aid committee making decisions about your healthcare, what if it could make those decisions based on urine samples taken from company bathrooms (yes, the technology exists), or suggest gene therapy based on your DNA profile?

What if these decisions were being made by a machine and not a human?

The Edelman Trust Barometer places trust in politics, the media and business at an all-time low.

Can we trust them to rise above self-interest to communicate, legislate and provide governance frameworks on our behalf?

Ultimately, however, ethics, accountability and governance can only truly work if the locus of control is internal and not external.

It starts with us. It is about the questions we ask before we build AI technologies.

Are we protecting personal data and privacy through technologies like blockchain? Do we have clear AI governance and IT policies? Are we building digital literacy into our skills plans, and are we rewarding the right behaviour?

And, most importantly, are we consciously safeguarding our humanity?

Nanny software or a DStv code does not replace parental attention.

Digital platforms supplement human connection, but they do not replace it; an emoji hug is never the same as a real one.

Digital ethics start at home.

Are you walking the talk with your children, your employees, your customers?

Are you preparing them to thrive not only now, but also into the future?

Ethics is not the sole responsibility of government and leadership; it is everyone’s responsibility, and our very future depends on it.

