Cape Times

George Floyd inspires major facial recognition decision

- WESLEY DIPHOKO

THE BRUTAL murder of George Floyd has inspired an important decision in the development of facial recognition technology.

IBM has taken a very significant decision aimed at preventing abuse of technology in the hands of the police. In a letter to the US Congress, IBM chief executive Arvind Krishna wrote: “IBM no longer offers general purpose IBM facial recognition or analysis software. IBM firmly opposes and will not condone uses of any technology, including facial recognition technology offered by other vendors, for mass surveillance, racial profiling, violations of basic human rights and freedoms, or any purpose which is not consistent with our values and principles of trust and transparency.”

This decision is important for several reasons. One is that the future of security systems will likely depend on facial recognition. It is, therefore, important to get it right now and, unfortunately, there is a lot wrong with facial recognition in its current form.

The technology has been blamed for racial bias. Researchers have found on numerous occasions that systems scrutinising our facial features are significantly less accurate for people with dark skin. In those studies, commercial algorithms proved near perfect at identifying the gender of men with lighter skin, but frequently erred when analysing images of women with dark skin.

The skewed accuracy appears to be due to under-representation of darker skin tones in the training data used to create the face-analysis algorithms.

If this technology is used in its current form, we are likely to see wrongful arrests and further violations by the police.

Although Big Blue’s decision to halt this technology will not by itself stop other tech companies from developing it, it is still a step in the right direction. Amazon has followed suit, and chances are that more tech companies will too.

A more important reason this decision is critical has little to do with the technology itself: the development of technology should be aligned with the ethical values of society.

Technologi­es that violate human rights and values should not be allowed to exist.

Human beings should not be held hostage by the very technology that they create.

Ethics should be a key consideration as we develop the technologies of the future. This is true of many other technologies now being developed to shape the future.

It is up to human beings to develop technologies that do not trample on the rights of individuals.

In addition, to guard against developing technologies that turn against human beings, the education of technologists should include ethics and the humanities, not only computing and commerce.

Now that IBM has acknowledged the real danger of using some of these technologies, attention should shift to others that may negatively impact human lives.

IBM is not alone in developing technologies that are now considered harmful. As long as other companies continue to use technologies that are known to be harmful, the impact of IBM’s decision will be minimal.

For a very long time, technology companies have managed to get away with murder in the name of innovation. The recent announcement by IBM is a clear indication that innovation projects should undergo strict scrutiny before they are deployed.

Researchers have for years warned about the problems with facial recognition.

As society reflects on the use of technology by the police, there is an opportunity to correct harmful features and maintain features that can advance humanity.

Wesley Diphoko is the editor-in-chief of Fast Company (SA). He can be reached on Twitter via @WesleyDiphoko.

WASHINGTON County Sheriff’s Office Deputy Jeff Talbot demonstrates how his agency used facial recognition software to help solve a crime, at their headquarters in Hillsboro, Oregon. Amazon said on Wednesday that it will ban police use of its facial recognition technology for a year in order to give Congress time to come up with ways to regulate the technology. | AP
