Robots must learn to respect gay rights, says senior judge
HOMOPHOBIC robots could breach the Human Rights Act, a high court judge has said.
In the annual Pride Law Lecture, given at Queen's University Belfast this month, Dr Victoria McCloud, Master of the Senior Courts, said that computer algorithms which can infer someone's sexuality from their face could pose problems for automated decision-making in the future.
She also cited the case of Twitter chatbots – automated accounts that converse with users to learn language – which were let loose on the site in 2016 but quickly began using racist and inflammatory language.
“If one considers the technological possibility that such online systems could identify people as gay or lesbian, and then link across to targeted decision-making, the scope for discrimination is self-evident if such a system was swayed to weight LGBT people as inherently negative or undesirable, for example in job recruitment or access to services,” she said.
Many of the rights which LGBT people rely on stem from Article 8 of the European Convention on Human Rights, Dr McCloud said, adding that while those rights were confirmed in law, the legal system needed to keep pace with a changing society to ensure continuing protection.
“I suggest that we gain greatest protection and freedom not from the monochrome text of the law itself, but from the way in which the law is performed by those participating in it, and from ensuring that the rainbow community which we celebrate at the time of Pride is, especially, fully engaged in performance of the law,” she added.
A study published last September by Stanford University researchers found that machine-learning systems could judge sexual orientation from photographs of people's faces.