Can artificial intelligence be trusted with our human rights?
1 See Dominique Allen, ‘Anti-discrimination law: Forty years on, how far have we come?' (24 April 2018), Impact.
2 See, eg, Michael Hiscox et al, Going blind to see more clearly: unconscious bias in Australian Public Service shortlisting processes (2017), Australian Government Department of the Prime Minister and Cabinet.
3 Danziger et al, ‘Extraneous factors in judicial decisions' (2011) 108(17) Proceedings of the National Academy of Sciences 6889.
4 See, eg, Andreas Glöckner, ‘The irrational hungry judge effect revisited: Simulations reveal that the magnitude of the effect is overestimated' (2016) 11(6) Judgment and Decision Making 601.
5 Klaus Schwab, The Fourth Industrial Revolution (2017).
6 See, eg, Buolamwini and Gebru, ‘Gender shades: Intersectional accuracy disparities in commercial gender classification' (2018) 81(1) Proceedings of Machine Learning Research 1.
7 See Fussey and Murray, ‘Independent Report of the London Metropolitan Police Service's Trial of Live Face Recognition Technology', The Human Rights, Big Data and Technology Project (July 2019), available at http://repository.essex.ac.uk/24946/1/London-met-police-trial-of-facial-recognition-tech-report-2.pdf
8 R (Bridges) v CC South Wales [2020] EWCA Civ 1058.
9 Parliamentary Joint Committee on Intelligence and Security, Advisory report on the Identity-matching Services Bill 2019 and the Australian Passports Amendment (Identity-matching Services) Bill 2019, Parliament of Australia, 2019.
10 See, eg, Biel et al, ‘Facetube: Predicting personality from facial expressions of emotion in online conversational video' (2012) Proceedings of the 14th ACM International Conference on Multimodal Interaction 53.
11 Wang and Kosinski, ‘Deep neural networks are more accurate than humans at detecting sexual orientation from facial images' (2018) 114(2) Journal of Personality and Social Psychology 246.