Stephanie Dinkins
Artist and associate professor of art, Stony Brook University; fellow, Data & Society Research Institute.
My journey into the world of AI began when I befriended Bina48 — an advanced social robot that is black and female, like me. The videotaped results of our meetings form an ongoing project called “Conversations with Bina48”. Our interactions raised many questions about the algorithmically negotiated world now being constructed. They also pushed my art practice into focused thought and advocacy around AI as it relates to black people — and other non-dominant cultures — in a world already governed by systems that often offer us both too little and overly focused attention.
What happens when an insular subset of society encodes governing systems intended for use by the majority of the planet? What happens when those writing the rules — in this case, code — might not know, care about, or deliberately consider the needs, desires, or traditions of the people their work impacts? What happens if code-making decisions are disproportionately informed by biased data, systemic injustice, and misdeeds committed to preserve wealth “for the good of the people”?
I worry that AI development — reliant as it is on the privileges of whiteness, maleness, and money — cannot produce an AI-mediated world of trust and compassion that serves the global majority in an equitable, inclusive, and accountable manner.
People of colour, in particular, cannot afford to consume AI as mere algorithmic systems. Those creating AI must realise that systems working for the betterment of people who are not at the table are good, and systems that collaborate with and hire those missing from the table are even better.