How prepared are we for deepfakes? Researchers call for shift in AI to protect women
In the picture, a blond woman in a bikini stands on the beach. A line then flashes across the screen, exposing her nude figure.
"Use undress AI to deep‐ nude girl for free!" reads the description on the site.
Although it says consent is required, it only takes a few clicks to upload an image and see the person in it undressed.
Since last summer, the number of sites with publicly available AI image tools has multiplied, drawing millions of views, and AI-doctored photos of underage girls have already been shared by high school students in London, Ont., and Winnipeg. No charges have been laid in either case.
But abuse of the technology has been prosecuted in Quebec. Last year, a man from Sherbrooke in the Eastern Townships was sentenced to three years in prison for creating at least seven deepfake videos depicting child pornography.
Quebec, like the rest of the country, may not be prepared to deal with this ascendant AI technology, according to intellectual property lawyer Gaspard Petit.
And as Ottawa plays catch-up in regulating harmful content on the internet, researchers are calling for greater diversity and transparency to stop women from being targeted by the technology without their consent.
Petit says he has been taking a closer look at the development of AI technology as it continues to evolve.
"I think there's a general consensus that in Quebec, we're not quite prepared - in Canada as a whole," he said.
According to Petit, protections in the Quebec charter and existing laws already safeguard people's privacy and reputation.
He says nude deepfake cases can fall into a legal grey zone in which it's not always clear if it's possible to criminally prosecute a person who produces or distributes them - something he says Canadian legislators are debating how to improve.
One problem, Petit says, is that the onus falls on the victim to prove they have been harmed and who is responsible - and then, if they have the means, to sue.
But he says the bigger issue is preventing the creation or distribution of the images in the first place.
Fixing the gender disparity
Dongyan Lin, a researcher at MILA, a Montreal-based artificial intelligence institute, studies the link between neuroscience and AI. She says these deepfakes are a "great example of not having women in the decision-making process."
As a result, she says there are blind spots at these companies in thinking about how the technology would be used "once it's massively commercialized."
Affecting Machines, a project developed at Concordia University, tries to bridge the gender gap in AI and STEM by promoting the work of women in the field.
Lindsay Rogers, knowledge mobilization advisor at Concordia's Applied AI Institute, is one of the people involved in the project.
"Gender diversity is really fundamental for having AI systems that are representa‐ tive of the populations that use them," she said.
"It's not just about the numbers, like AI [labs] hiring more women or non-binary folks in a room. It's about creating a culture and an atmosphere where they can succeed and do well and become valued members of the team," she said, putting the percentage of women working in tech at around a quarter, a figure that has barely crept up in the past two decades.
Ethics training, other solutions
Along with stricter regulations and public hearings on AI use, Lin says mandatory ethics training would help AI developers gain a broader understanding of how the technology could be used by the public.
Banning sites that use deepfake technology is also an option, but experts like Sasha Luccioni, a Montreal-based research scientist at AI company Hugging Face, point to tools that allow users to skirt bans in the countries where they're based.
Other technical solutions, like making images unusable by AI models, are also on the table, but none of these solutions address the problem at its core, says Luccioni.
The root of the problem, she says, is how people decide to use the available technology - including using it to objectify women's bodies.
For that problem, she says the solution is educating the public and raising awareness.