Montreal Gazette

Let's do more to prevent harm of deepfakes

Legal recourse is needed for victims of AI images, writes Katheryne Soucy.

- Katheryne Soucy is a law student at the University of Ottawa.

The creation of non-consensual “deepfakes” is the latest trend to perpetuate the cycle of gendered violence.

Deepfakes are pictures or videos that have been created or altered with the use of artificial intelligence. While AI is not inherently evil, it sometimes finds itself in the hands of users who cannot or will not see the moral issue at stake. The majority of deepfake content found online is non-consensual pornography, and the majority of it targets women.

Canada needs better legal recourse for victims of non-consensual deepfakes to hold perpetrators accountable.

It's becoming increasingly clear that deepfakes have impacts similar to those of what's called the "non-consensual distribution of intimate images" (NCDII). One study found that, like revenge porn, "deepfakes are used to control, intimidate, isolate, shame and micromanage victims," mostly women.

Victims of deepfakes also experience anxiety about who has viewed this content or when they might see it next. Even if the content is taken down, it may already have been shared or saved to personal devices.

The new Online Harms Act holds promise with the creation of the Digital Safety Commission, which is to work in tandem with social media platforms to restrict the proliferation of deepfake content. Platforms will need to implement tools to flag harmful content. More important, content that is deemed to be harmful, such as NCDII, is to be taken down within 24 hours.

While this is a step in the right direction, we are still not holding perpetrators accountable.

The Online Harms Act does not introduce any changes to the Criminal Code. Yet the quality of deepfakes has improved, making them difficult to distinguish from unaltered images, and victims face the same consequences as with NCDII. Non-consensual deepfakes of a sexual nature should therefore carry the same judicial consequences: the Criminal Code should be amended to criminalize them.

It would be ideal to implement a new tort that recognizes deepfakes as a social and ethical wrong. This tort could be applied when a defendant distributes non-consensual deepfakes of the plaintiff. A perpetrator should not be able to use the defence that they used media voluntarily uploaded online by the plaintiff. The issue at stake is how these once-consensual images are being used.

The only way to avoid becoming a victim of deepfakes is to limit your online presence. But not having an online presence can be disadvantageous in today's reality. It would also place the responsibility on victims, mainly women, instead of condemning perpetrators.

While, ideally, it would be best to stop the creation and sharing of deepfakes before they appear, doing so can be very difficult, if not impossible. Victims need access to better judicial recourse to hold perpetrators accountable.
