Artists need protection from ‘deepfakes’
Today, it’s possible to easily find online “deepfakes” — artificial-intelligence-generated images, videos, and audio that are not grounded in reality — of most politicians and celebrities.
They could be depicted doing something criminal or humiliating, and some viewers will believe the images are real. They could be shown performing, again with viewers taking the footage for genuine, diverting attention and sales from their real work.
But these public figures and their estates have little recourse to protect their reputations and income. Only thirty-three states currently recognize a statutory right of publicity, and the laws vary wildly.
Sens. Chris Coons, D-Del., Marsha Blackburn, R-Tenn., Amy Klobuchar, D-Minn., and Thom Tillis, R-N.C., recently released a discussion draft of a bipartisan bill that aims to fix this for artists, protecting them from deepfakes by holding those who produce them liable in civil lawsuits. By essentially federalizing likeness laws, the NO FAKES Act gives the original actors, singers, and public figures control over their voices and images, preventing others from releasing AI-generated content without their consent.
The NO FAKES Act is an excellent first step as we begin to navigate the murky waters of artificial intelligence. It helps protect the jobs of struggling voice, stage, and film actors, as well as singers, who would otherwise be competing against retired, or even deceased, public figures for work. It can also help prevent misinformation.
As we head into a national election, likely to be extremely divisive, it's important that we can trust the videos and images that spread online. The proliferation of deepfakes can deepen our distrust of what is real while increasing our trust in what was never real in the first place.
SAG-AFTRA supports the measure, with President Fran Drescher saying: “A performer’s voice and their appearance are all part of their unique essence, and it’s not ok when those are used without their permission. Consent is key.”
Without consent, deepfakes can misrepresent an artist’s values and right to self-determine. As Mr. Coons pointed out, “an AI-generated version of Tom Hanks was used in advertisements for a dental plan that he never appeared in or otherwise endorsed.” Mr. Hanks deserves to be compensated for this violation.
Deepfakes can also cause real trauma. Zelda Williams, the daughter of the late actor Robin Williams, has had to confront deepfakes of her father on the internet, without warning or consent.
She wrote: “These recreations are, at their very best, a poor facsimile of greater people, but at their worst, a horrendous Frankensteinian monster, cobbled together from the worst bits of everything this industry is, instead of what it should stand for.”
The NO FAKES Act cannot put the deepfake genie back in the bottle. And creatives should be able to push the limits of what artificial intelligence can do. They should do so, though, with careful thought as to the real damage that deepfakes can do. Allowing artists whose likenesses have been stolen to sue is only just.