Deep fakes are where truth goes to die
What are "deep fakes" and how can we protect ourselves against them?
At a time when "fake news" regularly makes headlines, a new technology for manipulating audiovisual material has recently emerged. It has enabled a practice known as the "deepfake", which notably consists of posting videos that make public figures say things they never said. How does this technology work, and how can we guard against it?
1. In May, a video appeared on the internet of Donald Trump offering advice to the people of Belgium on the issue of climate change. “As you know, I had the balls to withdraw from the Paris climate agreement,” he said, looking directly into the camera, “and so should you.” The video was created by a Belgian political party, Socialistische Partij Anders, or sp.a, and posted on sp.a’s Twitter and Facebook. It provoked hundreds of comments, many expressing outrage that the American president would dare weigh in on Belgium’s climate policy.
2. But this anger was misdirected. The speech, it was later revealed, was nothing more than a hi-tech forgery. Sp.a claimed that they had commissioned a production studio to use machine learning to produce what is known as a “deep fake” – a computer-generated replication of a person, in this case Trump, saying or doing things they have never said or done. Sp.a’s intention was to use the fake video to grab people’s attention, then redirect them to an online petition calling on the Belgian government to take more urgent climate action.
WHAT IS A GAN?
3. Fake videos can now be created using a machine learning technique called a “generative adversarial network”, or GAN. A graduate student, Ian Goodfellow, invented GANs in 2014 as a way to algorithmically generate new types of data out of existing data sets. For instance, a GAN can look at thousands of photos of Barack Obama, and then produce a new photo that approximates those photos without being an exact copy of any one of them, as if it has come up with an entirely new portrait of the former president not yet taken. GANs might also be used to generate new audio from existing audio, or new text from existing text – it is a multi-use technology.
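Goodfellow's adversarial setup pits two models against each other: a generator that fabricates data and a discriminator that tries to tell fabrications from real samples. The toy sketch below illustrates that dynamic in one dimension with hand-derived gradients; the data, parameters, and learning rates are all invented for illustration, not drawn from any real deep fake system.

```python
import numpy as np

# Minimal 1-D GAN sketch (illustrative assumptions throughout).
# Real data: samples from a Gaussian centred on MU_REAL.
# Generator: x_fake = theta + z, a single learnable shift of noise z.
# Discriminator: logistic classifier D(x) = sigmoid(a*x + b).
# D learns to separate real from fake; G learns to shift its output
# so that D no longer can.

rng = np.random.default_rng(0)
MU_REAL = 4.0

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

theta = 0.0          # generator parameter (starts far from MU_REAL)
a, b = 0.1, 0.0      # discriminator parameters
lr_g, lr_d = 0.05, 0.05
batch = 64

for step in range(2000):
    x_real = rng.normal(MU_REAL, 0.5, batch)
    z = rng.normal(0.0, 1.0, batch)
    x_fake = theta + z

    # Discriminator ascent on: mean log D(real) + mean log(1 - D(fake))
    d_real = sigmoid(a * x_real + b)
    d_fake = sigmoid(a * x_fake + b)
    grad_a = np.mean((1 - d_real) * x_real) - np.mean(d_fake * x_fake)
    grad_b = np.mean(1 - d_real) - np.mean(d_fake)
    a += lr_d * grad_a
    b += lr_d * grad_b

    # Generator descent on the non-saturating loss: -mean log D(fake)
    d_fake = sigmoid(a * x_fake + b)
    grad_theta = -np.mean((1 - d_fake) * a)
    theta -= lr_g * grad_theta

print(f"generator shift theta = {theta:.2f} (real data mean = {MU_REAL})")
```

After training, the generator's shift hovers near the real data's mean: the generator has learned to produce samples the discriminator can no longer distinguish from real ones, which is exactly the mechanism behind a GAN-made "new portrait".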
4. The use of this machine learning technique was mostly limited to the AI research community until late 2017, when a Reddit user who went by the moniker “Deepfakes” started posting digitally altered pornographic videos. He was building GANs using TensorFlow, Google’s free open-source machine learning software, to superimpose celebrities’ faces on the bodies of women in pornographic movies. In response, Reddit banned the user for violating the site’s content policy. By this stage, however, the creator of the videos had released FakeApp, an easy-to-use platform for making forged media. The free software effectively democratized the power of GANs. Suddenly, anyone with access to the internet and pictures of a person’s face could generate their own deep fake.
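Face-swap tools of this era are commonly described as a pair of autoencoders sharing one encoder (an assumption on my part; the article only says the videos were built with TensorFlow). The shared encoder learns an identity-agnostic code for pose and expression, while each decoder learns to render one specific face. The toy numpy sketch below shows only the swap mechanics, with untrained random weights standing in for learned networks.

```python
import numpy as np

# Sketch of the shared-encoder / two-decoder face-swap idea
# (hypothetical architecture; all sizes and weights are invented).
rng = np.random.default_rng(1)
D_IMG, D_LATENT = 64, 8   # toy "image" and latent dimensions

W_enc = rng.normal(size=(D_LATENT, D_IMG)) / np.sqrt(D_IMG)       # shared encoder
W_dec_a = rng.normal(size=(D_IMG, D_LATENT)) / np.sqrt(D_LATENT)  # decoder: person A
W_dec_b = rng.normal(size=(D_IMG, D_LATENT)) / np.sqrt(D_LATENT)  # decoder: person B

def encode(face):
    return W_enc @ face          # identity-agnostic code (pose, expression)

def decode(code, W_dec):
    return W_dec @ code          # re-render in one identity's style

face_a = rng.normal(size=D_IMG)  # stand-in for a video frame of person A

# Training would fit (encode -> decode_a) on A's photos and
# (encode -> decode_b) on B's photos. The face-swap step is then:
swapped = decode(encode(face_a), W_dec_b)   # A's expression, B's face
print(swapped.shape)
```

Because both decoders read the same latent space, feeding A's code into B's decoder renders B's face with A's pose, which is why a folder of photos of each person is all these tools need.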
THREATS
5. When Danielle Citron, a professor of law at the University of Maryland, first became aware of the fake porn movies, she was initially struck by how viscerally they violated these women’s right to privacy. But once she started thinking about deep fakes, she realized that if they spread beyond the trolls on Reddit they could be even more dangerous. Citron, along with her colleague Bobby Chesney, began working on a report outlining the extent of the potential danger. As well as considering the threat to privacy and national security, both scholars became increasingly concerned that the proliferation of deep fakes could catastrophically erode trust between different factions of society in an already polarized political climate. In particular, they could foresee deep fakes being exploited by purveyors of “fake news”.
6. Nonetheless, research into machine learning-powered synthetic media forges ahead. In August, an international team of researchers affiliated with Germany’s Max Planck Institute for Informatics unveiled a technique for producing what they called “deep video portraits”, a sort of facial ventriloquism, where one person can take control of another person’s face and make it say or do things at will. Christian Theobalt, a researcher involved in the study, told me via email that he imagines deep video portraits will be used most effectively for accurate dubbing in foreign films, and special effects.
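The "facial ventriloquism" idea can be pictured with a parametric face model in which a frame is identity plus a weighted sum of expression components; reenactment then copies the source actor's expression parameters onto the target's identity before re-rendering. This is a hypothetical toy model of mine, not the Max Planck pipeline, whose tracking and rendering are far more sophisticated.

```python
import numpy as np

# Toy parameter-transfer sketch of facial reenactment
# (hypothetical linear face model; all values are invented).
rng = np.random.default_rng(2)
D_MESH, D_EXPR = 30, 5    # toy face-mesh and expression dimensions

def render(identity, expr_coeffs, expr_basis):
    # A frame = neutral identity + expression deformation.
    return identity + expr_basis @ expr_coeffs

expr_basis = rng.normal(size=(D_MESH, D_EXPR))
id_source = rng.normal(size=D_MESH)    # the "ventriloquist"
id_target = rng.normal(size=D_MESH)    # the face being controlled

expr_source = rng.normal(size=D_EXPR)  # expression tracked in the source frame

# Reenacted frame: target's identity driven by the source's expression.
reenacted = render(id_target, expr_source, expr_basis)
print(reenacted.shape)
```

Swapping only the expression coefficients is also what makes Theobalt's dubbing use case plausible: the target actor's identity never changes, only the mouth and face motion.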
DETECTION METHODS
7. Hany Farid, professor of computer science at the University of California, Berkeley, believes that although the machine learning-powered breakthroughs in computer graphics are impressive, researchers should be more cognizant of the broader social and political ramifications of what they’re creating. Farid, who has spent the past 20 years developing forensic technology to identify digital forgeries, is currently working on new detection methods to counteract the spread of deep fakes.
8. One of Farid’s recent breakthroughs focuses on the subtle changes of color that occur in the face as blood is pumped in and out. The signal is so minute that the machine learning software is unable to pick it up – at least for now. As the threat of deep fakes intensifies, so do efforts to produce new detection methods. In June, researchers from the University at Albany (SUNY) published a paper outlining how fake videos could be identified by a lack of blinking in synthetic subjects. Facebook has also committed to developing machine learning models to detect deep fakes.
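The blink-rate cue can be sketched very simply: track a per-frame eye-openness score and flag clips whose blink count is implausibly low. The code below is my toy reconstruction of that idea, not the SUNY Albany authors' method; the openness signals are synthesised, whereas a real detector would compute an eye-aspect-ratio from facial landmarks.

```python
import numpy as np

def count_blinks(eye_openness, closed_thresh=0.2):
    """Count separate dips of the openness signal below the threshold."""
    closed = eye_openness < closed_thresh
    # A blink is one run of closed frames: count rising edges into runs.
    return int(np.sum(closed[1:] & ~closed[:-1]) + int(closed[0]))

fps, seconds = 25, 10
n = fps * seconds

real = np.ones(n)                    # a ~3-frame blink every 3 seconds
for start in range(50, n, 75):
    real[start:start + 3] = 0.05

fake = np.ones(n)                    # early deep fakes: eyes never close

print(count_blinks(real), count_blinks(fake))
```

A clip of several seconds with zero blinks is anomalous for a live human, which is exactly the statistical tell the paper describes; it also illustrates Farid's cat-and-mouse point, since later generators learned to synthesise blinking once the cue was published.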
9. Although Farid is locked in this technical cat-and-mouse game with deep fake creators, he is aware that the solution does not lie in new technology alone. “The problem isn’t just that deep fake technology is getting better,” he said. “It is that the social processes by which we collectively come to know things and hold them to be true or untrue are under threat.”