Vocable (Anglais)

Deep fakes are where truth goes to die

What are "deep fakes," and how can we protect ourselves from them?


At a time when "fake news" regularly makes the headlines, a new technology for manipulating audiovisual information has recently emerged. It has given rise to a practice known as the "deepfake," which consists notably in posting videos that make public figures say things they have never said. How does this technology work, and how can we guard against it?

1. In May, a video appeared on the internet of Donald Trump offering advice to the people of Belgium on the issue of climate change. “As you know, I had the balls to withdraw from the Paris climate agreement,” he said, looking directly into the camera, “and so should you.” The video was created by a Belgian political party, Socialistische Partij Anders, or sp.a, and posted on sp.a’s Twitter and Facebook. It provoked hundreds of comments, many expressing outrage that the American president would dare weigh in on Belgium’s climate policy.

1. offering ici, donner, prodiguer / advice (inv.) conseils / issue question / balls (vulg.) couilles / to withdraw, drew, drawn from se retirer de / agreement accord / outrage indignation, colère (ici, to express outrage s'indigner) / to dare oser / to weigh in on intervenir dans / policy politique (ligne d'action).

2. But this anger was misdirected. The speech, it was later revealed, was nothing more than a hi-tech forgery. Sp.a claimed that they had commissioned a production studio to use machine learning to produce what is known as a “deep fake” – a computer-generated replication of a person, in this case Trump, saying or doing things they have never said or done. Sp.a’s intention was to use the fake video to grab people’s attention, then redirect them to an online petition calling on the Belgian government to take more urgent climate action.

2. to be misdirected être mal dirigé / speech discours / forgery contrefaçon, faux / to claim déclarer / to commission charger (d'une mission) / machine learning apprentissage automatique / replication reproduction / to grab saisir; ici, attirer / to call on (sb) to do (sth) réclamer que qqn fasse qqch / to take, took, taken action prendre des mesures, agir.

WHAT'S A GAN?

3. Fake videos can now be created using a machine learning technique called a “generative adversarial network”, or a GAN. A graduate student, Ian Goodfellow, invented GANs in 2014 as a way to algorithmically generate new types of data out of existing data sets. For instance, a GAN can look at thousands of photos of Barack Obama, and then produce a new photo that approximates those photos without being an exact copy of any one of them, as if it has come up with an entirely new portrait of the former president not yet taken. GANs might also be used to generate new audio from existing audio, or new text from existing text – it is a multiuse technology.

3. generative adversarial network réseau antagoniste génératif / graduate student étudiant de deuxième/troisième cycle / out of à partir de / data set ensemble de données / for instance par exemple / to come, came, come up with élaborer, concevoir / former ancien, ex-.
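The adversarial idea behind a GAN can be sketched in a few lines: a generator tries to produce data that resembles the real distribution, while a discriminator tries to tell real from fake, and each one's mistakes drive the other's learning. The toy below is purely illustrative, not the systems described in this article: it uses 1-D numbers instead of images, and a single-parameter generator and discriminator with hand-derived gradients.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# "Real" data: samples from N(4, 1). The generator must learn to mimic them.
def sample_real(n):
    return rng.normal(4.0, 1.0, n)

# Generator g(z) = a*z + b, initialised far from the real distribution.
a, b = 1.0, 0.0
# Discriminator d(x) = sigmoid(w*x + c), its output is "probability x is real".
w, c = 0.1, 0.0

lr = 0.05
for step in range(3000):
    z = rng.normal()            # noise fed to the generator
    x_real = sample_real(1)[0]
    x_fake = a * z + b

    # --- Discriminator update: raise d(real), lower d(fake) ---
    d_real = sigmoid(w * x_real + c)
    d_fake = sigmoid(w * x_fake + c)
    # Gradients of the loss -log d(real) - log(1 - d(fake))
    gw = -(1 - d_real) * x_real + d_fake * x_fake
    gc = -(1 - d_real) + d_fake
    w -= lr * gw
    c -= lr * gc

    # --- Generator update: make d(fake) larger, i.e. fool the discriminator ---
    d_fake = sigmoid(w * x_fake + c)
    g_out = -(1 - d_fake) * w   # gradient of -log d(fake) w.r.t. generator output
    a -= lr * g_out * z
    b -= lr * g_out

# After training, generated samples should drift toward the real mean of 4.
fake_mean = float(np.mean(a * rng.normal(size=5000) + b))
print(fake_mean)
```

Real deep-fake GANs follow the same loop, but with deep convolutional networks in place of the two scalar functions and images in place of numbers.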

4. The use of this machine learning technique was mostly limited to the AI research community until late 2017, when a Reddit user who went by the moniker “Deepfakes” started posting digitally altered pornographic videos. He was building GANs using TensorFlow, Google’s free open source machine learning software, to superimpose celebrities’ faces on the bodies of women in pornographic movies. In response, Reddit banned them for violating the site’s content policy. By this stage, however, the creator of the videos had released FakeApp, an easy-to-use platform for making forged media. The free software effectively democratized the power of GANs. Suddenly, anyone with access to the internet and pictures of a person’s face could generate their own deep fake.

4. AI = artificial intelligence / late ici, fin / Reddit site Web communautaire de partage de liens favoris / to go, went, gone by the moniker se faire appeler; ici, utiliser comme pseudonyme (moniker surnom) / digitally numériquement / to alter modifier, retoucher / software logiciel(s), programme(s) informatique(s) / by this stage à ce stade/moment / to release sortir, lancer / own propre.

THREATS

5. When Danielle Citron, a professor of law at the University of Maryland, first became aware of the fake porn movies, she was initially struck by how viscerally they violated these women’s right to privacy. But once she started thinking about deep fakes, she realized that if they spread beyond the trolls on Reddit they could be even more dangerous. Citron, along with her colleague Bobby Chesney, began working on a report outlining the extent of the potential danger. As well as considering the threat to privacy and national security, both scholars became increasingly concerned that the proliferation of deep fakes could catastrophically erode trust between different factions of society in an already polarized political climate. In particular, they could foresee deep fakes being exploited by purveyors of “fake news”.

5. threat danger, menace / law ici, (de) droit / to become, became, become aware of découvrir, entendre parler de / to be struck être frappé / to spread, spread, spread se propager / troll internaute qui poste des messages délibérément polémiques / along with avec / to outline donner un aperçu de, exposer (à grands traits) / extent étendue, ampleur / scholar expert, spécialiste, universitaire / increasingly de plus en plus / concerned inquiet / trust foi, confiance / polarized divisé / to foresee, saw, seen entrevoir / purveyor colporteur.

6. Nonetheless, research into machine learning-powered synthetic media forges ahead. In August, an international team of researchers affiliated with Germany’s Max Planck Institute for Informatics unveiled a technique for producing what they called “deep video portraits”, a sort of facial ventriloquism, where one person can take control of another person’s face and make it say or do things at will. Christian Theobalt, a researcher involved in the study, told me via email that he imagines deep video portraits will be used most effectively for accurate dubbing in foreign films, and special effects.

6. nonetheless néanmoins / -powered ici, généré par / to forge ahead progresser / researcher chercheur / to unveil dévoiler, présenter / to be involved in participer à / accurate précis, exact / dubbing doublage / foreign étranger.

DETECTION METHODS

7. Hany Farid, professor of computer science at the University of California, Berkeley, believes that although the machine learning-powered breakthroughs in computer graphics are impressive, researchers should be more cognizant of the broader social and political ramifications of what they’re creating. Farid, who has spent the past 20 years developing forensic technology to identify digital forgeries, is currently working on new detection methods to counteract the spread of deep fakes.

8. One of Farid’s recent breakthroughs has been focusing on subtle changes of color that occur in the face as blood is pumped in and out. The signal is so minute that the machine learning software is unable to pick it up – at least for now. As the threat of deep fakes intensifies, so do efforts to produce new detection methods. In June, researchers from the University at Albany (SUNY) published a paper outlining how fake videos could be identified by a lack of blinking in synthetic subjects. Facebook has also committed to developing machine learning models to detect deep fakes.
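The blinking cue described above reduces to a simple idea: real people blink several times a minute, while early synthetic faces rarely did. The sketch below only illustrates that final decision step, under assumptions of my own: it takes a ready-made per-frame "eye open" sequence as input (the SUNY work used trained models to detect eye state from video, which is the hard part), and the blink-rate threshold is hypothetical.

```python
def blink_count(eye_open):
    """Count blinks in a per-frame eye-state sequence (True = eye open)."""
    blinks = 0
    for prev, cur in zip(eye_open, eye_open[1:]):
        if prev and not cur:   # an open -> closed transition starts a blink
            blinks += 1
    return blinks

def looks_synthetic(eye_open, fps=30, min_blinks_per_min=4):
    """Flag a clip whose blink rate falls below a (hypothetical) human minimum."""
    minutes = len(eye_open) / fps / 60
    return blink_count(eye_open) / minutes < min_blinks_per_min

# One minute of video at 30 fps in which the eyes never close:
no_blinks = [True] * 1800
# One minute with ten short blinks (170 open frames, then 10 closed, repeated):
ten_blinks = ([True] * 170 + [False] * 10) * 10

print(looks_synthetic(no_blinks))   # flagged: blink rate is zero
print(looks_synthetic(ten_blinks))  # not flagged: 10 blinks/minute
```

As the article notes, this is a moving target: once a tell like blinking is published, forgers can train their models to reproduce it, which is why detection work keeps shifting to subtler signals such as blood-flow color changes.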

9. Although Farid is locked in this technical cat-and-mouse game with deep fake creators, he is aware that the solution does not lie in new technology alone. “The problem isn’t just that deep fake technology is getting better,” he said. “It is that the social processes by which we collectively come to know things and hold them to be true or untrue are under threat.”

7. computer science informatique / breakthrough avancée, découverte, innovation / computer graphics infographie / cognizant of conscient de / broad large, vaste / forensic scientifique, de la criminalistique / currently actuellement / to counteract contrer, limiter.

8. to focus on se concentrer sur / to occur survenir / blood sang / minute infime, faible / unable incapable, dans l’impossibilité de / to pick up ici, capter / paper ici, article (scientifique) / lack absence / blinking clignement d'oeil / to commit to s'engager à.

9. to be locked in être pris dans / aware conscient / to lie, lay, lain se situer, résider / to hold, held, held ici, considérer / under threat menacé.

Fake videos [can] be identified by a lack of blinking in synthetic subjects.

(Istock)
