The Register Citizen (Torrington, CT)

Robots will make the best fake news

By Noah Smith

- Courtesy of Bloomberg View

Imagine that tomorrow, some smart kid invented a technology that let people or physical goods pass through walls, and posted instructions for how to build it cheaply from common household materials. How would the world change?

Lots of industries would probably become more productive. Being able to walk through walls instead of being forced to use doors would make it easier to navigate offices, move goods in and out of warehouses and accomplish any number of mundane tasks. That would give the economy a boost. But the negative might well outweigh the positive. Keeping valuables under lock and key would no longer work. Anyone could break into any warehouse, bank vault or house with relative ease. Most of the methods we use to keep private property secure rely on walls in some way, and these would instantly be made ineffective. Thieves and home invaders would run rampant until society could implement alternative ways of keeping out intruders. The result might be an economic crash and social chaos.

This demonstrates a general principle: technological innovations are not always good for humanity, at least in the short term. Technology can create negative externalities, an economics term for harm caused to third parties. When those externalities outweigh the usefulness of the technology itself, invention actually makes the world worse instead of better, at least for a while.

Machine learning, especially a variety known as deep learning, is arguably the hottest new technology on the planet. It gives computers the ability to do many tasks that until now only humans were able to perform: recognize images, drive cars, pick stocks and lots more. That has made some people worried that machine learning will make humans obsolete in the workplace. That's possible, but there's a potentially bigger danger from machine learning that so far isn't getting the attention it deserves. When machines can learn, they can be taught to lie.

Human beings can doctor images such as photographs, but it's laborious and difficult. And faking voices and video is beyond our capability. But soon, thanks to machine learning, it will probably be possible to easily and quickly create realistic forgeries of someone's face and make it seem as if they are speaking in their own voice. Already, lip-syncing technology can literally put words in a person's mouth. This is just the tip of the iceberg. Soon, 12-year-olds in their bedrooms will be able to create photorealistic, perfect-sounding fakes of politicians, business leaders, relatives and friends saying anything imaginable.

This lends itself to some pretty obvious abuses. Political hoaxes, so-called "fake news," will spread like wildfire. The hoaxes will be discovered in short order; no digital technology is so good that other digital technology can't detect the phony. But not before they put poisonous ideas into the minds of people primed to believe them. Imagine perfect-looking fake video of presidential candidates spouting racial slurs, or admitting to criminal acts.

But that's just the beginning. Imagine the potential for stock manipulation. Suppose someone releases a sham video of Tesla Chief Executive Officer Elon Musk admitting in private that Tesla's cars are unsafe. The video would be passed around the internet, and Tesla stock would crash. The stock would recover a short while later, once the forgery was revealed, but not before the manipulators had made their profits by short-selling Tesla shares.
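The short-selling arithmetic behind this scheme can be sketched in a few lines. All the numbers below are hypothetical, for illustration only; the column does not give any figures.

```python
# Sketch of short-sale profit: borrow shares and sell them at the
# current price, then buy them back after the price drops and return
# them. Profit is the gap between the sale and buyback prices.

def short_sale_profit(shares: int, sell_price: float, buyback_price: float) -> float:
    """Profit (or loss, if negative) from shorting `shares` shares."""
    return shares * (sell_price - buyback_price)

# Hypothetical scenario: short 10,000 shares at $300 before releasing
# the fake video, buy them back at $240 during the panic.
profit = short_sale_profit(10_000, 300.0, 240.0)
print(profit)  # 600000.0
```

If the hoax is exposed and the price rises instead of falling, the same formula turns negative, which is why the manipulator's timing matters.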

This is far from the most extreme scenario. Imagine a prankster creating a realistic fake video of President Donald Trump declaring that an attack on North Korean nuclear facilities was imminent, then putting the video where the North Koreans can see it. What are the chances that North Korea’s leadership would realize that it was a fraud before they were forced to decide whether to start a war?

Those who view these extreme scenarios as alarmist will rightfully point out that no fake will ever be undetectable. The same machine learning technology that creates forgeries will be used to detect them. But that doesn't mean we're safe from the brave new world of ubiquitous fakes. Once forgeries get so good that humans can't detect them, our trust in the veracity of our eyes and ears will forever vanish. Instead of trusting our own perceptions, we will be forced to place our trust in the algorithms used for fraud detection and verification. We evolved to trust our senses; switching to trust in machine intelligence instead will be a big jump for most people.

That could be bad news for the economy. Webs of trade and commerce rely on trust and communication. If machine learning releases an infinite blizzard of illusions into the public sphere, if the walls that evolution built to separate reality from fantasy break down, aggregate social trust could decrease, hurting global prosperity in the process.

For this reason, the government should probably take steps to penalize digital forgery pretty harshly. Unfortunately, the current administration seems unlikely to take that step, thanks to its love of partisan news. And governments like Russia's seem even less likely to curb the practice. Ultimately, the combination of bad government with powerful new technology represents a much bigger danger to human society than technology alone.

RINGO H.W. CHIU — THE ASSOCIATED PRESS: Elon Musk, CEO of Tesla Motors Inc., announces its new car Tesla "D" in Hawthorne, Calif.
