
Ex-Googler Harris on how tech ‘downgrades’ humans

‘Technology is shortening our attention spans’

- By Rachel Lerman

Tristan Harris wants to reverse the harmful effects he believes technology has had on all of us.

Harris, a former Google design ethicist, first rose to national prominence after a presentation he gave inside Google in 2013 spread throughout the industry. In it, he argued that many tech products were designed to be addictive, causing people to spend too much time on them and distracting them from living their lives. He urged designers to alter their approach.

Harris spent more than two years pushing change within Google, but says he couldn’t get traction. So he quit and started a movement called Time Well Spent, which eventually pushed companies such as Apple and Google to build screen time usage metrics and tools into their phones.

He has since widened his focus, having decided that many issues facing society today are actually connected and can be traced, at least partly, to the design of the technologies we use every day.

The goal of his organization, the Center for Humane Technology, is to reverse human “downgrading”, or the idea that technology is shortening our attention spans, pushing people toward more extreme views and making it harder to find common ground. In short: technology has caused humanity to worsen, and Harris wants to help fix it.

Harris recently spoke to the Associated Press about his work, the tech industry’s progress so far, and why all hope is not lost. This interview has been condensed and edited for clarity.

Question: Could you tell us the key ideas behind your work?

Answer: This isn’t about addiction; it’s not about time. It’s about what we call “human downgrading”. It’s a phrase we came up with to describe something we don’t think people are acknowledging as a connected system.

Technology is causing a set of seemingly disconnected things – shortening attention spans, polarization, the “outrageification” of culture, mass narcissism, election engineering, addiction to technology. These seem like separate problems, but we’re actually saying they are all predictable consequences of a race between technology companies to figure out how to scoop attention out of your brain.

Q: Where is the central place to fight this multifaceted problem that you’ve outlined?

A: It’s much like asking, “How do you solve climate change?” Do you just get people to turn off their light bulbs? No. Do you pass some policy? Yes. But is that enough? No. Do you have to work collaboratively with the oil companies to change what they’re doing? Yes. Do you have to pass laws and mandates and bans?

You have to do all these things. You have to have a mass cultural awareness. You have to have everybody wake up.

This is like the social climate change of culture. So part of it is internal advocacy: having people inside tech companies feel, frankly, guilty and ask, “What is my legacy in this thing that’s happening to society?”

We work on the internal advocacy. We work on public pressure and policy.

Q: How do you work with companies, and how are they taking to your vision?

A: Doing it from the inside didn’t do anything when the cultural catch-up wasn’t there. But now, in a world post-Cambridge Analytica, post the success of Time Well Spent, post more whistleblowers coming out and talking about the problem, we do have conversations with people on the inside who I think begrudgingly accept or respect this perspective.

I think there might be some frustration from some of the people at the YouTubes and Facebooks of the world, whose business models are completely at odds with the things we’re advocating for. But we’ve also gotten Facebook, Instagram, YouTube, Apple and Android to launch Time Well Spent features through some kind of advocacy with them.

Q: Is there a path that you try to help map out for these companies?

A: They’re not going to do it voluntarily. But with lots of outside pressure, shareholder activism and a public that realizes they’ve been lied to by the companies, that all starts to change.

There are multiple business models – subscription is one.

Would you pay $8 a month to a Facebook that didn’t have any interest in manipulating your brain, basically making you as vulnerable as possible to advertisers, who are their true customers? I think people might pay for that.

So our policy agenda is to make the current business model more expensive and to make the alternatives less expensive.

Q: Washington is now in a huge debate about privacy, data and misinformation. Will that process deal with the causes that you care about by default?

A: I actually worry that we’re mindlessly following the herd in treating privacy and data as the principal concerns, when the things that actually affect the felt sense of your life are where your time goes, where your attention goes, where democracy goes, where teen mental health goes, where outrage goes. Those things are so much more consequential to the outcomes of elections and to what culture looks like.

Those issues connected together have to be named as an impact area of technology. There has to be regulation that addresses that.

My concern about how the policy debate is going is that everyone is just angry at Big Tech. And that’s not actually productive, because it’s not just the bigness that is the problem. We have to name the business model as the problem.

Q: Don’t people have individual agency? Are we really in the thrall of tech companies and their software?

A: There’s this view that we should have more self-control or that people are responsible for whatever they see.

That hides an asymmetry of power. Like when you think, “I’m going to go to Facebook just to look at this one post from a friend,” and then you find yourself scrolling for two hours.

In that moment, Facebook wakes up a voodoo doll-like version of you in a supercomputer. The voodoo doll of you is based on all the clicks you’ve ever made, all the likes you’ve ever given, all the things you’ve ever watched. The idea is that as this becomes a better and more accurate model of you, the company knows you better than you know yourself.

We always borrow this from E.O. Wilson, the sociobiologist: the problem with humans is that we have Paleolithic brains, medieval institutions and godlike technology. Our medieval institutions can only stay in control of what’s happening at a slow clock rate of every four years. Our primitive brains are getting hijacked and are super primitive compared to godlike tech.

Q: Do you feel there’s awareness (within tech companies) that you wouldn’t have thought existed two years ago?

A: There has been a sea change. For four years, I watched as no one was really accepting or working on or addressing any of these issues. And then suddenly, in the last two years, all of that changed – because of the Cambridge Analytica scandal, because of “60 Minutes”, because of Roger McNamee’s book “Zucked”. I would never have suspected that Chris Hughes, the co-founder of Facebook, would be saying it’s time to break up Facebook.

I’ve seen an enormous amount of change in the last three years, and I can only bank on the fact that the clip at which things are starting to change is accelerating. I just want to give you hope: I would never have expected so much to start changing that is now changing. And we just need that pressure to continue. (AP)
