Ben Mack and Elly Strang look at the consequences of the digital revolution
When the World Wide Web was first ushered into existence by its inventor Tim Berners-Lee in 1991, he and many others envisioned it as the dawn of a new era. For the first time in human history, people would be more connected than ever, while information would be free from corporate and government powers and democratically accessible to all. More than 20 years later, the world and everyone we know within it can be found at our fingertips. Of course, technology is much bigger than the internet, but this development has arguably been the most influential in terms of its impact on society in recent decades. And while there have been numerous positives, his vision for an egalitarian free-for-all hasn’t quite panned out the way he’d hoped. From the influence of algorithms run by companies that hold enormous power, to the increasing threat of cybercrime as more devices connect to the internet, to the mental health of tech entrepreneurs, to signs of digital addiction among the general populace, to the perils of cyberbullying, to the lack of diversity in tech,
Ben Mack and Elly Strang give the digital revolution a reality check.
For many years,
scientists were under the impression that humans could only develop an addiction to alcohol or drugs. This is understandable, as substance abuse is easy to identify: there are well-documented cases of people winding up sick, broke or dead.
However, recent research has revealed that other activities give people a hit of dopamine right in the pleasure centre that’s associated with addiction. Case in point: the ping of your phone when you get a notification lights up the same area of the brain as when you take cocaine.
Psychologist Adam Alter, author of Irresistible: The Rise of Addictive Technology
and the Business of Keeping Us Hooked, says addictive technology behaviours are so entrenched in society that we barely notice them, and yet most surveys are unanimous: a vast chunk of the population is hooked on their devices.
A study from the University of Hong Kong in 2014 estimated that 420 million people around the world are addicted to the internet. In 2017, this number is likely to have shot up even more.
Alter says the difference with technology addiction is there hasn’t been much oversight into the way it has immersed itself so heavily into our daily lives, or the ramifications of this.
“We focus so much on its obvious upsides, which have profoundly disrupted our lives,” Alter says. “Many of its downsides have crept up. For example, email began as a low-level way of communicating from time to time, but now workers in many cultures feel tethered to email 24 hours a day. That didn’t happen overnight, which is one reason why people haven’t paid as much attention to it as a downside.”
And although substance addiction is far more likely to kill you, Alter says both substance and behavioural addictions share many of the same traits.
“They influence the brain in similar ways (though more strongly for substances) and they both treat psychological needs that aren’t met otherwise, including boredom, anxiety, loneliness and depression,” he says.
Perhaps the most telling sign of this is that the late Steve Jobs, whose company created the iPad, didn’t actually allow his kids to use the device.
“We limit how much technology our kids use in the home,” he told The New York Times.
Alter says this is akin to a drug dealer’s ‘don’t get high on your own supply’ mentality.
“[Jobs] recognised that children and teens struggle to interact with other people, to do their homework, and to generally avoid using screens when those screens are in front of them. The iPad, with all its captivating content, is especially difficult to resist,” he says.
Humans are naturally inclined to crave this hit of pleasure – be it from another person, an illicit substance or an object. But the more ominous side to technology is the fact that on the other side of these devices and apps, humans are handcrafting their features to be as addictive as possible.
On Netflix, the next episode is lined up to play automatically unless you tell it to stop. The effort required to keep going is effectively zero, making it easier to binge-watch an entire season of Orange Is the New Black than to pull yourself out of the vortex.
On Snapchat, streaks appear depending on how many pictures two people have been sending back and forth to one another. If one person breaks the communication, the streak will end, encouraging addictive, frequent use of the app.
Alter says digital product designers intentionally build these rewards, such as likes, reposts, comments and shares, to be addictive. And in perhaps one of the most cleverly executed behavioural designs the world has seen, the trigger for this hit is not actually the technology itself, but other people. This means friends or followers are constantly prompting a person to continue using the service, and so the cycle continues.
“The possibility of these rewards is hard to resist in the same way that playing slot machines, with the promise of monetary rewards, is hard to resist,” he says. “They also create artificial goals – reach 1000 followers! Reach 100 likes! Conquer all 300 levels of this game! Which humans struggle to ignore once they exist.”
But it’s important to note technology isn’t all ominous, either. Just as alcohol can be enjoyed responsibly, so too can technology. And, when it comes down to it, Alter says it all depends on how it’s commoditised by companies and consumed by individuals.
Those feeling addicted or overwhelmed just need to actively monitor their behaviour and try to cut down, he says. He recommends picking a certain time of day, like dinner time, to stop using devices with screens.
“In my experience, people enjoy this sacred period of time to such an extent that they extend that brief tech-free period to cover weekends and nights more generally.”
2016 was arguably the year social media turned sour, harnessed by the forces of darkness to subvert democracy and free and open expression.
But we should have seen this coming – at least, if we had broken out of our “filter bubbles”, a phrase coined by Upworthy co-founder Eli Pariser to describe the way technology companies feed us information we are more likely to agree with. One of the great promises of the internet is that it allows us to learn about our world and engage with people who have different views. But, instead, we often use technology to create our own “safe spaces” and don’t make an effort to engage or expand our knowledge. In fact, studies show that those who use social media are more likely to be lonely.
Vaughn Davis, owner and creative director at advertising agency The Goat Farm and host of Sunday Social on RadioLIVE, says exposure to dissenting views and diverse perspectives is key for a healthy democracy – and something technology could actually assist with.
“A diverse media landscape is a good thing for everyone,” he says. “All points of view are considered, and power, whatever its flavour, is more likely to be held to account … One of the key shortcomings of digital and in particular social media is the degree to which it’s curated just for us,” he says. “Ten years ago, we would read a newspaper, or listen to a news bulletin, and hear stories we hadn’t sought out. At worst, irrelevant content might make us tune out or turn the page. More often, though, we’d be exposed to ideas, stories and opinions that didn’t align with our own. And that’s a good thing.”
Digital platforms such as Twitter and Facebook are anything but, he says. “Digital media is the opposite of that. We click on the stories we think we’ll find interesting, and that are unlikely to challenge our thinking. And the more we do that, the more the news site or social network we’re on learns about us, and the more closely the stories it serves reflect our preferences and prejudices. The narrower the lens we see the world through, the worse off we all are.”
This isn’t new, of course. Humans have always chosen media that matches their belief structure. But it’s just in overdrive now as the algorithms take over. The collection of data – such as likes and dislikes, or the types of stories clicked on or sites visited – is also a concern, Davis says, especially because corporate giants like Facebook and Google are ultimately businesses before anything else.
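The self-reinforcing loop Davis describes – clicks teaching the system, the system narrowing the feed – can be sketched as a toy model. Everything here (the class name, the weight update, the 1.5 multiplier) is purely illustrative and not how any real platform’s closely guarded algorithm actually works:

```python
from collections import defaultdict

class ToyFeed:
    """Toy illustration of a preference-reinforcing feed ranker."""
    def __init__(self):
        # Every topic starts with the same neutral interest score.
        self.weights = defaultdict(lambda: 1.0)

    def record_click(self, topic):
        # Each click nudges that topic's weight upward...
        self.weights[topic] *= 1.5

    def rank(self, stories):
        # ...so ranking increasingly favours what the user already reads.
        return sorted(stories, key=lambda s: self.weights[s["topic"]], reverse=True)

feed = ToyFeed()
stories = [
    {"title": "Party A rally", "topic": "party_a"},
    {"title": "Party B policy", "topic": "party_b"},
]
for _ in range(3):
    feed.record_click("party_a")  # the user keeps clicking one side

print([s["title"] for s in feed.rank(stories)])  # the Party A story now ranks first
```

Run for long enough, the loop means the reader who starts with a slight lean ends up seeing almost nothing from the other side – the filter bubble in a dozen lines.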
“The dominance of global media companies like these two makes this a very real issue,” he says. “Both are controlled not by editors, but by algorithms. And the way in which each of them decides to promote certain content and suppress other content is a closely guarded commercial secret. This isn’t a trivial issue. On the commercial front, Google was recently fined €2.4 billion for serving its own shopping service results ahead of its paying customers’. The editorial side is even more worrying. Facebook or Google could, for example, adjust their algorithms to make news about political party A appear more visible than stories about political party B. Suddenly, it would seem, everyone is talking about party A… they must have something going for them, right? But what if Facebook and Google decided to tweak the news to reflect not our own personal preferences, but someone else’s agenda?
“Now I’m not saying either company is doing this. There doesn’t seem to be much stopping them though. If a newspaper editor is free to flatter one candidate and criticise another, why not a social network? The challenge as readers is to know when it’s happening, and to understand that what we see online is anything but random.”