National Post

We’ve surrendered to the algorithm

- Christine Emba

“Something is wrong on the internet,” declares an essay trending in tech circles. But the issue isn’t Russian ads or Twitter harassers. It’s children’s videos.

The piece, by tech writer James Bridle, was published on the heels of a report from The New York Times that described disquieting problems with the popular YouTube Kids app. Parents have been handing their children an iPad to watch videos of Peppa Pig or Elsa from Frozen, only for the supposedly family-friendly platform to offer up some disturbing versions of the same. In clips camouflaged among more benign videos, Peppa drinks bleach instead of naming vegetables. Elsa might appear as a gore-covered zombie or even in a sexually compromising position with Spider-Man.

The phenomenon is alarming, to say the least, and YouTube has said that it’s in the process of implementing new filtering methods. But the source of the problem will remain. In fact, it’s the site’s most important tool — and increasingly, ours.

YouTube suggests search results and “up next” videos using proprietary algorithms: computer programs that, based on a particular set of guidelines and trained on vast sets of user data, determine what content to recommend or to hide from a particular user. They work well enough — the company claims that in the past 30 days, only 0.005 per cent of YouTube Kids videos have been flagged as inappropriate. But as these latest reports show, no piece of code is perfect.

Similar algorithms serve as the engine behind almost all of the most successful tech companies, powering everything from Facebook’s news feed to Google’s search results (Google, incidentally, is the parent company of YouTube). Naturally, these mysterious tools have become convenient scapegoats for many of the content problems we face today, from bizarre videos aimed at vulnerable children to misinformation in news feeds during the 2016 election.

Clearly, Silicon Valley has some work to do. But in addition to demanding more accountability from companies after their tools go awry, we should demand more responsibility from ourselves. We need to think about whether we want to reduce our own reliance on corporate algorithms, and if so, how.

As the internet has become an ever-larger part of our lives, we’ve come to rely on these proprietary bits of code as shortcuts for organizing the world. Algorithms sort through information and make decisions for us when we don’t have the capability (or perhaps just the energy) to do it ourselves. Need to distract the kids? Send ’em to the wildly educational world of YouTube. The app will pick out the safe videos — probably. The mechanism may be skewed by profit motives, biased by its data sets or just generally inscrutable, but is that any reason to give it up?

Why aren’t we more alarmed by this? Maybe because we’ve always used decision-making shortcuts, and they’ve always had flaws. How would we have chosen a children’s video before YouTube? Perhaps we’d act on a recommendation from a librarian, or a peer group, or even a National Legion of Decency list. These sources, too, were insular, subject to personal biases and limited in scope.

Still, there were meaningful differences between those old-school shortcuts and today’s machine-learning algorithms. The former had at least some oversight and regulation; it’s unlikely that a public library would lend out nursery rhyme snuff films. Shared community values made it clear which choices were being favoured, and why. And human judgment — today almost quaint — occasionally allowed for serendipity in a positive direction. One might come across a resource not carefully calibrated to agree only with one’s stated preferences, and be the better for it.

Is there any way to steer our current algorithmic regime in that more human direction? It’s not clear how. Some lawmakers have suggested that companies release their algorithms for public review; others propose regulating corporate algorithms. For now, the lesson for everyday users may just be an urgent need for increased awareness, a reminder that maybe we shouldn’t place all of our trust in a decision-making function that we don’t fully understand. Frightening children’s videos are, among other things, a wake-up call. If there’s something wrong on the internet, we should do more than just watch.