Why You Should Keep Your Kids Off Youtube

Trillions

Warning: Youtube is targeting young children with malicious videos designed to harm them.

The videos are created to look like popular children's cartoons but contain harmful content that only a human with common sense who is actually paying attention would notice. These videos play automatically unless one turns off Autoplay.

The proprietary algorithms developed by Google that put the most popular results at the top of an internet search made the company billions of dollars. The company is now doing the same with its automatic process for recommending new videos to watch on Youtube, and there is no Kid Safe feature. Children can automatically be shown videos that they, and no sane adult, should ever have to see.

Google’s approach to recommending what is essentially self-referential content in its search algorithms has been a subject of concern for some time. At the very beginning of its existence as a company, Google’s founders created what was a brilliant concept for search. Unlike its competitors, which at the time included long-vanished companies like Lycos and now-acquired ones like Yahoo!, Google did not use a curated approach for search, where recommended sites that might fit a given query had been at least partially inspected by human beings for their value and relevance. Google’s approach was different. It used computer ‘bots’ to scour the web for potential answers to a given search request, then ranked them based on the number of pages which referenced the specific sites that came up in the search. It was called “PageRank”. The algorithm has been tweaked and adjusted many times since, to optimize search speed, prevent gaming of the algorithm to artificially push results higher up the search ladder, eliminate duplicate entries, and more. Yet that fundamental concept remains the basic backbone of every Google search even now. And it stands alone as one of the most brilliant innovations of the internet age.
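The core idea described above can be sketched in a few lines of code: a page's score depends on how many pages link to it, weighted by those linking pages' own scores. This is only a simplified illustration of the published PageRank concept, not Google's actual production algorithm, and the tiny three-page link graph is invented for the example.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Simplified PageRank. links: dict mapping page -> list of pages it links to."""
    pages = list(links)
    # Start with an equal score for every page.
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        # Each page keeps a small base score (the "random jump" term) ...
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        # ... and distributes the rest of its score evenly to pages it links to.
        for page, outgoing in links.items():
            if not outgoing:
                continue
            share = damping * rank[page] / len(outgoing)
            for target in outgoing:
                new_rank[target] += share
        rank = new_rank
    return rank

# Hypothetical three-page web: A and C both link to B, so B ends up ranked highest.
graph = {"A": ["B"], "B": ["A", "C"], "C": ["B"]}
scores = pagerank(graph)
print(max(scores, key=scores.get))  # prints "B"
```

The repeated redistribution is why the ranking is self-referential: a page's importance is defined in terms of the importance of the pages pointing at it.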

When Google acquired Youtube, the video posting site was perhaps best known as a rapidly growing place where just about any kind of digital video could be uploaded and “just run” without anyone actually looking at it before it was made available to everyone. It appeared at first that Google did not know exactly what it planned to do with this acquisition. It improved the code behind it and helped the platform scale up to the increasing demands placed on it as smartphone users, corporations, and even amateur digital video makers began to load up more content there. It added search algorithms like those present on the regular search site, to help users find more of what they wanted to watch. It also added advertising, both in separate elements on the same page as the web-based videos and also embedded at the front of user-uploaded videos.

Then it did two other things. One was to share a tiny bit of its Youtube video ad revenue with some uploaders, which increased the number of videos uploaded. The second was to push recommendations to viewers without them even asking and to play those videos automatically by default.

The combination of those last two changes has produced a phenomenon some refer to as “going down the Youtube rabbit hole”. That refers to what happens when a user opens Youtube, often on a smartphone, and begins viewing. Views can start with something found in a search, or perhaps on the Youtube home page or trending page, or from subscriptions to video makers a user decides they like to watch on a regular basis. The next thing one knows, they have been watching content for an hour or more, often clicking on video after video without even realizing how much time they have wasted watching junk.

What Google didn't do was adequately police videos for harmful or illegal content. Anyone could upload anything, and if Google's limited algorithms didn't flag the content, it was made available to everyone.

This process has created an industry of carefully crafted video “clickbait”. That term was once used for links to websites which were often a major waste of time but were things people just couldn’t resist clicking. Titles like “10 Things You’re Doing Wrong at Work”, “Why These Celebrities Won’t Ever Get Hired Again”, and “Top 5 Diet Trends for Summer” pulled people in to websites with little value but a lot of page views because of their titles. Video equivalents include compilations of the “10 Greatest Guitar Riffs of All Time”, funny cat and dog compilations, and bloopers from or satires of major movie releases or television shows.

When Google suggests something for you to watch, the offerings are often eerily connected to things you are already interested in. For some reason it surprises many that it can do this, even though Google already knows so much about you anyway and is uniquely positioned to put together a suggested Youtube playlist with laser precision. It knows all your search history, including the most recent. It knows things you have been looking for that you might buy. It knows how old you are (by analysis), where you live, where you work, and even has a good sense of what your income might be. It also knows what you have read and looked at on the web, as well as what other videos you have watched in the past.

For adults, the impact of this is mostly increased watching of videos which are of increasingly less value. For children, the problem is far uglier.

Kids, who have far less ability to discern when to switch to something else or to stop watching altogether, are especially vulnerable to Google and Youtube video makers’ ability to target them. That is bad enough. Unlike adults, though, kids often just click on the very next thing in sequence, something that’s ranked high because it, for one reason or another, attracts the most viewers. More and more often, that content is increasingly violent, disgusting, or even satanic. And yes, this content is targeted directly at young children.

As reported in a frightening article published by the New York Times in November 2017, some of what is getting served up to children is something even an adult would consider horrifying. Examples cited there included:

• “PAW Patrol Babies Pretend to Die Suicide by Annabelle Hypnotized”. This is a rip-off of what is already a popular and acceptable animated series for children, “Paw Patrol”. In the “real” shorts, Paw Patrol is a Nickelodeon series about a group of rescue pups who save their town from minor mishaps in their lives, like lost elephants, missing kittens, or rock slides. It’s fun, it’s funny, maybe only occasionally educational but at least somewhat benign. This alternative short video has the same name in the title, making the regular viewers of the real show susceptible to clicking on it somewhat automatically. Except this one included one scene where several of the main characters were pictured in an out-of-control car which crashed into a pole and exploded in flames. Another scene showed a character who was hypnotized by a doll possessed by a demon. That character ended up walking off a roof and plummeting to the ground.

• Another “Paw Patrol” rip-off, courtesy of the account Subin TV, has scenes with the rescue pups in a strip club. Another part of the same video has one of the pups having her cartoon legs amputated and replaced with human legs.

• “Mickey Mouse Baby in Trouble”. In this one a fake Mickey Mouse character is shown in a pool of blood in the center of an intersection. Minnie Mouse is watching as he is bleeding out for all to see.

In a separate article on the same topic, more recently published by The Guardian, some of the more frightening videos noted there included:

• “Minnie Mouse Choked Pizza for Eating too Much”. This one again included the same fake Mickey Mouse-like characters as the one mentioned above, but with far uglier, deadlier, and more disgusting adventures.

• “PAW PATROL Babies Pretend to Die MONSTER HANDS From Mirror! Paw Patrol Animation Pups Save for Kids”. This shows the pups putting on a scary clip-art monster mask. Eventually several get lured away and die after a haunted doll gets control of them.

• A parody of the popular children’s cartoon series Peppa Pig, in which the same character goes to the dentist with scary results.

• Elle Mills, an actual live video child star on YouTube, has posted videos of herself in the middle of a real-life anxiety attack.

For this article it was disturbingly easy to find videos just like this with little effort. Examples include:

• “Mickey Mouse & Minnie Mouse Babies BLOODY Hand Funny Story”. This has as its target image a cartoon of Mickey smiling as he pulls off Minnie Mouse’s hand and blood spatters everywhere.

• “Peppa and the Murder Machine”. In this stop-motion film featuring a Peppa Pig lookalike, “Peppa pig meets Cecilia the Murder Machine”. This one had 1.4 million views.

• “Mickey Mouse Vampire Drink the Blood Reserve in the Hospital.” This one starts with a lecherous Goofy and Donald Duck leering at a bikinied Minnie Mouse by a swimming pool and goes downhill from there.

If all this sounds unbelievable, remember that there’s almost no possibility this content would automatically show up on an adult’s automatic Youtube playlist. That is both because adults have different interests and because children are increasingly connected to the lucrative and algorithm-driven Youtube Kids channel. That spinoff, which was created by Google in an attempt to attract younger viewers while in theory protecting them from more disturbing content, has become very popular. As of last year, it was attracting 11 million weekly viewers. With so many viewers, lots of new content has to be produced, and automated approaches to filtering and recommending videos had to be created. The unfortunate result is that many videos which no child should ever see make it through the algorithms.

And the problem is not just the noticeably harmful content. These videos may also contain powerful subliminal messages that can't be detected consciously but do influence the subconscious.

One popular way these video makers beat the limited Google algorithms is through naming, and even video imagery, which echoes the regular and acceptable alternatives. The videos which take advantage of Paw Patrol and Mickey Mouse references, such as those noted above, are typical of that approach. The live videos range from the unsettling ones of Elle Mills to far more disturbing ones which also somehow made it through the algorithm filters.

Many of these videos are also likely not intended for children at all. Some are even properly marked as “Not for Kids”, which probably gets them automatically rejected from Youtube Kids. Far too many, however, are both disgusting and clearly intended to connect with children. Impressionable young minds will begin watching, perhaps knowing they’re seeing something they really are not supposed to see. In the end they will end up traumatized and possibly plagued with nightmares for weeks after exposure to this kind of content.

Google may claim that it means no harm, but the fact that Google chooses to allow these sinister videos to get in front of children at all is a major black mark for the company. Google will protest that it is doing the best it can, tweak the algorithms occasionally, but often will shrug its shoulders and say there is not much more it can do. However, it would be very easy to police the limited amount of content for kids. Google just doesn't want to spend even a minuscule portion of its surplus of tens of billions of dollars doing so.

All the while it will pull in even more billions of dollars from those pushing such content, as advertising revenue piles up even higher. If a few children end up seeing the wrong thing, Google will say parents should just monitor more often.

For those who wonder how evil Google really is, consider Google's handling of its Brazilian social networking site Orkut, which was widely used for distributing child pornography for years. In 2005, the Brazilian government repeatedly asked Google for assistance in clamping down on those using the site for child porn, but Google refused, claiming that it would not turn over information on users or remove the content because the data was on U.S. servers and therefore Google was not bound by Brazilian law. It wasn't until 2008, when advertisers started boycotting Orkut and the Brazilian government threatened Google executives in Brazil with arrest and massive fines, that Google finally relented and started providing information on pedophiles, removed some of the illegal content, and installed filters making it more difficult for people to post illegal content.

Google did not comply because they cared about the children being abused and exploited but only because not doing so was going to cost them a significant amount of money. Google management consciously chose to put meager profit they didn't even need over the welfare of children. That is indeed evil.

The child exploitation agenda of Google continues even today by featuring videos on Youtube with child nudity and sexually explicit material with children. Youtube even recommends more such content to make it easier for pedophiles. One must have an account to view some of it, but anyone can easily get one.

Parents should never let young children have unsupervised access to the Internet and should not let kids watch Youtube or other streaming video sites unless the videos have first been carefully previewed by a competent adult.

Content on the official channels of production companies and broadcasters can usually be assumed to be safer than content posted by individuals. So, if your kid wants to watch "Paw Patrol" episodes, do so only from the Nick Jr. channel and not the channels that pirate and imitate Nickelodeon's content.


Parents should also simply do everything they can to pry smartphones, tablets, and mini-computers from their kids’ hands. Lock up the Internet modem when a parent is not home to supervise Internet access. Get kids outside in nature, connect them with other children to play with, read books to them, and – when they’re old enough – encourage them to read books on their own. And above all, have them stop spending endless hours watching Youtube videos. It’s a waste of time, potentially psychologically damaging because of the content described here, and a very bad type of experience to rely on for children’s development.

Image by armaeater, CC
