
I tried to childproof YouTube for a 3-year-old. It’s harder than it should be.

WATCH: YouTube is being accused of allowing violent and disturbing content to appear on its Kids channel – Nov 8, 2017

We share a house with a three-year-old whose real name isn’t Funz, though we call him that (it’s short for Fun Size, which isn’t his real name either).

For better or worse, Funz spends a certain amount of time in his high chair in front of an iPad so adults can get something else done, like making dinner.

Sometimes he watches Netflix, which has lots of high-quality kids’ programming, including all seasons of Paw Patrol (the current favourite).

And sometimes he watches YouTube, which also has lots of high-quality kids’ programming – Blippi, the wholesome but inexplicably named Little Baby Bum, whose version of Wheels on the Bus is at two billion views and counting, and, as far as I can tell, every episode of Mighty Machines ever produced.

So Funz and I make a decision about what he’s going to watch, which involves something like choosing between the Mighty Machines episode with the red tractor or the one with the yellow tractor.


And all that is fine, except that if I’m not paying close attention (bearing in mind that part of the point of plugging him into video in the first place is that an adult can focus on something else), the decision on the next video is going to be made by an amoral machine-learning algorithm.

WATCH: A British YouTube personality needed firefighters’ help after cementing his head inside a microwave as part of a stunt.

Because as we started to find out last fall, YouTube has a really unsavoury dark side — conspiracy theories about mass shootings, monetized child abuse, dank racial theories served up by autoplay.

You would think that the app YouTube Kids would be more of a walled garden, and in theory it’s supposed to be.

But videos are served up by algorithm there too, some produced by unscrupulous people who have learned to manipulate the algorithm.


In November, the New York Times interviewed an Indiana mother who found her three-year-old watching a now-removed video on YouTube Kids called ‘PAW Patrol Babies Pretend to Die Suicide by Annabelle Hypnotized’. “In the video Isaac watched, some characters died and one walked off a roof after being hypnotized by a likeness of a doll possessed by a demon,” the Times reported.

A Business Insider investigation found that “YouTube Kids featured many conspiracy theory videos which make claims that the world is flat, that the moon landing was faked, and that the planet is ruled by reptile-human hybrids.” After reporter James Cook watched a few conspiracy theory videos on YouTube Kids (in itself quite a concept), he found that he had inadvertently trained the algorithm to offer him more: the top video on the site, as he saw it, was pitching a conspiracy theory about aliens on the moon.

“Someone or something or some combination of people and things is using YouTube to systematically frighten, traumatize, and abuse children, automatically and at scale,” technology writer James Bridle wrote in a widely read Medium post. (Long read, worth your time.)

My own conclusions: certain YouTube channels are rock-solid safe places for small children, but the platform as a whole, driven by amoral machine-learning algorithms, isn’t safe at all. And the platform’s attempts to create safe spaces for kids haven’t been consistently successful.


The mechanism by which it becomes unsafe is autoplay — YouTube’s effort to serve up your next video based on an algorithm’s guess at what you might want to watch, inferred from what you watched in the past.
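To make that concrete, here is a deliberately toy sketch, in TypeScript, of an engagement-driven next-video picker. Every name in it is invented for illustration, and YouTube’s real recommender is vastly more complex, but the shape of the problem is visible: the objective rewards similarity to past viewing and expected watch time, and nothing in it asks whether the result is appropriate for a three-year-old.

```typescript
// Toy illustration only, not YouTube's real system. Every name here
// (Video, pickNext, expectedWatchTime) is invented for this sketch.
interface Video {
  id: string;
  tags: string[];            // e.g. ["paw patrol", "toddler"]
  expectedWatchTime: number; // predicted seconds of engagement
}

// Score each candidate by how much it resembles past viewing (tag
// overlap) times how long it's expected to hold attention. Note what's
// missing: nothing in this objective checks whether the video is
// appropriate, only whether it will keep the viewer watching.
function pickNext(history: Video[], candidates: Video[]): Video {
  const seen = new Set(history.flatMap(v => v.tags));
  const score = (v: Video) =>
    v.tags.filter(t => seen.has(t)).length * v.expectedWatchTime;
  // Assumes candidates is non-empty.
  return candidates.reduce((best, v) => (score(v) > score(best) ? v : best));
}
```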

In principle, stopping autoplay is easy — just turn off the toggle in the upper right of the screen, which will go from blue to grey.

Problem solved, right?

Not really. Autoplay stealthily turns itself back on every time you open a new tab. And there’s no way to make turning it off a default setting for the account.

(Many, many people have complained about this.)
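I don’t know how YouTube stores the toggle internally, but the behaviour is what you’d expect if the preference lived in per-tab state rather than in account settings. A minimal TypeScript sketch of why that scoping matters (the storage key below is made up, not YouTube’s):

```typescript
// Hypothetical illustration: sessionStorage is scoped to a single tab
// and starts empty in every new one, so a preference kept there resets
// exactly the way described above.
const KEY = "autoplayEnabled"; // invented key, not YouTube's

// Per-tab: a new tab has no value for KEY, so the site falls back
// to its default ("on") and the toggle appears re-enabled.
sessionStorage.setItem(KEY, "false");

// Per-browser: survives new tabs and restarts on this machine,
// but still not an account-level default that follows you around.
localStorage.setItem(KEY, "false");
```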

The other simple and obvious solution is to whitelist a few trusted channels.


But while YouTube is set up to easily block objectionable channels — once you encounter them, that is — there’s no way to restrict it to specific channels. (Subscribing to channels doesn’t restrict autoplay to those channels.)
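If you’re comfortable with a little code, though, the public YouTube Data API v3 makes a do-it-yourself whitelist possible: fetch videos only from the channels you trust and ignore everything else. Here is a rough TypeScript sketch; you would need your own API key, the helper names are mine, and the UC-to-UU uploads-playlist shortcut is worth verifying against the current API docs:

```typescript
// Sketch of a do-it-yourself whitelist using the YouTube Data API v3:
// fetch recent uploads from one trusted channel and nothing else.
// Requires an API key from the Google Cloud console.
const API_KEY = "YOUR_API_KEY"; // placeholder

// By convention, a channel's uploads playlist ID is its channel ID with
// the leading "UC" swapped for "UU"; the authoritative value comes from
// channels.list (part=contentDetails) as relatedPlaylists.uploads.
function uploadsPlaylistId(channelId: string): string {
  return "UU" + channelId.slice(2);
}

// Returns watch-page links for a channel's latest uploads: a whitelist
// by construction, with no recommendations attached.
async function latestUploads(channelId: string, max = 10): Promise<string[]> {
  const url =
    "https://www.googleapis.com/youtube/v3/playlistItems" +
    `?part=snippet&maxResults=${max}` +
    `&playlistId=${uploadsPlaylistId(channelId)}&key=${API_KEY}`;
  const res = await fetch(url);
  const data = await res.json();
  return data.items.map(
    (item: any) =>
      `https://www.youtube.com/watch?v=${item.snippet.resourceId.videoId}`
  );
}
```

Hand those links to a browser with autoplay turned off and the result is a player that can only show what you’ve approved in advance.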

I had better luck when I downloaded the YouTube app — which seems to remember that autoplay is turned off, once it’s turned off — and signed in with a new Google account I created for the purpose. (Security issues aside, though, I prefer to use YouTube from a browser for children — you can link to the videos you’ll want to find quickly and watch over and over again, while adults will want to be offered variety.)
