Editorial | YouTube needs to make deliberate changes in order to protect children


Image via Wikimedia Commons


By Pitt News Editorial Board

Uploading a YouTube video of two young girls playing outside in bathing suits might appear to be a harmless action, but recent reports on the video website’s algorithm suggest that doing so could unintentionally attract sexual predators.

The New York Times recently reported that YouTube's algorithm, which the site uses to personalize content recommendations for its users, was showing a video of girls playing outside in two-piece swimsuits to users who had already watched other videos of prepubescent, half-dressed children, often after those users had viewed sexually themed content. The result was a loop of videos that experts say sexualizes children.

Since the allegations arose, YouTube has made several half-hearted attempts to curb the algorithm's harm, but none of its changes adequately stops the algorithm from attracting sexual predators or from funneling innocent home videos into the feeds of users who watch sexually suggestive adult content. The streaming service needs to step up and make major changes in order to protect children and families.

YouTube banned comments on many videos of young children in February in an effort to block predators, according to The New York Times. After blocking the comments, YouTube said it would examine ways to improve the algorithm over the next few months.

“Recently, there have been some deeply concerning incidents regarding child safety on YouTube,” Susan Wojcicki, the company’s chief executive, wrote in a post on Twitter. “Nothing is more important to us than ensuring the safety of young people on the platform.”

But predators on YouTube are difficult to keep up with. Their actions are often subtle and therefore hard to detect. Commenters will post timestamps that direct others to sections of a video where a girl's backside or bare legs are visible. Others simply post suggestive emojis, according to a report by The New York Times.

This, along with YouTube's failure to address the recommendation system (the list of videos that plays on a continuous loop after the first video a user deliberately watches), has done little to solve the streaming service's problem. And even after being warned about the dangerous algorithm activity, YouTube made little effort to improve safety, according to The New York Times.

“When The Times alerted YouTube that its system was circulating family videos to people seemingly motivated by sexual interest in children, the company removed several but left up many others,” the Times reported.

This is concerning, as it shows that YouTube is not as committed to the safety of children as it claims to be. The algorithm doesn't just cater to those who are attracted to suggestive videos of children, either. Such videos can also show up in the feeds of users who have been watching any kind of sexual content on YouTube.

Though the recommendation system changed slightly after the algorithm tweaks, YouTube said that it was not the result of a deliberate policy change.

But a deliberate policy change is exactly what YouTube needs if it wants to protect children, families and people who have no intention of watching revealing videos of young children.
