In the past three years since its launch, YouTube’s Kids app, which is designed to surface child-friendly content, has come under fire numerous times for featuring videos that are inappropriate – and in some cases, downright disturbing – for young viewers.
Over the weekend, BuzzFeed News reported that the company is trying a new strategy to keep things kosher: it’s going to have humans – not algorithms – curate videos for the platform.
The app is slated to be updated with an option for parents to choose between handpicked videos and those suggested by YouTube’s algorithms. The feature should become available in the coming weeks.
That should help prevent your children from stumbling upon videos of beloved cartoon characters being tortured, or conspiracy theories about Freemasons sacrificing humans. While algorithms may be more efficient at generating endless playlists of clips to keep your kids occupied, they may not be able to distinguish between what’s safe to watch and what could scar a child for life; this is one thing humans are probably still better at than robots.
We’ve contacted YouTube to learn more and will update this post if there’s a response.