Videos showing beloved Disney characters in violent or sexual situations have been slipping past YouTube Kids' automated filters. The app has added a new setting to give parents more control over which videos their kids can see.
The option, called "approved content only," lets parents whitelist channels and subjects. The effort comes four months after reports called attention to troves of videos with inappropriate themes on the video-sharing site's kid-friendly platform. Content for YouTube Kids is selected from the main YouTube app and screened using machine learning algorithms.
Once the new setting is turned on, users can pick collections from trusted creators such as PBS and Kidz Bop, or themed collections curated by YouTube Kids itself. Later this year, YouTube will also launch a tool that lets parents choose every video or channel their kid can see in the app.
When a video for children is uploaded to the main YouTube platform, it is not automatically added to the YouTube Kids library. Machine learning algorithms review the videos to determine whether they are appropriate for the app.
A human doesn't check the videos before they're added, but parents can later flag videos they find alarming, and a content reviewer will check them out.
However, it's unlikely that parents are constantly watching YouTube Kids videos along with their children, so this safeguard may not be sufficient to catch every odd video a kid might see.
“While no system is perfect, we continue to fine-tune, rigorously test and improve our filters for this more open version of our app. And, as always, we encourage parents to block and flag videos for review that they don’t think should be in the YouTube Kids app,” the company said in a blog post announcing the new features.
PICTURE CREDIT: YouTube.com