YouTube had previously put a policy in place that saw such videos largely demonetized, but they could still be found in the YouTube Kids app, making them too easy for children to stumble upon. And given that YouTube Kids is used mostly by children, few viewers are likely to be in a position to flag such content themselves.
YouTube has announced a clampdown on videos that show popular children's TV characters in violent and frightening scenes, after it emerged that hundreds of disturbing videos were featured on the website. Numerous videos have been criticized for containing disturbing, sometimes violent or inappropriate content that is targeted at the platform's youngest viewers. "The YouTube team is made up of parents who are committed to improving our apps and getting this right", said Juniper Downs, YouTube's director of policy.
Since YouTube Kids launched in February 2015, the algorithmically driven app has been criticized for lacking controls to restrict kid-unfriendly videos, as well as for allowing commercially oriented content targeted at kids. No mention of a new policy from YouTube was made to Mashable during the reporting of our original piece in October.
Adidas Unveil 2018 FIFA World Cup Ball
Adidas have announced that the official match ball for the 2018 World Cup in Russia will be the "Telstar". A similar design, the Telstar Durlast, was used in Germany four years later, and it retains an iconic status among football fans.
Some of the videos appear to be created by bots that stuff keywords into uploads in an attempt to game YouTube, while others seem to be deliberately made by people looking to disturb the children who watch them. According to YouTube's policy for age-restricted content, moderators deciding whether videos should be blocked from YouTube Kids will evaluate vulgar language, violence and disturbing imagery, nudity and sexually suggestive content, and the portrayal of harmful or risky activities. An algorithmic filter first scans for inappropriate content; flagged content is then reviewed by what YouTube says is one of thousands of moderators working around the world. Confirmed content is age-restricted, and users can't see those videos unless they're logged in to accounts registered to users 18 years or older. No age-restricted content is allowed in the YouTube Kids app at all, and since uploaded videos don't appear in the app for a few days, this procedure should provide enough time for inappropriate content to be flagged.
YouTube is trying to walk a fine line between owning up to this problem and arguing that the issue is relatively minor. "If you find a video that you think should not be in the app, you can block it and flag it for review". And the company is willing to forgo additional ad revenue - and there is a lot of money flowing through this segment of the industry - if that's what it takes to ensure YouTube Kids feels like a safe experience for families.