YouTube pulled 8.3m videos in three months

In December Google said it was hiring 10,000 people in 2018 to address policy violations across its platforms

Among the total removed videos, about 6.7 million were flagged by machines only.

For videos containing "violent extremism", which is banned on the platform, only 8% of those flagged and removed in early 2017 were taken down before they had reached 10 views.

The first YouTube Community Guidelines Enforcement Report - a part of Google's Transparency Report site - covers October to December 2017 and will be released quarterly.

The video site also notes that leveraging machine learning requires more people to review content; YouTube has staffed full-time specialists with expertise in violent extremism, counterterrorism and human rights, and has expanded its regional expert teams. "Our investment in machine learning to help speed up removals is paying off across high-risk, low-volume areas (like violent extremism) and in high-volume areas (like spam)," the company said in a statement.

YouTube appeared to be quite chuffed about this, noting that 76 per cent of those videos were scraped off the site before they gained even a single view. In the last quarter of 2017, India topped the list of countries from which YouTube received the most human flags by total volume. People can select a reason when they flag a video, while YouTube uses the technology to automatically spot objectionable content. "When it's brought to our attention that a video or channel has been removed mistakenly, we act quickly to reinstate it," YouTube said at the time.

Flags from human detection can come from a user or a member of YouTube's Trusted Flagger programme, which includes individuals, NGOs and government agencies that are particularly effective at notifying YouTube of content that violates its Community Guidelines. Google now says it has nearly reached its hiring goal and has also taken on more full-time anti-abuse experts and expanded its regional teams.

Much of this is attributed to machine learning. It is also a victory for those, including free speech activists, who have called for social media platforms to be more transparent about how they handle flagged content and policy violations, and it may put more pressure on Facebook and Twitter.