In its third quarter, YouTube deleted more than 58 million videos and 224 million comments that violated its policies. The company, a unit of Alphabet, has faced backlash because some content on YouTube has proved extremist in nature and, in some cases, has reportedly incited violence.
YouTube has been pressured by government officials and interest groups in the United States, Asia and Europe to remove controversial content from its platform. The removed videos include extremist material, child pornography and spam.
The European Union has proposed that online services pay a fine if they fail to remove objectionable content within an hour of a government order.
Furthermore, a source from India’s Ministry of Home Affairs was quoted in a report saying that social media firms had agreed to remove problematic content within 36 hours of authorities’ requests.
To show that it is taking governments and interest groups seriously, YouTube has begun issuing quarterly reports on its enforcement efforts.
Automated detection tools help YouTube identify nudity, spam and content that could incite violence. However, these tools are far less effective against videos built on hateful rhetoric or dangerous behaviour; in those cases, YouTube relies on users to report problematic content. This also means a video may be viewed widely before it is deleted. To address the problem, Google has added thousands of moderators this year, bringing its total to more than 10,000, with the aim of reviewing user reports faster.
In September alone, YouTube deleted almost 10,400 videos for violent extremism and 279,000 videos for violating child-safety policies, according to its own data.
YouTube declined to comment on its growth plans for 2019, saying only that pre-screening every video would be impractical.
The company also revealed the number of YouTube accounts Google removed in Q3 2018. Accounts were removed either for accumulating three policy violations within 90 days or for a single egregious offence such as posting child pornography. On this basis, YouTube removed around 1.67 million channels, which together accounted for 50.2 million videos.
Almost 80% of the channels were deleted for spam, while 13% were taken down for nudity and 4.5% involved child pornography. Beyond these channel-level removals, around 7.8 million individual videos were removed for policy violations in the quarter.