YouTube’s first quarterly ‘enforcement report’ reveals the website removed 8.3 million videos between October and December 2017 for breaching its community guidelines. The figure does not include videos removed for copyright or legal reasons.
In a blogpost, YouTube said the “majority of these 8 million videos were spam or people attempting to upload adult content.”
Sexually explicit videos attracted 9.1 million reports from users of the Google-owned video portal, while 4.7 million videos were reported for hateful or abusive content. Most complaints came from India, the US or Brazil.
YouTube said its algorithms had flagged 6.7 million videos that were then reviewed by human moderators and removed from the site. Of those, some 76 per cent had not received a single view on YouTube, other than by the moderators.
YouTube said the report “help(s) show the progress we’re making in removing violative content from our platform”.