Machines Helping to Address Violative Content
Machines are allowing us to flag content for review at scale, helping us remove millions of violative videos before they are ever viewed. And our investment in machine learning to help speed up removals is paying off across high-risk, low-volume areas (like violent extremism) and in high-volume areas (like spam).
Highlights from the report — reflecting data from October – December 2017 — show:
- We removed over 8 million videos from YouTube during these months.1 The majority of these videos were spam or attempts to upload adult content, and they represent a fraction of a percent of YouTube’s total views during this time period.2
- 6.7 million of these videos were first flagged for review by machines rather than humans.
- Of those 6.7 million videos, 76 percent were removed before they received a single view.
For example, at the beginning of 2017, 8 percent of the videos flagged and removed for violent extremism were taken down with fewer than 10 views.3 We introduced machine-learning flagging in June 2017; now more than half of the videos we remove for violent extremism have fewer than 10 views.