Facebook reveals one in 1,000 content views on its platform include hate speech

For the first time, Facebook has disclosed numbers on the prevalence of hate speech on its platform.



Facebook, under scrutiny over its policing of abuses, particularly around November’s US presidential election, has released its quarterly content moderation report, disclosing hate speech prevalence figures for the first time.

The world’s largest social media company estimates that out of every 10,000 content views in the third quarter, 10 to 11 included hate speech.
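The reported figure of 10 to 11 hate-speech views per 10,000 content views is the basis for the headline's "one in 1,000" framing. A quick sketch of the arithmetic (the range endpoints are the only inputs taken from the report):

```python
# Facebook's reported Q3 2020 prevalence: 10 to 11 hate-speech
# views per 10,000 content views.
low, high = 10, 11
total_views = 10_000

# As a percentage of all content views.
pct_low = low / total_views * 100    # 0.10%
pct_high = high / total_views * 100  # 0.11%

# Expressed as "one in N views", which yields the headline's
# roughly one-in-1,000 figure.
one_in_upper = total_views / high  # about 909
one_in_lower = total_views / low   # exactly 1,000

print(f"{pct_low:.2f}%-{pct_high:.2f}% of views")
print(f"roughly one in {one_in_upper:.0f} to one in {one_in_lower:.0f} views")
```

In other words, the prevalence estimate works out to about 0.10-0.11% of all views, or roughly one in every 909 to 1,000.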

On a call with reporters, Facebook’s head of safety and integrity Guy Rosen said that from 1 March to the 3 November election, the company removed more than 265,000 pieces of content from Facebook and Instagram in the United States for violating its voter interference policies.
Facebook also said it took action on 22.1 million pieces of hate speech content in the third quarter, about 95% of which was proactively identified. It took action on 22.5 million in the previous quarter.

The company defines "taking action" as removing content, covering it with a warning, disabling accounts, or escalating it to external agencies.

Facebook’s photo-sharing site Instagram took action on 6.5 million pieces of hate speech content, up from 3.2 million in the second quarter. About 95% of this was proactively identified, up about 10 percentage points from the previous quarter.

This summer, civil rights groups organised a widespread Facebook advertising boycott to try to pressure social media companies to act against hate speech.
Mark Zuckerberg of Facebook testifies via videolink to the US Senate committee. Source: Getty Images
In October, Facebook said it was updating its hate speech policy to ban any content that denies or distorts the Holocaust, a turnaround from public comments Facebook’s Chief Executive Mark Zuckerberg had made about what should be allowed on the platform.

Facebook said it took action on 19.2 million pieces of violent and graphic content in the third quarter, up from 15 million in the second. On Instagram, it took action on 4.1 million pieces of violent and graphic content, up from 3.1 million in the second quarter. 

Mr Rosen said the company expected to have an independent audit of its content enforcement numbers “over the course of 2021”.
Earlier this week, Zuckerberg and Twitter Inc CEO Jack Dorsey were grilled by Congress on their companies’ content moderation practices, from Republican allegations of political bias to decisions about violent speech. 

Last week, Reuters reported that Zuckerberg told an all-staff meeting that former Trump White House adviser Steve Bannon had not violated enough of the company’s policies to justify suspension when he urged the beheading of two senior US officials.

The company has also been criticised in recent months for allowing rapidly-growing Facebook groups sharing false election claims and violent rhetoric to gain traction.


Published 20 November 2020 6:23am
Source: Reuters
