
Facebook ignores warnings on hate speech and harmful content

Between 2018 and 2020, three internal reports flagged polarizing content, fake messages, and hate speech on the social media platform Facebook in India

By Ground Report

Ground Report | New Delhi: Between 2018 and 2020, there were three internal reports on the operation of the social media platform Facebook in India, flagging everything from 'polarizing content' and 'fake and unsubstantiated' messages to an ever-growing volume of content that 'maligned' minority communities.

According to a report in The Indian Express, despite these alerts from review staff, then vice-president Chris Cox described these issues at an internal review meeting in 2019 as "comparatively less widespread" on the platform.

In January-February 2019, a few months before the Lok Sabha elections in the country, two reports were presented that flagged increasing 'hate speech' and 'disturbing content'. The third report, which came in late August 2020, stated that the platform's AI tools "failed to recognize local languages and, therefore, could not identify hate speech and problematic content".


Despite all this, the minutes of Cox's meeting note that "people generally feel safe on our platform, according to the survey" and that "experts have told us that the situation in the country is stable".

These findings come to the fore through documents shared with the United States Securities and Exchange Commission (SEC) and the US Congress by a lawyer for former Facebook employee and whistleblower Frances Haugen.

The documents submitted to the US Congress have been reviewed by a consortium of global news organizations that includes The Indian Express. This Facebook review meeting took place a month before the Election Commission announced the schedule for the Lok Sabha elections in India on March 10, 2019; polling began on April 11, 2019.

Facebook's first internal report, titled "Adversarial Harmful Networks: India Case Study", states that as much as 40 percent of sampled top viewport-view (VPV) postings on Facebook in West Bengal were fake or inauthentic.

VPV, or viewport views, is a Facebook metric that measures how often content is actually seen by users. Cox left Facebook in March 2019 and rejoined the company in June 2020 as Chief Product Officer.
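As a rough illustration (not Facebook's internal code, which is not public), the 40 percent figure can be understood as flagged viewport views divided by total sampled viewport views. The function name and counts in this minimal Python sketch are hypothetical:

```python
# Minimal sketch of a viewport-view (VPV) share calculation.
# The counts below are made up for illustration; Facebook's
# internal metric and sampling method are not public.

def vpv_share(flagged_vpv: int, total_vpv: int) -> float:
    """Percentage of sampled viewport views attributed to flagged content."""
    if total_vpv == 0:
        return 0.0
    return 100.0 * flagged_vpv / total_vpv

# Example: if 4,000 of 10,000 sampled viewport views traced back
# to fake or inauthentic posts, the share would be 40 percent.
print(f"{vpv_share(4_000, 10_000):.1f}%")  # prints: 40.0%
```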


According to the report, which tracked a test user's account, the 'Watch' and 'Live' tabs are the only surfaces with content when a user is not connected with friends. "The quality of this content…is not ideal," the employee reported, with the algorithm often suggesting "a bunch of softcore adult films" to the user.

Over the next two weeks, and especially after the February 14 Pulwama terror attack, the algorithm started suggesting groups and pages that mostly centered around politics and military content. The test user said they had seen "more images of dead people in the last 3 weeks than I have seen in my entire life".

Facebook told The Indian Express in October that it had invested significantly in technology to detect hate speech in various languages, including Hindi and Bengali. "As a result, we have halved the amount of hate speech people see this year. Today, it has dropped to 0.05 percent. Hate speech against marginalized groups, including Muslims, is on the rise globally. That's why we're improving enforcement and are committed to updating our policies as hate speech evolves online," a Facebook spokesperson was quoted as saying.
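For scale, under Facebook's prevalence metric a figure of 0.05 percent works out to roughly 5 views of hate speech for every 10,000 views of content on the platform.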

You can connect with Ground Report on Facebook, Twitter, Instagram, and WhatsApp. For suggestions and writeups, mail us at [email protected]