
Meta Panel Flags Insufficient Moderation Of Non-English Content In India


Our decisions so far, which covered posts from India and Ethiopia, have raised concerns about whether Meta has invested sufficient resources in moderating content in languages other than English: Report

This is the maiden annual report presented by the Oversight Board, which oversees content moderation decisions on the tech giant’s platforms – Facebook and Instagram

Users across the globe submitted more than 1.15 Mn appeals between October 2020 and December 2021

Meta’s Oversight Board, in its first annual report, has flagged concerns related to insufficient moderation of non-English content in India.

The advisory group claims to be an independent body that oversees content moderation decisions on the tech giant’s platforms – Facebook and Instagram. 

“Our decisions so far, which covered posts from India and Ethiopia, have raised concerns about whether Meta has invested sufficient resources in moderating content in languages other than English,” the report said. 

It also noted that the actual number of content moderation issues was likely higher than the number of complaints received by the Oversight Board.

“If anything, we have reason to believe that users in Asia, Africa, and the Middle East experience more, not fewer, problems with Meta’s platforms than other parts of the world,” it added. 

In total, users across the globe submitted more than 1.15 Mn appeals between October 2020 and December 2021. A majority of them, nearly 80%, sought the restoration of content that had been taken down for supposedly violating the platforms’ rules on bullying, hate speech, or violence and incitement.

The report also listed two instances with regard to India. The first appeal concerned a 17-minute video shared by a user with a caption claiming that the Rashtriya Swayamsevak Sangh (RSS) and India’s ruling party, the Bharatiya Janata Party (BJP), were threatening to kill Sikhs, a minority religious group in India.

The post was taken down by a human moderator on the grounds that it allegedly violated the platform’s Dangerous Individuals and Organisations Community Standard. The user appealed the takedown before the Oversight Board, after which the content was restored.

The second case involved a post that depicted a man holding a sword, with text underneath describing the French president Emmanuel Macron as the ‘devil.’ The content related to the period when anti-Macron protests were organised across the globe over his statement that Islam was in ‘crisis,’ made in the aftermath of the beheading of a French teacher for allegedly showing pictures of the Prophet Muhammad.

Meta also refused to answer one question about the appeal related to the anti-France protests in India, namely, whether Meta had previously enforced the same community standard against the said user or group. In response, Meta said that the user’s previous behaviour on the platform was irrelevant to the Board’s determination.

In addition, the platform also refused to answer two questions regarding the appeal related to the BJP and the RSS. The first question asked what specific language in the content caused Meta to remove it under the cited community standard. Meta responded that it was unable to identify the specific language that led to the erroneous conclusion.

The second question sought to know how many ‘strikes’ the user had accrued and how many violations of the Dangerous Individuals and Organisations policy were required for account-level restrictions. To this, Meta said that the information was not reasonably required for decision-making.

In conclusion, the report said, “While social media has to a large degree fulfilled its early promise of bringing billions together, it has also created new ways for people to inflict harm on others.”

The advisory panel also plans to broaden its scope and secure greater access to data over the course of the current year. 

Earlier in November last year, a report quoting activists had said that Facebook was stalling the release of its human rights impact assessment on hate speech on the platform in India. Following that, many activist groups wrote to the platform calling for the immediate release of the said report.

In addition, the platform has been under scrutiny since September last year, after internal documents leaked by whistleblower Frances Haugen revealed that the platform was struggling to police problematic content.

The BJP has also been in the dock ever since Facebook India’s policy chief Ankhi Das resigned in October last year, after a media report surfaced alleging that she had interfered in Facebook’s content moderation policy to favour some leaders of the ruling party.
