In accordance with the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, Meta (formerly Facebook) has released its India monthly report for Instagram and Facebook for the period January 1 to January 31, 2022.
According to the report, Facebook took action against more than 19.3 Mn pieces of content, primarily for spam (13.8 Mn), violent and graphic content (2.1 Mn), adult nudity and sexual activity (1.5 Mn), child endangerment and sexual exploitation (796.8k), and suicide and self-injury (374.3k), among other violations. On average, 99.23% of these actions were taken proactively by the platform.
Meanwhile, Instagram took action against more than 2.4 Mn pieces of content, including for suicide and self-injury (891.9k), violent and graphic content (600.8k), adult nudity and sexual activity (461.9k), and bullying and harassment. The platform took proactive action around 96.52% of the time on average.
The ‘content actioned’ number refers to the number of pieces of content, including posts, photos, videos or comments, against which Meta has taken action for violating its community standards.
High proactive content action rates of 99.23% on Facebook and 96.52% on Instagram mean that most of the content found to violate standards was flagged by the platforms themselves, without the need for user reporting.
During the same period (January 1 to January 31, 2022), Meta received 531 reports through the Indian grievance mechanism. The company claims that it has responded to all of the grievances. The breakdown is as follows:
According to Meta, it provided tools for users to resolve the issue in 436 of the 531 total cases. These ‘tools’ include pre-established channels to report content for specific violations, self-remediation flows to download data, and avenues to address hacked-account issues, among others.
The remaining 95 reports, Meta says, needed specialised review. Of these, action was taken on 28 reports, while the company concluded after review that the other 67 reports did not require any action.