
Meta, the parent company of Facebook and Instagram, recently released a report detailing its efforts to combat harmful content on its platforms in India during May. According to the report, approximately 25 million instances of problematic content were actioned, spanning categories such as spam, adult nudity and sexual activity, hate speech, and violence and incitement, among others.
Published under the IT (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, Meta's monthly report outlines the action taken on such content across 13 policies for Facebook and 12 policies for Instagram between May 1 and May 31.
According to the company, it received complaints about objectionable content through its pre-established channels, which cover specific violations and also let users resolve certain issues themselves, for example by downloading their data, or get help with hacked accounts. Meta added that it also took into account directives issued by the Grievance Appellate Committee (GAC) while addressing these complaints.
Reports of 'bad content' received for Facebook
Meta (formerly known as Facebook) said it received 16,995 reports for Facebook through the Indian grievance mechanism and responded to all of them. In 2,325 cases, the company provided users with tools to resolve their issues on their own.
Of the remaining 14,670 reports that required specialized review, Meta evaluated the content against its policies and took action on 2,299 reports found to be in violation. The other 12,371 reports were reviewed but may not have been actioned, for the reasons the company stated above.
Reports of 'bad content' received for Instagram
For Instagram, the company received 16,267 reports through the Indian grievance mechanism between May 1 and May 31 and responded to all of them. In 3,828 cases, it provided users with tools to resolve their issues on their own.
Of the remaining 12,439 reports that required specialized review, the company took action on 2,671 in line with its policies. The other 9,768 reports were reviewed but may not have been actioned. Under the IT Rules, 2021, social media platforms with more than 5 million users are required to publish monthly compliance reports.