In a recent disclosure, Meta, the social media giant, revealed that it received 47,538 reports through the Indian grievance mechanism for its platforms, Facebook and Instagram, in September. The company responded to all of these reports, in line with its obligations under the new IT Rules, 2021. The reports were not merely acknowledged but acted upon: for Facebook, out of the 33,422 reports received, the company provided tools for users to resolve their issues in 21,496 cases.
Meta's Proactive Approach and Specialized Review
These tools cover a range of solutions, including pre-established channels to report content for specific violations, self-remediation flows where users can download their data, and avenues to address hacked-account issues. The disclosure forms part of Meta's monthly compliance report under the new IT Rules, 2021. Not all reports could be resolved with these tools, however: 11,926 reports required specialized review. Meta reviewed this content against its policies and took action on 8,517 reports in total. The remaining 3,409 reports were reviewed but may not have been actioned, for reasons the company did not specify.
Actioned Content and Instagram's Scenario
When Meta refers to actioned content, it means a range of possible measures: removing the piece of content from Facebook or Instagram, placing a warning over photos or videos that may disturb some audiences, or disabling accounts. The action taken depends on the nature of the report and the violation involved. The picture was similar on Instagram. The company received 14,116 reports through the Indian grievance mechanism in September and responded to 100 per cent of them. Of these, it provided tools for users to resolve their issues in 7,219 cases, while the remaining 6,897 reports required specialized review. Meta reviewed the content as per its policies and took action on 3,965 reports in total; the remaining 2,932 reports were reviewed but may not have been actioned.
Interestingly, no order was received from the Grievance Appellate Committee (GAC) during September, which may point to the effectiveness of Meta's internal grievance redressal mechanism. The report reflects the company's continuing efforts to remove harmful content and to keep Facebook and Instagram safe and inclusive. Meta is not alone in handling such volumes of reports: under the IT Rules, 2021, other platforms such as Twitter and Google publish similar monthly compliance reports for India.
At the same time, the sheer volume of reports indicates the scale of the problem, and raises questions about how effectively such violations are prevented in the first place. In conclusion, Meta's disclosure of the number of reports it received and acted upon is a step towards transparency, showing its commitment to user safety and its willingness to comply with local laws. It also highlights the delicate balance social media platforms must strike in moderating content while keeping users safe; whether these measures are working will show in whether the number of such reports declines in the months ahead.