In September 2021, Meta commissioned BSR to review the human rights impacts of the company’s policies and activities during the May 2021 crisis in Israel and Palestine. Today, we are publishing the results of BSR’s analysis in Arabic, English, and Hebrew.
BSR would like to thank everyone who participated in this review and provided their valuable time, insights, and perspectives.
The discussions we held with affected stakeholders to inform this review brought home to us how many detailed decisions about social media policy, technology, and practice have fundamental impacts on the protection, realization, and fulfillment of human rights as a common standard of achievement for all peoples, especially in the context of highly complex social and historical dynamics. We hope this review informs actions that result in meaningful improvements in the daily lives of all people connected to Israel and Palestine.
The primary purpose of this human rights due diligence is to provide Meta with prioritized, action-oriented, decision-useful, and forward-looking recommendations for policies and practices. In doing so, it helps fulfill Meta’s commitments under its Corporate Human Rights Policy and its responsibilities under the United Nations Guiding Principles on Business and Human Rights (UNGPs).
Specifically, Principle 20 of the UNGPs states that companies should track the effectiveness of their response to human rights impacts by engaging with stakeholders, integrating findings into relevant processes, and driving continuous improvement. Principle 22 states that companies should provide for or cooperate in the remediation of adverse impacts, including seeking to guarantee non-repetition of prior harms.
This human rights due diligence also helps fulfill the recommendation of the Meta Oversight Board that Meta should engage an independent entity not associated with either side of the Israeli-Palestinian conflict to determine whether Meta’s content moderation in Arabic and Hebrew has been applied without bias.
BSR found that Meta took many appropriate actions during the May 2021 crisis, including establishing a special operations center and crisis response team, prioritizing risks of imminent offline harm, seeking an approach to content removal and visibility based on necessary and proportionate restrictions consistent with the International Covenant on Civil and Political Rights (ICCPR) Article 19(3), and overturning policy enforcement errors in response to user appeals. For this reason, some of BSR’s recommendations build upon important foundations for a human rights-based approach to content governance that have already been established by Meta.
However, BSR also identified a variety of adverse human rights impacts for Meta to address, including impacts on the rights of Palestinian users to freedom of expression and related rights, the prevalence of anti-Semitic content on Meta platforms, and instances of both over-enforcement (erroneously removed content and erroneous account penalties) and under-enforcement (failure to remove violating content and failure to apply penalties to offending accounts).
BSR did not identify intentional bias at Meta, but did identify various instances of unintentional bias where Meta policy and practice (such as insufficient routing of Arabic content by dialect or regional expertise), combined with broader external dynamics (such as efforts to comply with US law), lead to different human rights impacts on Palestinian and Arabic-speaking users.
BSR has made 21 recommendations to Meta to address these adverse human rights impacts and bias.
Four recommendations relate to content policy, such as reviewing Meta’s policies relating to content that praises or glorifies violence, and specific elements of Meta’s Dangerous Individuals and Organizations policy.
Four recommendations relate to transparency, such as increasing the breadth, specificity, and granularity of information provided to users about Meta’s content policy enforcement, for example when action is taken on their content or accounts.
Ten recommendations relate to operations, such as determining the market composition (e.g., headcount, language, location) needed for rapid response capacities, the routing of potentially violating Arabic content to reviewers by dialect and region, improving classifiers (algorithms that assist with content moderation by identifying and sorting content that may violate Meta’s content policies), developing mechanisms to track hate speech based on type, and enhancing content moderation quality control processes to prevent large-scale errors.
Finally, three recommendations relate to systems change extending beyond Meta alone, such as clarifying how counterterrorism law applies to the social media industry and supporting access to remedy.
The UNGPs lay out the expectation that Meta should avoid infringing on the human rights of others and should address adverse human rights impacts with which it is involved. In a conflict-affected context like Israel and Palestine, this includes understanding how ongoing conflict dynamics intersect with Meta’s platforms, how the online actions of a range of actors are possibly shaping offline events, and which groups are particularly vulnerable to adverse human rights impacts connected to Meta’s platforms given the conflict context.
We believe that BSR’s human rights due diligence will help Meta fulfill these expectations, and we look forward to Meta making progress in reviewing and implementing our recommendations. We also hope that the insights we’ve shared today can inform the efforts of policymakers and other social media companies addressing the complex challenges of content governance globally, especially in conflict-affected contexts.