CSOs Call on Facebook to Provide Mechanism for Users to Appeal Content Restrictions

Mark Zuckerberg, CEO of Facebook

About 100 civil society organisations (CSOs) from across the globe, including Media Rights Agenda (MRA), have called on Facebook to provide a mechanism for all of its users to appeal content restrictions, and to have the appealed decision re-reviewed by a human moderator in all cases.

In an open letter to Mark Zuckerberg, CEO of Facebook, the CSOs drew his attention to incidents of misapplication of Facebook’s Community Standards against some of its users. While some affected users were able to get their content restored, the CSOs noted that: “For most users, content that Facebook removes is rarely restored and some users may be banned from the platform.”

They reminded Zuckerberg that when Facebook was launched, users who violated its rules and had their content removed or their accounts deactivated were sent a message telling them that the decision was final and could not be appealed.

Tracing the timeline of events, they noted that it was only in 2011, after years of advocacy from human rights organizations, that Facebook added a mechanism to appeal account deactivations, and only in 2018 that it initiated a process for remedying wrongful takedowns of certain types of content.

They pointed out to Zuckerberg that Facebook, with a stated mission of giving people the power to build community and bring the world closer together, and with more than two billion users and a wide variety of features, is the world’s premier communications platform, reminding him that he has the responsibility to prevent abuse and keep users safe.

They also reminded him that social media companies, including Facebook, have a responsibility to respect human rights, and that international and regional human rights bodies have made a number of specific recommendations for improvement, notably concerning the right to remedy.

They pointed out that years of research and documentation have shown that human content moderators, as well as machine learning algorithms, are prone to error, noting that even low error rates can result in millions of silenced users when operating at massive scale. Of concern to them is that Facebook users are only able to appeal content decisions in a limited set of circumstances, and that it is impossible for users to know how pervasive erroneous content takedowns are without increased transparency on Facebook’s part.

While acknowledging that Facebook can and does shape its Community Standards according to its values, they pointed out that it nevertheless has a responsibility to respect its users’ expression to the best of its ability.

Reminding the internet giant that civil society groups around the globe have criticized the way its Community Standards exhibit bias and are unevenly applied across different languages and cultural contexts, they noted that offering a remedy mechanism, as well as more transparency, will go a long way toward supporting user expression.

They cited the Santa Clara Principles on Transparency and Accountability in Content Moderation, which recommend a set of minimum standards for transparency and meaningful appeal consistent with the work of the UN Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression, David Kaye, who recently called for a “framework for the moderation of user-generated online content that puts human rights at the very center.” They added that it is also consistent with the UN Guiding Principles on Business and Human Rights, which articulate the human rights responsibilities of companies.

The CSOs asked Facebook to incorporate the Santa Clara Principles into its content moderation policies and practices.

In line with the Santa Clara Principles, the CSOs called on Facebook to provide notifications that explain to users why their content has been restricted, including the specific clause from the Community Standards that the content was found to violate; that are sufficiently detailed to allow the user to identify the specific content that was restricted; that include information about how the content was detected, evaluated, and removed; and that give individuals clear information about how to appeal the decision.

They also asked Facebook to provide users with a chance to appeal content moderation decisions, noting that appeals mechanisms should be easily accessible and easy to use; that appeals should be subject to review by a person or panel of persons not involved in the initial decision; and that appeals should result in a prompt determination and reply to the user, among other requirements.

They also called on Facebook to issue regular transparency reports on Community Standards enforcement, presenting complete data describing the categories of user content that are restricted (text, photo or video; violence, nudity, copyright violations, etc.), as well as the number of pieces of content that were restricted or removed in each category, among other details.