Civil rights groups: Facebook should protect, not censor, human rights issues
A coalition of more than 70 civil rights groups has written to Facebook demanding that the company clarify its policies for removing content and alleging that it has repeatedly removed posts documenting human rights violations.
In a letter addressed to CEO Mark Zuckerberg, the rights groups – including the ACLU, Center for Media Justice, SumOfUs and Color of Change – express deep concern over the censorship, particularly when posts are removed at the request of police.
When Facebook censors content that depicts police brutality, it sets a dangerous precedent for marginalized communities
“Your company is taking on an increasingly central role in controlling media that circulates through the public sphere. News is not just getting shared on Facebook: it’s getting broken there,” the letter said. “We are deeply concerned with the recent cases of Facebook censoring human rights documentation, particularly content that depicts police violence.”
The campaign groups referenced the deactivation of Korryn Gaines’ account during a standoff with police, the suspension of live footage from the Dakota Access pipeline protests, the removal of historic photographs such as “napalm girl”, the disabling of Palestinian journalists’ accounts and reports of Black Lives Matter activists’ content being removed.
“When the most vulnerable members of society turn to your platform to document and share experiences of injustice, Facebook is morally obligated to protect that speech,” said the letter.
“When Facebook unilaterally censors user content that depicts police brutality at the request of the authorities, it sets a dangerous precedent that further hurts and silences marginalized communities, particularly communities of color.”
The signatories point out that Facebook’s public image is one of inclusiveness and solidarity, illustrated by features such as the safety check-in and solidarity filters for profile pictures.
“However, Facebook’s repeated silencing of marginalized communities that attempt to make their stories and struggles known proves otherwise.”
“From Black Lives Matter in the United States, to journalists in Palestine, Facebook’s lack of transparency has resulted in reports of censorship on almost a weekly basis, which proves that this is not an individualized ‘glitch’ but a broader policy problem,” said Nicole Carty, a campaigner at SumOfUs.
The letter comes a week after Facebook announced that it will allow newsworthy content on its platform even if it might otherwise violate the company’s community standards. The change of heart comes in the wake of a lengthy dispute over a celebrated Vietnam war photo that violated Facebook’s rules prohibiting child nudity.
In early September, Facebook deleted The Terror of War (nicknamed “napalm girl”), a photo of a naked girl fleeing a napalm attack, from the page of writer Tom Egeland and suspended his account because child nudity is not allowed on the platform. When the Norwegian newspaper Aftenposten reported on Egeland’s battle with the social network using the same photo, it was censored, too. The newspaper ran a front-page letter criticizing Facebook and describing Zuckerberg as the “world’s most powerful editor”. The situation escalated to the prime minister, Erna Solberg, who said Facebook was “editing out history”.
Despite Facebook’s change in policy to adopt more of an editorial decision-making process, its executives insist it is not a media company. Speaking at WSJD Live last week, chief product officer Chris Cox was asked whether Facebook is a news company.
“If you look at how we’ve defined ourselves internally for 12 years now, it has been a technology company. A media company is about the stories that it tells. A technology company is about the tools that it builds,” he said.
However, he added, “we also realize that we’ve become a significant part of the way a lot of people get information” and “that comes with a huge responsibility” that Facebook takes “very seriously”.
“Facebook is walking in the direction of becoming a media company, and with that comes a level of responsibility,” said Chinyere Tutashinda from the Center for Media Justice. The need for transparency and accountability over the content it removes is even more pressing with the rise of Facebook’s live video streaming, she said.
“It’s not just a platform where people are getting news, but it’s increasingly a platform where people are documenting human rights injustices and breaking news.”
Tutashinda said that with both “napalm girl” and more recent cases of censorship, it comes down to Facebook’s policies. “It’s clear they don’t take into account historical documentation, civil rights documentation or human rights documentation,” she said.
A Facebook spokeswoman said: “We have received the letter and are reviewing it. As we recently said, we welcome feedback from our community as we begin allowing more items that people find newsworthy, significant, or important to the public interest.”