Facebook will expand its current harassment policies to further protect users from abuse and harmful content on the platform.
On Wednesday, the company announced it would ban content that degrades or sexualizes public figures, such as elected officials, celebrities, activists and journalists. This builds on existing company policies that protect private individuals in the same way.
Facebook said in its announcement that it would remove “serious sexual content” and certain other types of content used to sexually harass these public figures.
The company said, “Because what is ‘unwanted’ can be subjective, we will rely on additional context from the person being harassed to take action. We have made these changes because attacks like these can turn a public figure’s appearance into a weapon, which is unnecessary and often unrelated to the work these public figures represent.”
As part of its new policy, Facebook will also remove coordinated mass bullying and harassment from multiple users. These types of targeted harassment campaigns are used to attack government dissidents, the company said.
“We will also remove objectionable content considered mass harassment of anyone on personal surfaces, such as direct messages in the inbox or comments on personal profiles or posts,” Facebook said.
To combat these campaigns, the social media platform will remove state-linked and coordinated networks that use private groups to orchestrate mass posting on the profiles of government critics.
For example, Manal al-Sharif, a well-known activist who campaigned for women’s right to drive in Saudi Arabia, said in 2018 that she had to quit Twitter and Facebook because of the harassment she was facing from “pro-government mobs,” according to The Guardian.
Facebook has recently come under fire following an interview with whistleblower Frances Haugen and her testimony before Congress. In addition to Haugen’s testimony, reporting by The Wall Street Journal, based on a trove of leaked internal documents, suggested that Facebook withheld research into the negative effects of its platform on adolescent mental health.
The company said the research was taken out of context.
Concerns and allegations remain about the site’s inability or unwillingness to deal with disinformation.
Haugen testified that the company is fueling division among users by allowing disinformation on the platform to go unchecked.
She shared her view that Facebook’s algorithms could stoke tensions and fuel ethnic violence, particularly in Ethiopia, where the government and rebels in the Tigray region are engaged in a civil war.
Hundreds of thousands of people are facing starvation due to the conflict between the Ethiopian government and the rebels in Tigray. Zecharias Zelalem, a journalist covering the region and its conflict, recently told NPR that prominent posters on Facebook would publish unverified, often inflammatory posts or rhetoric that would then incite mob violence, ethnic clashes, and crackdowns on the independent press or outspoken voices.
“My fear is that without action, the divisive and extremist behavior that we see today is just the beginning,” Haugen told Congress. “What we saw in Myanmar and what we are now seeing in Ethiopia are just the first chapters of a story so terrifying that no one wants to read the end of it.”
Editor’s Note: Facebook is one of the financial backers of NPR.