Facebook's parent company Meta is clamping down on inflammatory content with a new set of rules, but is now willing to give persistent offenders numerous opportunities to see the error of their ways.

In its latest move, the social media company says it will explain why a user violated content policies up to seven times.

After the eighth offense, it will suspend the user's account, sending them to "Facebook jail," a term users have coined to describe being banned from the social media platform.

Meta announced the policy change on Thursday in response to feedback from its Oversight Board, which highlighted the platform's shortcomings last December.

It said the new policy would prevent over-penalizing people who have otherwise abided by Meta's content regulations, and would lead to faster and more impactful action against serious offenders.

"Under the new system, we will focus on helping people understand why we have removed their content, which is shown to be more effective at preventing re-offending, rather than so quickly restricting their ability to post," Monika Bickert, the vice president of content policy at Meta, wrote in a statement.

In cases of more serious violations, such as content involving terrorism or human trafficking, the account will face immediate action, including removal, Bickert said.

"The vast majority of people on our apps are well-intentioned. Historically, some of those people have ended up in 'Facebook jail' without understanding what they did wrong or whether they were impacted by a content enforcement mistake," Bickert wrote.

The company's earlier policies swiftly placed month-long blocks on people's accounts, even when their violations were minor or accidental.

While Meta's latest change gives well-intentioned users more room to use the platform, the company has previously struggled with lax enforcement.

In 2019, Brazilian footballer Neymar shared explicit images of a woman who had accused him of rape with his fanbase of many millions before Facebook took them down.

The same policies also allowed accounts to spread information later found to be false about political figures such as Hillary Clinton and Donald Trump.

The Oversight Board, appointed in 2020, found last December that Meta's cross-check program, which gives preferential treatment to VIP accounts, was structured to satisfy business interests and was deeply flawed.

The board made over 30 recommendations to Meta on enforcing fair and transparent policies.

The board also said earlier this month that it had changed its rules to allow expedited decisions, issued within two to 30 days, on content policy violations.

In response to the new policy change, the Oversight Board said it welcomed the move but added that there was room for further reforms, since this change addresses only less-serious violations.

Meta did not immediately return Fortune's request for comment.
