Facebook unveils appeal process for when it removes posts

Facebook said Tuesday it will give users the right to appeal decisions if the social network decides to remove photos, videos or written posts deemed to violate community standards.

Plans to roll out an appeals process globally in the coming months came as Facebook provided a first-ever look at the internal standards used to decide which posts go too far in terms of hateful or threatening speech.

"This is part of an effort to be more clear about where we draw the line on content," Facebook public policy manager in charge of content Siobhan Cummiskey told AFP.

"And for the first time we're giving you the right to appeal our decisions on individual posts so you can ask for a second opinion when you think we've made a mistake."

The move to involve Facebook users more on standards for removing content comes as the social network fends off criticism on an array of fronts, including handling of people's data, spreading "fake news," and whether politics has tinted content removal decisions.

California-based Facebook already lets people appeal removal of profiles or pages. The appeal process to be built up during the year ahead will extend that right to individual posts, according to Cummiskey.

The new appeal process will first focus on posts removed on the basis of nudity, sex, hate speech or graphic violence.

Notifications sent regarding removed posts will include buttons that can be clicked to trigger appeals, which will be reviewed by a member of the Facebook team. While software is used to help find content that violates standards at the social network, humans will handle appeals, and the goal is to complete reviews within a day.

"We believe giving people a voice in the process is another essential component of building a fair system," vice president of global product management Monika Bickert said.

"For the first time, we are publishing the internal implementation guidelines that our content reviewers use to make decisions about what's allowed on Facebook."

Some 7,500 content reviewers are part of a 15,000-person team at Facebook devoted to safety and security, according to Cummiskey, who said the team is expected to grow to 20,000 people by the end of this year.

"It's quite a tricky and complex thing drawing lines around what people can and cannot share on Facebook, which is why we consult experts," said Cummiskey, whose background includes work as a human rights attorney.

AFP