World

Facebook to launch independent body for calls on content


Facebook announced Thursday it is creating an independent body to make potentially precedent-setting calls on what content should be yanked from the social network.

The announcement came as Facebook reported it has ramped up its ability to quickly detect "hate speech" and other posts violating community rules, with the leading social network under pressure from regulators in various countries and activists to root out abusive and inappropriate content.

"I have come to believe that we shouldn't be making so many decisions about free expression and safety on our own," Facebook chief executive Mark Zuckerberg said in a media briefing.

Content flagged by artificial intelligence software or reported by users is reviewed by an internal system that Facebook has been ramping up.

An independent body to be constituted in the coming year will act as a "higher court" of sorts, considering appeals of content removal decisions made by the social network, Zuckerberg said.

How the appeals body will be composed, and how it will be kept independent while remaining in line with Facebook principles and policies, is to be determined in the coming year.

Facebook also planned to begin releasing content removal summaries quarterly next year, on a tempo matching its earnings reports, according to executives.

"We have made progress getting hate, bullying and terrorism off our network," Zuckerberg said.

"It's about finding the right balance between giving people a voice and keeping people safe."

Challenges faced by the California-based social network include the fact that people naturally tend to engage with more sensational content that, while perhaps at the edge of violating Facebook policies, is unhealthy for civilized discourse, according to Zuckerberg.

"We see this in cable news and tabloids too," Zuckerberg said.

"A lot of our work is to ensure that borderline content that comes close to violating our policies gets less attention, not more."

Bullying represents a tougher challenge to AI systems, because it tends to be personal and subjective. For example, someone might playfully mock a friend in a post that could also be interpreted to be mean.

Detecting bullying or hate can also require understanding of the gamut of languages used at Facebook, along with cultural contexts.

"We are getting better at proactively identifying violating content before anyone reports it, specifically for hate speech and violence and graphic content," Facebook said in the new transparency report.

"But, there are still areas where we have more work to do."

Facebook said that since its last transparency report, the amount of hate speech detected proactively, before anyone reported it, has more than doubled.

"The single biggest improvement comes from AI and machine learning," said product management vice president Guy Rosen.


AFP