Here’s How Facebook Determines What Hate Speech Looks Like – Newsy

Posted on June 28, 2017

Facebook's content rules are under scrutiny again. A recent ProPublica report gives new insight into how Facebook decides whom to protect from hate speech on the platform.

The outlet obtained an internal presentation outlining Facebook's content rules. In essence, the policy protects some groups of people from harassment while leaving posts that target other groups alone.

According to its policy, Facebook can remove posts that attack groups based on race, sex, gender identity, sexual orientation, religion, national origin, ethnicity and serious disability or disease.


Subsets of protected groups defined by an unprotected characteristic are fair game for targeting. One slide illustrated the rule: a post attacking all white men should be taken down, while attacks on "women drivers" or "black children" are to be left up.
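
To make the reported rule concrete, here is a minimal Python sketch of the logic the slides describe: a group is protected only when every term describing it maps to a protected category, so adding an unprotected modifier (an occupation, an age group) strips the protection. The category set, term mapping, and function below are illustrative assumptions, not Facebook's actual code.

```python
# Hypothetical sketch of the moderation rule described in the leaked slides.
# The category set and term mapping are illustrative assumptions only.

PROTECTED_CATEGORIES = {
    "race", "sex", "gender identity", "sexual orientation",
    "religion", "national origin", "ethnicity", "serious disability/disease",
}

# Example mapping from descriptive terms to the category they invoke;
# unprotected terms (occupation, age) map to None.
TERM_CATEGORY = {
    "white": "race",
    "black": "race",
    "men": "sex",
    "women": "sex",
    "drivers": None,    # occupation: not a protected category
    "children": None,   # age: not a protected category
}

def is_protected(group_terms):
    """Return True only if every term maps to a protected category.

    Per the reported rule, any unprotected modifier creates an
    unprotected subset, so an attack on it would be left up.
    """
    return all(TERM_CATEGORY.get(t) in PROTECTED_CATEGORIES for t in group_terms)

# The slide's examples under this rule:
print(is_protected(["white", "men"]))      # True  -> attack removed
print(is_protected(["women", "drivers"]))  # False -> attack left up
print(is_protected(["black", "children"])) # False -> attack left up
```

Under this all-terms rule, a combination inherits the weakest protection of any of its parts, which is exactly the asymmetry critics flagged in the report.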

This is just the latest publication of Facebook's internal documents on content moderation, including how the company deals with tricky legal situations like Holocaust denial and online extremism.

Offensive content is increasingly putting Facebook and other social media giants at odds with governments around the world. A bill recently proposed in Germany would fine companies up to $53 million if they don't remove illegal content quickly enough.

The problem is only going to get more complex as Facebook's user base grows; the site recently passed 2 billion monthly users.

