
Channel: Technology and society

Magda Skrzypek
Media development worker

Prague-based media development worker from Poland with a journalistic background. Previously worked on digital issues in Brussels. Piqs about digital issues, digital rights, data protection, new trends in journalism and other tech-focused topics.

piqer: Magda Skrzypek
Monday, 17 July 2017

New Information On Facebook Moderation Policies Demonstrates Racial Bias

After a series of reports on Facebook's moderation guidelines published by the Guardian, ProPublica follows suit. The report, based on a “trove of internal documents”, offers more detail on the complex standards that determine who is and who isn't protected from hate speech by the social media giant. The article describes how Facebook distinguishes between hate speech and legitimate political expression, exposing the rationale behind frequently inconsistent verdicts.

For example, we are presented with a training slide asking which group Facebook protects: female drivers, black children or white men. The correct answer is white men. Bizarre as it may seem, Facebook designates only some characteristics, including race, sex, gender and religious affiliation, as “protected categories”. Say something about the members of a protected category, such as men, and your post might be removed. Slurs against children, on the other hand, will prompt no action from the social network, as age is a non-protected category. This opaque logic gets even more dizzying when Facebook's users start combining different categories, creating “subsets” of protected categories.

"White men are considered a group because both traits are protected, while female drivers and black children, like radicalized Muslims, are subsets because one of their characteristics is not protected," ProPublica explains.

What strikes most is the blatant unfairness. As the article points out, “the documents suggest that, at least in some instances, the company’s hate-speech rules tend to favor elites and governments over grassroots activists and racial minorities.” By overlooking the “non-protected categories”, Facebook seems to fail to safeguard vulnerable groups, shielding instead those who are already in power.
