A former Facebook manager wrote a post on the platform describing how the site censors black users and fails black employees. Then Facebook removed the post, telling Mark Luckie that it violated “community standards.”

Luckie recently stepped down from his job as strategic partner manager at Facebook. During his run at the company, Luckie says he found that Facebook was “failing its black employees and its black users.”

Black employees make up just 4% of the workforce at Facebook. According to Luckie’s original Facebook post, he and others were regularly “accosted by campus security,” as well as discriminated against in more subtle ways at meetings and in other professional venues.

Posts that black users make discussing racism or related issues are also regularly censored at the request of non-blacks, Luckie said.

“Black people are finding that their attempts to create ‘safe spaces’ on Facebook for conversation among themselves are being derailed by the platform itself,” Luckie wrote. “Non-black people are reporting what are meant to be positive efforts as hate speech, despite them often not violating Facebook’s terms of service.”

“Protected categories”

Shortly after Luckie’s post went live on November 27, he received a generic notice from Facebook that it had been removed.

“In an ironic twist, I am dealing with this,” Luckie told the Guardian. Facebook’s press team responded that “we are looking into what happened.” The post subsequently returned online.

ProPublica previously found that Facebook considers only certain people to be “protected categories” when deciding whether to delete a post that promotes hate or that targets a specific group.

For example, a post last year by a Black Lives Matter activist saying that “all white people are racist” was deleted by the company because white people are considered to be a protected category on Facebook, according to internal documents viewed by ProPublica.

But a lawmaker’s post calling for people to “hunt” and kill “radicalized” Muslims was not removed because the target of the post -- Muslims who were “radicalized” -- is not a protected category under Facebook’s rules.

While Muslims are protected as a religious group, “radicalized” Muslims are considered a “subset” category by Facebook, allowing for looser guidelines when it comes to questionable posts.
