A journalist went undercover as a Facebook moderator and was encouraged to keep toxic content on the site

The full findings will be revealed in a new documentary titled “Inside Facebook: Secrets of the Social Network.”

Image: Reuters

A journalist for British broadcaster Channel 4 recently went undercover as a Facebook moderator. Unfortunately, it looks like the social network has a long way to go to remove toxic content from the site, if it even can, according to Business Insider.

During his time undercover, the Channel 4 journalist posed as an employee of CPL Resources, a Dublin, Ireland-based content-moderation contractor that has worked with Facebook since 2010. While at CPL, the journalist received the training that is supposed to teach new staff members Facebook’s community standards. From there, the reporter started reviewing content, including images of graphic violence, child abuse, and hate speech.

As Business Insider explains,

Moderators were given three options when reviewing material: ignore, delete, or mark as disturbing. Content marked as disturbing remains on Facebook but has restrictions on who is able to view it.

The reporter found instances in which images of child abuse, racism, and violence were allowed to remain on Facebook. In some cases, the findings also exposed wild inconsistencies between the way moderators were being trained and Facebook’s standards.

The content allowed to fester on Facebook’s servers included a video of a young boy being beaten by an adult and a racist meme of a girl being drowned.

When contacted by Channel 4 about some of its findings, Richard Allan, Facebook’s vice president of public policy, told Channel 4’s Krishnan Guru-Murthy that the video “should have been taken down.” Of the meme, Facebook said that the image did, in fact, violate its hate-speech policy and that it was “reviewing what went wrong to prevent it happening again.”

The undercover reporter also found instances in which hate speech was permitted. A comment aimed at Muslim immigrants that said “f**k off back to your own countries,” for example, was allowed to remain on the site. Had the comment been aimed solely at Muslims, rather than Muslim immigrants, it apparently would have been deleted.

“People are debating very sensitive issues on Facebook, including issues like immigration. And that debate can be entirely legitimate,” Allan said in response to the comment. When pressed about whether it constituted hate speech, he said it’s “right on that line.”

Whether the company profits from toxic speech remains open to debate.

Roger McNamee, an early Facebook investor who has become a critic, says Facebook stands to benefit from the extreme content.

He explains: “It’s the really extreme, really dangerous form of content that attracts the most highly engaged people on the platform. Facebook understood that it was desirable to have people spend more time on site if you’re going to have an advertising-based business.”

Allan said that isn’t true. “Shocking content does not make us more money — that’s just a misunderstanding of how the system works,” he said.

Regardless, Allan said the company has reviewed training materials at contractors like CPL and subsequently provided refresher training courses for moderators.

Everything Channel 4 uncovered about the social site is presented in the new documentary, “Inside Facebook: Secrets of the Social Network.”

I can’t see how a company the size of Facebook can remove all the bad content from its site. That would be like demanding that a large city eliminate all crime. It’s impossible to do, no?


Bryan considers himself a well-rounded techie, having written articles for MakeUseOf, KnowTechie, AppAdvice, and iDownload Blog. When he's not writing, he's being a single dad and rooting for his alma mater, Penn State, or cheering on the Patriots.
