Facebook apologizes after its shitty AI labeled a video of a Black man as “about Primates”

The company calls this an “unacceptable error” and says it’s working on a fix.


A Daily Mail video from 2020 has caused quite a stir over the last few days after revealing a major flaw in Facebook’s AI. The video featured a Black man as the main focus, and Facebook’s shitty AI asked people if they wanted to “keep seeing videos about Primates?”

According to a report from The New York Times, the video prompt was recently brought to Facebook’s attention, and the company has since apologized, calling it an “unacceptable error” and confirming that it is reviewing the recommendation feature to make sure it doesn’t happen again. Here’s what Facebook spokesperson Dani Lever told The New York Times:

“As we have said, while we have made improvements to our AI, we know it’s not perfect, and we have more progress to make. We apologize to anyone who may have seen these offensive recommendations.”

It is unclear exactly how long the prompt had been attached to this particular video, but former Facebook employee Darci Groves recently spotted the “horrifying and egregious” error and flagged it on a forum for current and former Facebook employees.

Artificial intelligence has come under increasing scrutiny, and this isn’t the first time we’ve seen unacceptable mistakes like this one on social media. Just a few months ago, Twitter abandoned its photo-cropping AI after it was found to be racially biased.

Companies are leaning on AI more every day, and something needs to be done about these disgusting issues. With the resources Facebook has at its disposal, there’s no justification for sickening mistakes like this one.

Have any thoughts on this? Let us know down below in the comments or carry the discussion over to our Twitter or Facebook.
