
Facebook Apologizes After A.I. Puts ‘Primates’ Label on Video of Black Men

Facebook users who recently watched a video from a British tabloid featuring Black men saw an automated prompt from the social network asking if they would like to “keep seeing videos about Primates,” prompting the company to investigate and disable the artificial-intelligence-powered feature that pushed the message.

On Friday, Facebook apologized for what it called an “unacceptable mistake” and said it was examining the recommendation feature to “prevent this from happening again.”

The video, dated June 27, 2020, was posted by The Daily Mail and featured clips of Black men in altercations with white civilians and police officers. It had no connection to monkeys or primates.

Darci Groves, a former content design manager at Facebook, said a friend had recently sent her a screenshot of the prompt. She then posted it to a product feedback forum for current and former Facebook employees. In response, a product manager for Facebook Watch, the company’s video service, called it “unacceptable” and said the company was “looking into the root cause.”

Ms. Groves said the prompt was “terrible and egregious.”

Dani Lever, a Facebook spokeswoman, said in a statement: “As we have said, while we’ve made improvements to our A.I., we know it’s not perfect, and we have more progress to make. We apologize to anyone who may have seen these offensive recommendations.”

Google, Amazon and other tech companies have been under scrutiny for years over biases in their artificial intelligence systems, particularly around issues of race. Studies have shown that facial recognition technology is biased against people of color and has more trouble identifying them, which has led to incidents in which Black people were discriminated against or wrongfully arrested because of computer errors.
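The bias audits those studies describe usually come down to comparing a model’s error rate across demographic groups. Below is a minimal sketch of that idea in Python; the records, group names and numbers are invented purely for illustration, not drawn from any of the studies mentioned here.

```python
# Minimal sketch of a per-group error-rate audit for a classifier.
# The data and group labels are hypothetical; real audits use large,
# carefully constructed benchmarks.
from collections import defaultdict

def per_group_error_rates(records):
    """records: iterable of (group, predicted_label, true_label) tuples."""
    errors = defaultdict(int)
    totals = defaultdict(int)
    for group, predicted, actual in records:
        totals[group] += 1
        if predicted != actual:
            errors[group] += 1
    return {g: errors[g] / totals[g] for g in totals}

# Hypothetical results: a gap like this is what the studies flag.
sample = [
    ("group_a", "match", "match"), ("group_a", "match", "match"),
    ("group_a", "no_match", "match"),
    ("group_b", "no_match", "match"), ("group_b", "no_match", "match"),
    ("group_b", "match", "match"),
]
print(per_group_error_rates(sample))
# {'group_a': 0.33..., 'group_b': 0.66...} -> higher error rate for group_b
```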

In one example from 2015, Google Photos mistakenly labeled pictures of Black people as “gorillas,” for which Google said it was “truly sorry” and would fix the issue immediately. More than two years later, Wired found that Google’s solution was to censor the word “gorilla” from searches, while also blocking “chimp,” “chimpanzee” and “monkey.”
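What Wired describes is a post-hoc blocklist rather than a retrained model: sensitive labels are simply stripped from the classifier’s output before users see them. Here is a minimal sketch of that approach, assuming a generic classifier that returns (label, confidence) pairs; the function name and label set are hypothetical stand-ins, not Google’s actual code.

```python
# Minimal sketch of a post-hoc label blocklist, the kind of stopgap Wired
# described: suppress sensitive labels instead of fixing the model itself.
# BLOCKED_LABELS and the prediction format are assumptions for illustration.
BLOCKED_LABELS = {"gorilla", "chimp", "chimpanzee", "monkey"}

def filter_predictions(predictions, blocked=BLOCKED_LABELS):
    """Drop any (label, confidence) pair whose label is blocklisted."""
    return [(label, conf) for label, conf in predictions
            if label.lower() not in blocked]

# Example: raw model output before and after filtering.
raw = [("gorilla", 0.91), ("mammal", 0.88), ("outdoor", 0.75)]
print(filter_predictions(raw))  # [('mammal', 0.88), ('outdoor', 0.75)]
```

The tradeoff is visible in the article itself: a blocklist hides the offensive output, but it also removes those labels from legitimate photos of the animals, which is why it reads as a stopgap rather than a fix.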

Facebook has one of the world’s largest repositories of user-uploaded images, which it uses to train its facial- and object-recognition algorithms. The company, which tailors content to users based on their past browsing and viewing habits, sometimes asks people whether they would like to keep seeing posts under related categories. It was unclear whether messages like the “primates” one were widespread.
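The prompt Ms. Groves saw appears to be the product of that pipeline: a model tags a video with a topic, and the system offers the viewer more of it. A minimal sketch of such a flow, where the model stub, the prompt wording and the confidence threshold are all assumptions for illustration, might look like this; the incident suggests why surfacing a model’s top label verbatim is risky.

```python
# Minimal sketch of a "keep seeing" prompt driven by a topic classifier.
# predict_topic() is a stand-in for a real video-topic model, and the
# 0.8 threshold is an invented parameter, not Facebook's.
def predict_topic(video):
    """Stand-in for a real model; returns a (topic, confidence) pair."""
    return ("football", 0.93)  # canned output for the sketch

def keep_watching_prompt(video, threshold=0.8):
    """Surface a prompt only when the model is confident in its label."""
    topic, confidence = predict_topic(video)
    if confidence < threshold:
        return None  # too uncertain to show the label to a user
    return f"Keep seeing videos about {topic.title()}?"

print(keep_watching_prompt("clip.mp4"))  # Keep seeing videos about Football?
```

Note that a confidence threshold alone is no safeguard here: the model in this incident was presumably confident, just wrong, which is why Facebook disabled the feature outright while it investigated.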

Facebook and Instagram, its photo-sharing app, have struggled with other issues related to race. After the European soccer championship in July, for example, three Black members of England’s national team were racially abused on the social network for missing penalty kicks in the final.

Racial issues have also been a source of internal strife at Facebook. In 2016, chief executive Mark Zuckerberg urged employees to stop crossing out the phrase “Black Lives Matter” and replacing it with “All Lives Matter” in a communal space at the company’s headquarters in Menlo Park, California. Hundreds of employees also staged a virtual walkout last year to protest the company’s handling of a post by President Donald J. Trump about the killing of George Floyd in Minneapolis.

The company later hired a vice president of civil rights and released a civil rights audit. In its annual diversity report in July, Facebook said 4.4 percent of its U.S.-based employees were Black, up from 3.9 percent the year before.

Ms. Groves, who left Facebook after four years, said in an interview that a series of missteps at the company suggested that dealing with racial problems was not a priority for its leaders.

“Facebook can’t keep making these mistakes and then saying, ‘I’m sorry,'” she said.
