Facebook users who recently watched a video from a British tabloid featuring black men received an automated prompt from the social media platform asking if they would prefer to “keep seeing videos about Primates,” prompting the company to investigate and disable the AI-powered feature that pushed the message.
On Friday, Facebook apologized for what it called “an unacceptable error” and said it was examining the recommendation feature to “prevent this from happening again.” The video, dated June 27, 2020, was posted by The Daily Mail and featured clips of black men in disputes with white civilians and police officers. It had no connection to monkeys or primates.
Facebook Calls the AI Prompt Unacceptable
Darci Groves, a former content design manager at Facebook, said a friend had recently sent her a screenshot of the prompt. She posted it to a product feedback forum for current and former Facebook employees. In response, a product manager for Facebook Watch, the company’s video service, called it “unacceptable” and said the company was “looking into the root cause.”
Ms. Groves called the prompt “horrifying and egregious.” Dani Lever, a Facebook spokeswoman, said in a statement: “As we have said, while we have made improvements to our A.I., we know it’s not perfect, and we have more progress to make. We apologize to anyone who may have seen these offensive recommendations.”
Google Marked Images of Black Men as Gorillas
Google, Amazon, and other tech companies have been under scrutiny for years over biases within their artificial intelligence systems, particularly around issues of race. Studies have shown that facial recognition technology is biased against people of color and has more trouble identifying them, leading to incidents in which black men have been discriminated against or arrested because of computer error.
For instance, in 2015, Google Photos mistakenly labeled images of black men as “gorillas,” for which Google said it was “genuinely sorry” and would work to fix the problem quickly. More than two years later, Wired found that Google’s solution was to block the word “gorilla” from searches, along with “chimp,” “chimpanzee,” and “primate.”
Facebook Cannot Just Apologize for Its Failures
Facebook has one of the world’s largest repositories of user-uploaded images on which to train its facial- and object-recognition algorithms. The company, which tailors content to users’ past browsing and viewing habits, sometimes asks people if they would like to keep seeing posts under related categories. It was unclear whether prompts like the “primates” one were widespread.
Ms. Groves, who left Facebook over the summer after four years, said in an interview that a series of missteps at the company suggested that dealing with racial problems was not a priority for its leaders. “Facebook can’t keep making these mistakes and then saying, ‘I’m sorry,’” she said.
Source: The Verge