Google's recognition algorithm still believes that some black people are gorillas. The issue dates back to 2015: in June of that year, Jacky Alciné published a tweet that went viral, showing how Google Photos had labeled photos of his friends as gorillas because they were black. As expected, Google responded to the accusation of racism by stating that the problem had been solved.
In fact, Google has not solved anything, and it does not seem to have an answer to one of the most troubling questions we have to ask ourselves: is the artificial intelligence racist, or are the programmers who develop it?
According to Wired, Google has not found a way to make its AI understand that a black person is not a gorilla. What the company has done instead is remove terms like "gorilla" from search results.

That is the only fix that Google, one of the leading companies in facial recognition and artificial intelligence, has found: simply erase from the algorithm's vocabulary the words it had been mislabeling people with.
Searches for "black man" or "black woman" showed similar problems: the application does not identify results of black men and women.
Wired tested whether Google's algorithm had changed by uploading hundreds of photos of animals. The results for "gorilla," "chimpanzee," and "monkey" had been blocked, while searches for other primates, such as "orangutan" or "baboon," worked fine.
Yonatan Zunger, a Google engineer at the time, said the company was already working on a fix for this type of failure. Three years later, there is still no solution.
"We are working on long-term solutions for linguistics and image recognition," Zunger commented in a series of tweets in June 2015.
According to Google, "image tagging technology is still in its infancy and is not perfect." Yet the same company that can teach a machine to recognize roads, obstacles, pedestrians, and signs of all kinds for its driverless cars is unable to fix a racist misclassification caused by visual similarity in a flat photograph.