Technology pioneer Google is facing backlash yet again for a racially offensive bug in one of its new image recognition applications.
The company recently introduced photo identification software that automatically labels the people and objects in pictures submitted by users. But the program's intelligence is unfortunately not without prejudice, as became evident when one user posted a screenshot of the app identifying two of his Black friends as “gorillas” in one of his photos.
Google Photos, y'all fucked up. My friend's not a gorilla. pic.twitter.com/SMkMCsNVX4
— #NYRN bih (@jackyalcine) June 29, 2015
As usual, a Google spokesperson quickly responded to the almost instant uproar over the incident, offering a public apology and reassuring everyone that it won’t happen again.
“We’re appalled and genuinely sorry that this happened,” Google spokeswoman Katie Watson said in a statement. “We are taking immediate action to prevent this type of result from appearing.”
Earlier this year, Google came under fire for a similarly offensive incident, when a search for “N**** House” on Google Maps directed users to the White House.
It’s certainly not uncommon for automated applications like these to make the occasional mistake, but the information these apps produce has to be input by a human at some point, right? With that in mind, Google should probably spend more time vetting its programmers before or during the creation process, so it can spend less time apologizing after the damage is already done.