Google has announced that its computer vision algorithm will no longer tag photos with gender. According to an email sent to developers yesterday, the AI-powered tool will no longer apply gendered labels like “woman” or “man,” and will default to “person” instead.
The change applies to the Cloud Vision API, which developers can use to tag photos based on what the computer “sees” inside the frame. According to Business Insider, which obtained a copy of the email in question, the API has now been changed in order to avoid potential bias.
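For context, here's a minimal sketch of how a developer might call the label-detection endpoint using Google's official Python client library. The filename is hypothetical, and this assumes Google Cloud credentials are already configured in your environment:

```python
from google.cloud import vision

# Create a client for the Cloud Vision API
# (assumes GOOGLE_APPLICATION_CREDENTIALS is set in the environment).
client = vision.ImageAnnotatorClient()

# "portrait.jpg" is a hypothetical local image file.
with open("portrait.jpg", "rb") as f:
    image = vision.Image(content=f.read())

# Ask the API to label what it "sees" in the frame.
response = client.label_detection(image=image)

for label in response.label_annotations:
    # Following the change described above, a photo of a person should
    # now return a label like "Person" rather than "Man" or "Woman."
    print(label.description, round(label.score, 2))
```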
“Given that a person’s gender cannot be inferred by appearance,” reads the email, “we have decided to remove these labels in order to align with the Artificial Intelligence Principles at Google, specifically Principle #2: Avoid creating or reinforcing unfair bias.”
Testing the API out for ourselves reveals that the change has already taken effect:
The kind of bias Google is referring to stems from “flawed training data,” which leads the algorithm to make assumptions about gender based on appearance. Anybody who doesn’t fit within the algorithm’s trained binary of what a “man” or “woman” looks like will therefore automatically be misgendered by the AI. This is what Google is attempting to avoid.
The Artificial Intelligence Principle that Google mentions in the email specifically states that Google will try to “avoid unjust impacts on people,” especially as it relates to “sensitive characteristics” like race, ethnicity, gender, political beliefs, and sexual orientation, among others.
According to Business Insider, the decision has had predictably polarizing results. A policy fellow from Mozilla that they spoke with said the move was “very positive” and agreed with Google that a person’s gender cannot be inferred by appearance, while at least one affected developer pushed back, responding to the change by asserting that he doesn’t think “political correctness has room in APIs.”
(via The Verge)
Image credits: Header photo by Mitchell Luo, API test photo by Philip Martin, both CC0