Google has long offered an AI tool that recognises images and labels them accordingly. However, as Business Insider reports, Google has notified developers that the tool will now tag men and women simply as a person, rather than as male or female. The company removed the gender labels because a person's gender cannot reliably be inferred from their appearance in a photo; rather than risk tagging the wrong gender, labelling everyone as a person is the safer option.
On numerous occasions, computer vision technology has misidentified women as men and vice versa. Given that track record of facial recognition technology, Google has made a much-needed change. Such flaws have landed Google in hot water before: in 2015, a software engineer reported that Google Photos' image recognition model identified his Black friends as gorillas. Although Google changed the algorithm and blocked the model from recognising gorillas, it continued to use the technology to tag people.
Recently, Google has doubled down on its AI practices, trying every possible way to ensure it does not ship bias in its ML models. Bias in AI is already one of the most widely discussed topics in the field, and taking any risk with bias has only negatively impacted organisations.
However, the AI tool will continue to function as it has for years: it will keep identifying the other things in a photo and tagging them accordingly. A tech policy fellow at Mozilla with expertise in AI bias described the update as a very positive one.
Of late, Google has been calling for AI regulation so that organisations can leverage a common framework to build products on top of. The move to stop labelling genders is said to be in line with what Google and its top management have been advocating.