Microsoft is pulling the plug on several AI-powered facial analysis tools, including one that it said could identify a subject’s emotion from pictures and videos. Experts have heavily criticised such “emotion recognition” tools, calling them unscientific for equating external emotional displays with internal feelings. They say facial expressions once thought to be universal actually differ across populations.
The decision to restrict public access to these tools forms part of a larger overhaul of the company’s policies on AI ethics. Microsoft’s updated Responsible AI Standard emphasises accountability over who uses its services and greater human oversight of where the tools are applied.
In practical terms, Microsoft will limit access to some features of its Azure Face facial recognition service while removing others entirely. Users who want to use Azure Face for facial identification will have to apply, telling Microsoft how and where they would deploy the systems. Some use cases with less harmful potential will remain open-access.
Apart from removing public access to the emotion recognition tool, the Washington-based tech giant will also retire Azure Face’s ability to infer age, gender, smile, facial hair, hair, and makeup.
Announcing the news, Microsoft’s Chief Responsible AI Officer Natasha Crampton wrote in a blog post: “Experts inside and outside the company have highlighted the lack of scientific consensus on the definition of ‘emotions,’ the challenges in how inferences generalize across use cases, regions, and demographics, and the heightened privacy concerns around this type of capability.”
Microsoft said it would stop offering the features to new customers from June 21, 2022, while existing customers would have their access revoked on June 30, 2023.
However, Microsoft will continue using the feature in at least one product — the Seeing AI app that uses machine vision to describe the world for visually impaired people.
Sarah Bird, Microsoft’s Principal Group Product Manager for Azure AI, said in a blog post that emotion recognition tools “can be valuable when used for a set of controlled accessibility scenarios.”
Microsoft will also introduce similar restrictions to the Custom Neural Voice feature, which allows customers to create AI voices based on recordings of real people.