Microsoft is removing some features from its AI services to ensure that its facial recognition technology meets its ethical AI guidelines. The company plans to curtail certain facial recognition capabilities for new users starting this week and for existing users within a year.
Starting today, June 21, new customers will need to apply for access to use facial recognition operations in the Azure Face application programming interface (API), Computer Vision, and Video Indexer, according to a new blog post. Existing customers have a year to apply and gain approval for continued access to these services based on their use cases. Microsoft officials said they believe limiting access adds a layer of scrutiny to its facial recognition services that aligns with the company's Responsible AI Standard. As of June 30, 2023, existing customers won't be able to access facial recognition capabilities if their application has not been approved.
Some facial detection capabilities, such as detecting blur, exposure, glasses, head pose, landmarks, noise, occlusion, and the facial bounding box, will not require an application.
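For illustration, here is a minimal sketch of how a caller might request only those ungated detection attributes, assuming the Face API's v1.0 detect endpoint; the endpoint, key, and image URL below are placeholders, not values from Microsoft's announcement.

```python
import requests

# Placeholder values -- substitute your own Azure Face resource endpoint and key.
ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"
SUBSCRIPTION_KEY = "<your-face-api-key>"

def detect_faces(image_url: str):
    """Detect faces, requesting only attributes that remain generally available."""
    response = requests.post(
        f"{ENDPOINT}/face/v1.0/detect",
        params={
            "returnFaceLandmarks": "true",
            # blur, exposure, glasses, headPose, noise, occlusion stay open;
            # gated attributes such as age, gender, smile, or emotion are omitted.
            "returnFaceAttributes": "blur,exposure,glasses,headPose,noise,occlusion",
        },
        headers={
            "Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY,
            "Content-Type": "application/json",
        },
        json={"url": image_url},
    )
    response.raise_for_status()
    # Each result includes a faceRectangle (bounding box), landmarks, and the
    # requested faceAttributes.
    return response.json()

faces = detect_faces("https://example.com/photo.jpg")
for face in faces:
    print(face["faceRectangle"], face["faceAttributes"]["blur"])
```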
Microsoft officials also said they are retiring facial analysis capabilities designed to infer emotional states and attributes such as gender, age, smile, facial hair, and makeup. They said these kinds of capabilities raise privacy questions, lack agreed-upon definitions, and do not generalize reliably across use cases and demographics. Such features could more easily be misused, leading to stereotyping, discrimination, and unfair denial of services, officials said. These capabilities are no longer available to new customers as of June 21 this year and won't be available to existing customers after June 30, 2023. Microsoft will continue to make some of these technologies available through integration into certain services for people with disabilities, such as its Seeing AI app.
Microsoft is advising customers to use tools and resources such as the open-source Fairlearn package and Microsoft's Fairness Dashboard to understand how Azure Cognitive Services can be used fairly. Microsoft also has a new Recognition Quality API to flag potential problems with lighting, blur, and other image-quality issues that may affect certain demographic groups more than others, officials said.
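As a rough sketch of the kind of analysis Fairlearn supports, the snippet below compares a model's accuracy across demographic groups and reports the demographic parity gap; the labels, predictions, and group names are hypothetical, not drawn from any Microsoft service.

```python
from sklearn.metrics import accuracy_score
from fairlearn.metrics import MetricFrame, demographic_parity_difference

# Hypothetical data: binary predictions from a face-matching model, with a
# self-reported demographic attribute used as the sensitive feature.
y_true = [1, 0, 1, 1, 0, 1, 0, 1]
y_pred = [1, 0, 1, 0, 0, 1, 1, 1]
sensitive = ["group_a", "group_a", "group_a", "group_a",
             "group_b", "group_b", "group_b", "group_b"]

# Break accuracy down by group to surface disparities.
frame = MetricFrame(
    metrics=accuracy_score,
    y_true=y_true,
    y_pred=y_pred,
    sensitive_features=sensitive,
)
print(frame.by_group)      # accuracy per demographic group
print(frame.difference())  # largest gap between groups

# Demographic parity difference: gap in positive-prediction rates between groups.
print(demographic_parity_difference(y_true, y_pred, sensitive_features=sensitive))
```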