Last week, Microsoft responded with a firm “no” when a law enforcement agency in California asked for its latest facial recognition technology to be installed in police officers’ body cameras and vehicles. Microsoft President Brad Smith confirmed that the decision not to sell the Microsoft Face API to the agency was aligned with the company’s six ethical principles for developing and deploying facial recognition technology.
Microsoft’s Face API is a cognitive service in which algorithms detect, recognise, and analyse human faces in images. Face detection, face verification, and face grouping — clustering faces together based on their visual similarity — are some of the service’s key capabilities. The technology underpins applications in robotics, natural user interfaces, security, image content analysis and management, and mobile apps.
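For readers curious what calling such a service looks like in practice, below is a minimal sketch of how a face-detection request to the Face API’s REST endpoint might be composed. The endpoint hostname and subscription key are placeholders, not real values; the `/face/v1.0/detect` path and the `Ocp-Apim-Subscription-Key` header follow Microsoft’s documented REST conventions, but details may vary by service version.

```python
import json
import urllib.request

# Placeholder values -- substitute your own Azure resource endpoint and key.
ENDPOINT = "https://example-resource.cognitiveservices.azure.com"
KEY = "<subscription-key>"

def build_detect_request(image_url: str) -> urllib.request.Request:
    """Build (but do not send) a face-detection request for a hosted image."""
    body = json.dumps({"url": image_url}).encode("utf-8")
    return urllib.request.Request(
        url=f"{ENDPOINT}/face/v1.0/detect?returnFaceId=true",
        data=body,
        headers={
            "Content-Type": "application/json",
            "Ocp-Apim-Subscription-Key": KEY,
        },
        method="POST",
    )

req = build_detect_request("https://example.com/photo.jpg")
print(req.full_url)
```

Sending the request (e.g. with `urllib.request.urlopen`) would return a JSON array of detected faces; the verification and grouping capabilities mentioned above are exposed through sibling endpoints of the same API.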
Microsoft has recently made some big decisions about to whom it will provide its facial recognition technology. At a Stanford University conference on “human-centered artificial intelligence,” Smith spoke of another deal Microsoft rejected — to install facial recognition on cameras blanketing the capital city of a country he declined to name. Freedom House, a non-profit organisation focused on democracy, human rights, and political freedom, noted that the cameras would have suppressed freedom of assembly there.
The California law enforcement agency planned to use face scans to check people who were apprehended against a database of suspects, but its request for the technology was rejected over human rights concerns — specifically, the risk of gender and racial discrimination. According to Microsoft, the artificial intelligence in the software was trained mostly on photos of white men, and would therefore be biased against women and people from ethnic minorities.
On the flip side, Microsoft did agree to provide the facial recognition technology to an American prison, after determining that it would raise safety standards and would work well in the institution’s controlled environment.
Smith reiterated that Microsoft’s decisions were based on a strong commitment to human rights in an age of rapid technological advances — advances that could violate personal privacy, enable blanket surveillance, or power autonomous weapons, with effects that may be destructive and irrevocable.
In December 2018, Microsoft published an official blog post detailing the six principles that guide its facial recognition work, which we have listed verbatim here:
- Fairness. We will work to develop and deploy facial recognition technology in a manner that strives to treat all people fairly.
- Transparency. We will document and clearly communicate the capabilities and limitations of facial recognition technology.
- Accountability. We will encourage and help our customers to deploy facial recognition technology in a manner that ensures an appropriate level of human control for uses that may affect people in consequential ways.
- Non-discrimination. We will prohibit in our terms of service the use of facial recognition technology to engage in unlawful discrimination.
- Notice and consent. We will encourage private sector customers to provide notice and secure consent for the deployment of facial recognition technology.
- Lawful surveillance. We will advocate for safeguards for people’s democratic freedoms in law enforcement surveillance scenarios and will not deploy facial recognition technology in scenarios that we believe will put these freedoms at risk.
These principles have been in effect since last month and will continue to guide how Microsoft develops and deploys facial recognition technology services.
Michelle Bachelet, the United Nations High Commissioner for Human Rights, who also spoke at the Stanford conference, sent a strong message to tech companies: “Please embody the human rights approach when you are developing technology.”