IBM will no longer offer "general purpose IBM facial recognition or analysis software," company CEO Arvind Krishna said in a letter to Congress.

As US cities face growing protests and a brutal police crackdown, IBM said that it "firmly opposes and will not condone uses of any technology, including facial recognition technology offered by other vendors, for mass surveillance, racial profiling, violations of basic human rights and freedoms, or any purpose which is not consistent with our values and Principles of Trust and Transparency."

DCD has contacted the company for clarification on what "general purpose" means, and whether it will still sell custom solutions.

An important, if vague, step

"We believe now is the time to begin a national dialogue on whether and how facial recognition technology should be employed by domestic law enforcement agencies," Krishna, who took over the company in January, said.

"Artificial Intelligence is a powerful tool that can help law enforcement keep citizens safe. But vendors and users of Al systems have a shared responsibility to ensure that Al is tested for bias, particularity when used in law enforcement, and that such bias testing is audited and reported."

Facial recognition tools are increasingly used by law enforcement agencies around the world, often with mixed results. With the legal system often steeped in bias, data sets built from criminal databases can reflect that bias, and the algorithms trained on them inherit it. A December 2019 National Institute of Standards and Technology study found "empirical evidence for the existence of a wide range of accuracy across demographic differences in the majority of the current face recognition algorithms that were evaluated."
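To make concrete what "testing for bias" means here: a core metric in audits like NIST's is the false match rate, computed separately for each demographic group. The sketch below shows the shape of that calculation; the data, group labels, and threshold are entirely hypothetical, and real evaluations run millions of image pairs.

```python
from collections import defaultdict

# Hypothetical verification trials: each records the demographic group of
# the probe image, the similarity score the algorithm returned, and whether
# the two images really showed the same person.
trials = [
    {"group": "A", "score": 0.91, "same_person": True},
    {"group": "A", "score": 0.62, "same_person": False},
    {"group": "B", "score": 0.88, "same_person": True},
    {"group": "B", "score": 0.74, "same_person": False},
    # ...a real audit would use millions of pairs
]

THRESHOLD = 0.7  # illustrative decision threshold

def false_match_rates(trials, threshold):
    """False match rate per group: impostor pairs wrongly accepted as matches."""
    impostors = defaultdict(int)
    false_matches = defaultdict(int)
    for t in trials:
        if not t["same_person"]:            # impostor pair
            impostors[t["group"]] += 1
            if t["score"] >= threshold:     # wrongly accepted as a match
                false_matches[t["group"]] += 1
    return {g: false_matches[g] / impostors[g] for g in impostors}

print(false_match_rates(trials, THRESHOLD))
# Widely diverging rates across groups are the "wide range of accuracy
# across demographic differences" that NIST describes.
```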

Amazon discovered a similar problem when it tried to build an artificial intelligence system to help with recruitment: trained on its existing hiring database, the system ended up rating male candidates more highly than female ones.

Amazon has also faced complaints over its own facial recognition service, Rekognition, which is sold to law enforcement agencies across the US. In one study, the American Civil Liberties Union used Rekognition to compare photos of every member of Congress against a database of publicly available mugshots; the software incorrectly matched 28 of them. "Nearly 40 percent of Rekognition's false matches in our test were of people of color, even though they make up only 20 percent of Congress," ACLU attorney Jacob Snow said. "People of color are already disproportionately harmed by police practices, and it's easy to see how Rekognition could exacerbate that."
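The disparity Snow describes is a simple ratio: the share of false matches that fell on a group, compared against that group's share of the population being scanned. A minimal sketch, using counts reconstructed from the percentages quoted above rather than the ACLU's raw data:

```python
# Counts reconstructed from the quoted percentages, not the ACLU's raw data.
false_matches_total = 28   # members of Congress falsely matched to mugshots
false_matches_poc = 11     # the "nearly 40 percent" who were people of color
congress_total = 535
congress_poc = 107         # roughly 20 percent of Congress

share_of_errors = false_matches_poc / false_matches_total
share_of_population = congress_poc / congress_total

print(f"error share: {share_of_errors:.0%} vs population share: {share_of_population:.0%}")
# A ratio well above 1.0 means errors fall disproportionately on the group.
print(f"disproportionality: {share_of_errors / share_of_population:.1f}x")
```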

More alarmingly, OneZero reported earlier this year that Damien Patton, CEO of surveillance firm Banjo, had once helped a Ku Klux Klan leader shoot up a synagogue. Patton resigned after the story was published; Banjo provides services to police forces.

Similar systems are thought to be in use today, wielded against Black Lives Matter protesters across the US. Back in 2015, following the death of Freddie Gray in police custody, protests broke out in Baltimore. Police used Geofeedia, a social media monitoring service backed by CIA investment arm In-Q-Tel, together with facial recognition to identify and arrest protesters with outstanding warrants.

Over in the UK, the Metropolitan Police have trialed a facial recognition system numerous times in London. Between 2016 and 2018, more than 90 percent of the people it flagged as potential criminals were incorrectly identified.

A trial at Oxford Circus this March led to the arrest of seven people who had been incorrectly identified.