Madeline Purdue

USA TODAY

Published 2:17 PM EDT Jul 1, 2019

LOS ANGELES — A major supplier of body cameras to law-enforcement agencies across the country has decided to forgo selling facial-recognition technology with its products.

Axon, which supplies body cameras to 48 police departments in major cities, made the decision after the company's ethics board concluded the technology was not accurate enough to be deployed in the field and could cause major trust issues between law-enforcement officials and their communities.

In a 42-page report, the ethics board detailed concerns with the inaccuracy of the software, saying results showed it was less accurate when identifying women and younger people, and that accuracy “worsens when trying to identify people of color compared to white people, a troubling disparity that would only perpetuate or exacerbate the racial inequities that cut across the criminal justice system.”

The board is composed of experts from different professions, including law enforcement, robotics and policy, and advises Axon on the potential effects its technology could have on society.

“Some police departments are sophisticated and would understand the limitations on (facial recognition) and would put their own guardrails in place around the use of that technology by their officers. But there are some police departments that would not take those same precautions and may not appreciate the implications of using a technology that's not ready yet for prime time,” said board member Jim Bueermann, who served as a police officer for over 30 years and is now president of the National Police Foundation.

Axon CEO Rick Smith said the company is “moving cautiously” because of these implications.

“There are companies out there actively promoting using facial recognition on body cameras,” Smith said. “From our perspective, we believe taking the time to do the ethical analysis up front in the product design process will have much better outcomes in the long haul.”

Smith said the company might reconsider incorporating the technology in its products once the inaccuracies are resolved, but that it must balance that against privacy and safety and put “safeguards” in place to prevent misuse of facial recognition.

“We don't believe that the right answer is that the police should deploy face recognition with no controls and go wild with it, but similarly we don't believe that it makes sense to say police should never use face-recognition technology, because there's many cases where it's pretty universally known that it will be a good thing,” Smith said.

Facial recognition has been a hot-button topic around the country as concerns about privacy and accuracy have made some cities skeptical of the technology. San Francisco has banned it altogether, and cities such as Oakland and Berkeley are considering following suit. California may enact a statewide ban if Gov. Gavin Newsom signs the Body Camera Accountability Act this summer. The Somerville City Council in Massachusetts banned facial recognition just this week.

While these cities are turning their backs on the technology, others are welcoming it. The Orlando Police Department is testing Amazon's Rekognition software, although not in public or for investigative use. According to the New York Times, Detroit signed a $1 million deal to set up facial recognition in the city's surveillance cameras, with broad rules on how law enforcement can use the footage.

Bill Johnson, executive director of the National Association of Police Organizations, compares facial-recognition software to when law enforcement started using DNA testing as an investigative tool.

“There was a lot of concern then as well about ‘the government is going to have my DNA, they're going to know all about me, how is this going to be used?’ and I think today those fears have generally gone by the wayside,” Johnson said.