

The FBI uses facial recognition without complying with privacy laws.

By Tana Ganeva

In 2010, the FBI launched Next Generation Identification, a sprawling, complex program designed to use biometric tools like facial recognition, finger and palm prints, and iris scans in criminal investigations. At the time, privacy advocates worried that the FBI would collect and use the data without adequate oversight or privacy protections, especially given the rapid advances in facial recognition technology (FRT).

Last week, a hearing of the House Committee on Oversight and Government Reform found that privacy experts were right to be concerned: the FBI uses facial recognition without complying with privacy laws; the photos of one out of every two Americans sit in some kind of FRT database; and facial recognition technology can reproduce race and gender bias, “misidentifying female and African American individuals at a higher rate.”

Jennifer Lynch, staff attorney at the Electronic Frontier Foundation, testified about all the ways that police can use—and misuse—facial recognition.

“Law enforcement officers can use mobile devices to capture face recognition-ready photographs of people they stop on the street; surveillance cameras boast real-time face scanning and identification capabilities; and the FBI has access to hundreds of millions of face recognition images of law-abiding Americans,” Lynch testified. “This has led to the development of unproven, inaccurate systems that will impinge on constitutional rights and disproportionately impact people of color.”

“This has real-world impact; an inaccurate system will implicate people for crimes they didn’t commit, forcing them to try to prove their innocence and shifting the traditional burden of proof away from the government,” Lynch testified. “Face recognition misidentifies African Americans and ethnic minorities, young people, and women at higher rates than whites, older people, and men, respectively.”

Research suggests that several of the algorithms used in FRT searches are more likely to return an erroneous result when the suspect is black.

“If the suspect is African American rather than Caucasian, the system is more likely to erroneously fail to identify the right person, potentially causing innocent people to be bumped up the list—and possibly even investigated,” according to a statement by Alvaro Bedoya, executive director of the Center on Privacy & Technology at Georgetown Law.

“Perversely, due to disproportionately higher arrest rates among African Americans, face recognition may be least accurate for those it is most likely to affect: African Americans,” Bedoya said.


Tana Ganeva is a reporter covering criminal justice, drug policy and homelessness. Follow her on Twitter @TanaGaneva.
