An artificial intelligence tool used to identify people in law enforcement investigations, airport security and public housing surveillance disproportionately harms people of color and women, according to a new government watchdog report.

Facial recognition technology — which civil rights advocates and some lawmakers have criticized for privacy infringements and inaccuracy — is increasingly used among federal agencies with sparse oversight, the U.S. Commission on Civil Rights found.

“Unregulated use of facial recognition technology poses significant risks to civil rights, especially for marginalized groups who have historically borne the brunt of discriminatory practices,” Chair Rochelle Garza said. “As we work to develop AI policies, we must ensure that facial recognition technology is rigorously tested for fairness, and that any detected disparities across demographic groups are promptly addressed or suspend its use until the disparity has been addressed.”

Rapidly evolving facial recognition tools have been increasingly deployed by law enforcement, but there are no federal laws governing their use.

At least 18 federal agencies use facial recognition technology, according to the Government Accountability Office. In addition to federal deployment, the Justice Department since 2007 has awarded $4.2 million to local law enforcement agencies across the nation for programs that were used at least in part for facial recognition tools, public records show.

FBI's sprawling database deploys facial recognition software

The 184-page report released this month details how federal agencies have quietly deployed facial recognition technology across the U.S. and its potential civil rights infringements. The commission specifically examined the Justice Department, Department of Homeland Security, and Department of Housing and Urban Development.

“While a robust debate exists surrounding the benefits and risks associated with the federal use of FRT, many agencies already employ the use of this technology,” the report said, adding it can have serious consequences such as wrongful arrests, unwarranted surveillance and discrimination.

A facial recognition system uses biometric software to map a person’s facial features from a photo. The system then tries to match the face to a database of images to identify someone. The degree of accuracy depends on several factors, including the quality of the algorithm and of the images being used. Even in the highest performing algorithms, the commission said tests have shown that false matches are more likely for certain groups, including older adults, women and people of color.
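The matching step described above can be sketched as a similarity search over stored feature vectors. This is a minimal, illustrative toy: the names, the three-dimensional vectors and the threshold value are assumptions for demonstration, not details of any agency's system, and real deployments use neural-network embeddings with hundreds of dimensions.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def best_match(probe, gallery, threshold=0.9):
    """Return the closest gallery identity above the threshold, else None.

    Raising the threshold reduces false matches but increases missed
    matches; how that trade-off plays out across demographic groups is
    what accuracy audits examine.
    """
    best_id, best_score = None, threshold
    for identity, embedding in gallery.items():
        score = cosine_similarity(probe, embedding)
        if score > best_score:
            best_id, best_score = identity, score
    return best_id

# Toy 3-dimensional "embeddings" (hypothetical values).
gallery = {
    "person_a": [0.9, 0.1, 0.2],
    "person_b": [0.1, 0.8, 0.5],
}
probe = [0.88, 0.12, 0.22]
print(best_match(probe, gallery))  # prints "person_a"
```

Because the decision hinges on a single similarity threshold, a system whose scores run systematically lower for one group will produce more errors for that group unless the threshold is tested and tuned per population, which is the kind of disparity testing the commission recommends.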

The U.S. Marshals Service has used facial recognition tools for investigations into fugitives, missing children, major crimes and protective security missions, the commission report said, citing the Justice Department. The Marshals Service has held a contract with facial recognition software company Clearview AI for several years. Some members of Congress urged against use of Clearview AI products and other facial recognition systems in February 2022 due to potential civil rights violations, including threats to privacy.

The FBI’s use of facial recognition technology dates back to at least 2011. The Justice Department told commissioners the FBI can run facial recognition software on a wide range of images, including booking photos, driver’s licenses, public social media accounts, public websites, cell phones, images from security footage and photos maintained by other law enforcement agencies.

The U.S. Government Accountability Office has been probing the FBI’s use of facial recognition technology since 2016. In its report eight years ago, the office concluded the FBI “should better ensure privacy and accuracy.”

The Justice Department, which oversees the FBI and Marshals, announced an interim policy in December 2023 on facial recognition technology that said it should only be used for leads on an investigation, the report said. The commission added that there is not enough data on the department's use of FRT to confirm whether that policy is followed in practice.

The FBI declined to comment on the report when reached by USA TODAY. The Justice Department and U.S. Marshals Service did not return a request for comment.

AI tool used in border control, immigration probes

The Department of Homeland Security, which oversees immigration enforcement and airport security, has deployed facial recognition tools across several agencies, the commission found.

U.S. Immigration and Customs Enforcement has been conducting searches using facial recognition technology since 2008, when it contracted with a biometrics defense company, L-1 Identity Solutions, according to the report.

The contract allowed ICE to access the Rhode Island Division of Motor Vehicles' face recognition database to find undocumented immigrants who were charged with or convicted of crimes, the commission wrote, citing a 2022 study from the Georgetown Law Center on Privacy & Technology.

Facial recognition technology is also used at airports, seaports, and pedestrian lanes of the southwest and northern border entry points to verify people's identities. The report noted civil rights groups in 2023 reported that the U.S. Customs and Border Protection mobile app struggled to identify Black asylum seekers seeking to schedule an appointment. CBP said this year that its technology has an accuracy rate of more than 99% for people of different ethnicities, according to the commission's report.

Department of Homeland Security spokesperson Dana Gallagher told USA TODAY the department values the commission’s insights and said the DHS has been at the forefront of rigorous testing for bias.

The department opened a 24,000-square-foot lab in 2014 to test biometric systems, according to the report. Gallagher said the Maryland Test Facility, which the commission visited and documented, served as a “model for testing face recognition systems in real-world environments.”

“DHS is committed to protecting the privacy, civil rights, and civil liberties of all individuals we interact with in fulfillment of our mission to keep the homeland safe and the traveling public secure,” Gallagher said.

Public housing agencies deploy facial recognition tools

Some surveillance cameras in public housing contain facial recognition technology that has led to evictions over minor violations, the commission said, a practice lawmakers have raised concerns about since at least 2019.

The U.S. Department of Housing and Urban Development hasn't developed any of the technology itself, the report said, but it has issued grants to public housing agencies that used it to purchase cameras with the technology, subsequently “putting FRT in the hands of grantees with no regulation or oversight.”

Public housing tenants are disproportionately women and people of color, which means the technology use could amount to Title VI violations, the commission warned. In April 2023, HUD announced Emergency Safety and Security Grants could not be used to purchase the technology, but the report noted it didn't restrict recipients who already had the tool from using it.

The commission cited a May 2023 Washington Post investigation which found the cameras have been used to punish residents and catch them in minor violations to pursue evictions, such as smoking in the wrong area or removing a cart from a laundry room. Attorneys defending evicted tenants also reported an uptick in cases that cited surveillance footage as evidence to kick people out, the Post reported.

The Department of Housing and Urban Development didn't return USA TODAY's request for comment.

Civil rights group hopes report spurs policy changes

Tierra Bradford, senior program manager for justice reform at the Leadership Conference on Civil and Human Rights, told USA TODAY she was excited to see the report and is hoping it will lead to further action.

“I think that they’re lifting up a lot of concerns that us in the justice space have had for a while,” Bradford said.

The U.S. criminal justice system has a history of disproportionately targeting marginalized communities, she added, and facial recognition tools appeared to be another iteration of that problem.

“There should be moratoriums on technology that's shown to be really biased and have a disparate impact on communities.”

National debate over facial recognition tools

The commission's report comes after years of debate over the use of facial recognition tools in the public and private sectors.

The Detroit Police Department in June announced it would revise its policies on how it uses the technology to solve crimes as part of a federal settlement with a Black man who was wrongfully arrested for theft in 2020 based on facial recognition software.

The Federal Trade Commission last year banned Rite Aid from using AI facial recognition technology after finding it subjected customers, especially people of color and women, to unwarranted searches. The FTC said the system based its alerts on low-quality images, resulting in thousands of false matches, and customers were searched or kicked out of stores for crimes they did not commit.

In Texas, a man wrongfully arrested and jailed for nearly two weeks filed a lawsuit in January that blamed facial recognition software for misidentifying him as the suspect in a store robbery. Using low-quality surveillance footage of the crime, artificial intelligence software at a Sunglass Hut in Houston falsely identified Harvey Murphy Jr. as a suspect, which led to a warrant for his arrest, according to the lawsuit.

On a national level, members of the Commission on Civil Rights said they hope the report will inform lawmakers about the use of the rapidly evolving technology. The agency is pushing for a testing protocol that agencies can use to check how effective, equitable and accurate their software is. It also recommends that Congress provide a “statutory mechanism for legal redress” for people harmed by FRT.

"It is my hope that this bipartisan report will help inform public policy that will address the myriad of issues concerning artificial intelligence (AI) in general, but as it relates to this issue, facial recognition technology specifically,” Commissioner Stephen Gilchrist said. “Our country has a moral and legal obligation to ensure that the civil rights and civil liberties of all Americans are protected."
