Tech, civil liberties leaders fight FBI biometric program
Silicon Valley companies are banding together with civil liberties and immigrant rights groups over concerns about the FBI's biometric database, which the agency wants to exempt from laws that allow individuals to review records the government holds about them.
The Justice Department's plan to exempt the FBI's high-tech biometric database from public review isn't sitting well with Silicon Valley or privacy advocates.
On Friday, a coalition of tech companies, civil liberties groups, and immigrant rights organizations sent an open letter to the Department of Justice questioning the agency's effort to exempt Next Generation Identification (NGI) – an FBI database used to match fingerprints and other biometric data with criminal records – from parts of a key law that gives people the right to review and correct government records concerning them.
The letter from advocacy groups such as the American Civil Liberties Union and companies such as Uber calls on DOJ to extend the 30-day comment period on the proposal, which ends June 6, by another 30 days.
"The public has the right to know about the FBI system that employs the most advanced surveillance technologies and runs a database holding records on millions of Americans, many of whom have never been accused of a crime," said Patrice McDermott, executive director of the advocacy group OpenTheGovernment.org, in a statement.
"The public has a right to the protections and redress afforded by the Privacy Act," she said, referring to a 1974 law that regulates how federal agencies can collect, use and distribute personal information.
The database will replace the FBI’s current Integrated Automated Fingerprint Identification System (IAFIS), used by federal, state, and local law enforcement to match fingerprints with criminal records. The new system will include an improved fingerprint search algorithm, information on suspected terrorists, a national database of mugshots that users can search using facial recognition, and a searchable registry of identifying marks such as tattoos and scars.
According to the FBI's privacy notice, law enforcement will only be able to use the biometric data in NGI for “investigative leads” rather than to positively identify a suspect. The agency also says its algorithms do not consider factors such as skin color, age, or gender.
“In only approximately 12 percent of requests is a candidate sent back, and then only as an ‘investigative lead,' not positive identification,” a DOJ and FBI official said in an email. “The ‘investigative lead’ requires additional investigation to determine whether the candidate is indeed the person being sought.”
Much of NGI’s data will come from arrest records submitted by state and local law enforcement. A report issued by the Government Accountability Office last year noted that these records often lack information about whether an arrest led to a conviction.
The FBI works with local law enforcement to keep those records up to date, according to DOJ and FBI officials. However, a 2013 report by the National Employment Law Project, which signed the coalition letter, found that some 50 percent of criminal background records in IAFIS don’t include information on their final disposition.
It's not just civil liberties groups making that case. In an April speech at the National Press Club, Senate Judiciary Committee chairman Chuck Grassley (R) of Iowa said inaccurate background checks could hinder job candidates, especially if an arrest that never led to a conviction comes up in the process.
“It’s unfair that an arrest – not resulting in a conviction – is included in a criminal background check,” Senator Grassley told reporters. "There are flaws in that process that need to be looked [at] and changed.”
What's more, the FBI hasn’t tested NGI’s algorithm for racial bias, according to Alvaro Bedoya, executive director of the Center on Privacy and Technology at Georgetown Law, which signed the letter. Research published by the Institute of Electrical and Electronics Engineers in 2012, coauthored by an FBI technologist, found that facial recognition algorithms tend to have lower accuracy rates when trying to match the faces of women, young people, and African Americans.
Taken together, inaccurate arrest records and facial recognition matches raise the possibility that NGI could include inherent biases against immigrants and people of color, potentially blocking people from employment who have never been convicted of a crime, according to the coalition letter.
While the FBI’s move to exempt NGI from privacy laws could deepen those concerns, it might not set a precedent. IAFIS is also exempt from key Privacy Act provisions, and the FBI has set up an alternative system for people to obtain and correct their records. That’s necessary because some provisions of the Privacy Act could allow criminals to uncover ongoing investigations or learn about government witnesses, the FBI said in its notice of the proposed exemption.
The coalition acknowledged that point in its letter. But advocates have also pointed to problems with the FBI’s current process, including low public awareness and lack of a firm timeline for correcting inaccurate information.
“[T]he alternate system has no timeline for corrections, while the Privacy Act system has a set timeline that requires disputes to be resolved in a few months,” said Mr. Bedoya of the Center on Privacy and Technology at Georgetown Law in an email.
The FBI says the current procedures that apply to IAFIS, and that will apply to NGI, give people ample access to their records. Moreover, most of the records in NGI will come from state, local, and tribal law enforcement, the agency says, and people can pursue corrections with those agencies directly.
“In most instances, a person would be aware that his fingerprints and/or photo were taken either incident to arrest or in conjunction with an application process. The person may also request his criminal history for review pursuant to the regulations cited above,” said DOJ and FBI officials. “In this manner, individuals do have access and amendment rights to their information in NGI.”