House lawmakers—and, according to those representatives, their constituents—are wary of facial recognition and are working on legislation that would halt its progress as Congress and federal regulators get their arms around how the technology is being used now and put guardrails in place for its use in the future.
Members of the House Committee on Oversight and Reform held a hearing Wednesday, the third in a series about facial recognition technology. At this latest hearing, lawmakers heard from private-sector experts in biometric technology, artificial intelligence and the security and surveillance industry about what key elements future legislation should include.
“It is clear that despite the private sector’s expanded use of technology, it’s just not ready for primetime. Despite these concerns, we see facial recognition technology being used more and more in our everyday lives,” said Oversight Chair Carolyn Maloney, D-N.Y. “Our committee is committed to introducing and marking up common-sense facial recognition legislation in the very near future. Our hope is that we can do that in a truly bipartisan way.”
A congressional staffer confirmed to Nextgov that the legislation is currently in the works and is a priority for the oversight committee but did not have details to share at this time. While exact language was not available, lawmakers and witnesses alike said legislation should look to "pause" the expansion of facial recognition technology until the kinks can be worked out.
Meredith Whittaker, co-founder and co-director of the AI Now Institute at New York University, agreed that the technology is proliferating and noted that misidentification rates and other accuracy issues are improving as market forces push companies to refine their products.
“What we have not seen are … new mechanisms for real consent, not just notice equals consent,” she said, highlighting the difference between signs or other notices that let people know they are being surveilled and affirmative consent, in which people actively give permission for facial recognition technology to be used. “I think we need to pause the technology and let the rest of it catch up so that we don’t allow corporate interests and corporate technology to race ahead and be built into our core infrastructure without having put the safeguards in place.”
Oversight Ranking Member Jim Jordan, R-Ohio, agreed, offering a glimpse into the pending legislation, which he said committee members have been working on for more than a year. He said the bipartisan bill should begin with an assessment of how federal agencies are using the technology today.
“We just want to know which agencies are using this, how they’re using it—to what extent is it happening. We just don’t know that—to what extent is the FBI using it, the IRS, any other agency,” he said. “Second, while we’re trying to figure that out … let’s not expand it. Let’s just start there: Tell us what you’re doing and don’t do anything while we’re trying to figure out what you’re doing.”
“Facial recognition technology has benefits, to be sure. But we should not rush to deploy it until we understand the potential risks and mitigate them,” Maloney said.
Lawmakers from both sides of the aisle voiced agreement on the need for legislation, and all were wary of how the technology will evolve. For Rep. Mark Meadows, R-N.C., faulty facial recognition technology is far less scary than working technology put to malicious use.
“This is where Conservatives and Progressives come together,” he said. “On defending our civil liberties; on defending our Fourth Amendment rights, and it is that right to privacy.”
Meadows pointed out that several members of the committee, including Rep. Jimmy Gomez, D-Calif., were misidentified during a test of the technology on members of Congress.
“If we only focus on the fact that they’re not getting it right with facial recognition, we’ve missed the whole argument because technology is moving at warp speed,” Meadows said. “My concern is not that they improperly identified Mr. Gomez. My concern is that they will properly identify Mr. Gomez and use it in the wrong manner.”
Daniel Castro, vice president and director of the Center for Data Innovation at the Information Technology and Innovation Foundation, offered seven specific issues Congress should address in legislation, including basic data rights. However, he disagreed with fellow panelists when it came to global opt-in requirements.
“While it may be appropriate to require opt-in consent for certain sensitive uses—such as in health care or education—it won’t always be feasible,” he said. “For example, you probably can’t get sex offenders to agree to enroll in it. So, opt-in should not be required across the board.”
Castro also argued against establishing laws that would allow consumers to bring legal action against facial recognition companies, as “that would significantly raise costs for businesses, and these costs would eventually be passed onto consumers.”
Castro said a bill should include additional directives to the National Institute of Standards and Technology to expand its testing programs and call on the General Services Administration to develop performance standards for any technologies being purchased by federal agencies. He also suggested any legislation should include a requirement for law enforcement to obtain a warrant before using any facial recognition technologies in the course of investigations or surveillance.
“Congress should consider legislation to establish a warrant requirement for authorities who track people’s movements, including when they use geolocation data with facial recognition systems,” he said.
Beyond adding a layer of oversight, such a requirement would bring transparency to the process by creating a paper trail whenever the technology is used to surveil someone.