Before posting comments on the government’s online e-rulemaking systems, the public could soon be prompted to answer a new question: “Are you a robot?”
Agency officials told lawmakers they are exploring technologies like CAPTCHA and reCAPTCHA, which prompt users to prove they are human, and developing new approaches to restore Americans' ability to leave digital feedback on federal comment platforms—and their trust that their comments are seen and their voices heard.
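To illustrate how a CAPTCHA check fits into a comment pipeline, here is a minimal sketch of server-side reCAPTCHA verification: the browser widget gives the user a token, and the server posts that token to Google's documented `siteverify` endpoint before accepting the comment. The function and variable names, the 0.5 score threshold, and the overall structure are illustrative assumptions, not any agency's actual implementation.

```python
import json
import urllib.parse
import urllib.request

# Google's documented server-side verification endpoint (reCAPTCHA v2/v3).
VERIFY_URL = "https://www.google.com/recaptcha/api/siteverify"


def verify_token(secret: str, token: str) -> dict:
    """POST the user's CAPTCHA token to the verification endpoint and
    return the parsed JSON result, e.g. {"success": true, "score": 0.9}."""
    data = urllib.parse.urlencode({"secret": secret, "response": token}).encode()
    with urllib.request.urlopen(VERIFY_URL, data=data, timeout=10) as resp:
        return json.load(resp)


def accept_comment(result: dict, min_score: float = 0.5) -> bool:
    """Decide whether to accept a comment from a verification result.
    reCAPTCHA v3 also returns a 0.0-1.0 "score"; v2 omits it, so a
    missing score falls back to the "success" flag alone."""
    if not result.get("success", False):
        return False
    score = result.get("score")
    return score is None or score >= min_score
```

In practice the server would call `verify_token(SECRET_KEY, request_token)` and pass the result to `accept_comment` before storing the submission; a failed check would reject the comment as likely bot traffic.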
“There’s no question millions of fake comments are coming in and you guys talked a little bit today about, to keep bots from posting on your platforms, it might make sense to install CAPTCHA or other software,” Chairman Rob Portman, R-Ohio, said Thursday at a joint hearing of the Permanent Subcommittee on Investigations and the Subcommittee on Regulatory Affairs and Federal Management. “Why wouldn’t we turn to that?”
Federal oversight of the government’s comment systems is twofold. The Federal Communications Commission is the only agency to run its own independent platform, the Electronic Comment Filing System, or ECFS. Earlier this year, the Office of Management and Budget announced that the General Services Administration would serve as the managing partner of the federal eRulemaking program, assuming responsibility from the Environmental Protection Agency. In this capacity, which was effective Oct. 1, GSA manages regulations.gov and the Federal Docket Management System and provides each participating federal agency with the ability to electronically access and manage rulemaking programs and comments.
But as lawmakers pointed out, it’s becoming increasingly difficult for agencies to safeguard their federal comment systems from abuse, and many comment sections have become targets for heaps of fake and threatening messages.
“The report that Sen. [Tom] Carper and I put out today should be a wakeup call to all of us in the agencies and here in Congress,” Portman said. “It shows how broken these commenting systems have become.”
At the top of the hearing, Portman announced the release of a bipartisan report that he and Carper, D-Del., produced, detailing how the systems have been exploited and mishandled. Portman and Carper argue that the government has not sufficiently responded to the abuses.
Portman highlighted some of the findings, including:
- thousands of comments submitted using stolen identities without any recourse for the victims of impersonation.
- comments posted by people who are not alive, including Elvis Presley and Richard Nixon.
- comments containing the entire text of the novel War and Peace.
- comments inciting violence against the public and federal officials.
Several lawmakers also referenced a notable rulemaking effort that put the FCC under intense scrutiny in 2017, when bots targeted the ECFS during a public comment period on net neutrality. Sen. Josh Hawley, R-Mo., said the FCC received 24 million comments. Of those, 8 million came from email addresses generated by fakemailgenerator.com, 500,000 were associated with Russian email addresses and more than 2 million were traced back to stolen identities. Many comments were profane bot-generated duplicates, and some contained viruses aimed specifically at people who genuinely wanted to weigh in on the topic.
“Even though these problems have been clear since at least 2017, the FCC has not taken steps to address them,” Portman said.
But the agencies were quick to weigh in on the measures they are taking to tackle the abuse and new verification processes that could help ensure fair and appropriate commenting.
Regarding the transition into its newly assumed responsibility, Elizabeth Angerman, Principal Deputy Associate Administrator of GSA's Office of Government-Wide Policy, said the agency's primary focus is ensuring continuity of service for both agency partners and the general public. To manage e-rulemaking, GSA recently established the Office of Regulation Management to create a more integrated and streamlined federal rulemaking program using modernized technology.
“GSA’s overarching vision for rulemaking modernization is threefold,” Angerman said.
First, the agency aims to integrate data and information technology between the program and other systems to support data analytics. GSA also wants to apply innovative technology solutions to enable public access, accountability and transparency. And the agency hopes to reduce duplication while offering a quality shared service on a standardized and modernized technology platform. When prompted by lawmakers, Angerman added that the agency is considering implementing CAPTCHA technology to safeguard comment systems in the future.
“GSA’s currently looking at all of the options available to us to address some of these issues,” she said.
Ashley Boizelle, deputy general counsel at the FCC, said the agency's IT staff is working to implement various changes to boost the system's functionality and security. The FCC has also launched a cross-bureau working group to lead a review aimed at revamping the ECFS from the ground up, and it is convening roundtables with external stakeholders to ensure the system is secure and resilient going forward.
“We are not currently using CAPTCHA, but it is under consideration by our working group as a means of distinguishing human filers from bots,” Boizelle said.
Toward the end of the hearing, Dominic Mancini, Acting Director of OMB's Office of Information and Regulatory Affairs, emphasized that one of his biggest takeaways from the discussion was the need for a serious conversation with agencies and "technical folks" about exploring the use of CAPTCHA and similar technologies.
“There are some tradeoffs in that technology, but it is one area in which I think we really need to get to the bottom of whether we can provide technologies that can stop the non-human interaction with it,” Mancini said. “So I’ll definitely go forward on that.”