Election Workers Are Drowning in Records Requests. AI Chatbots Could Make It Worse
Many US election deniers have spent the past three years inundating local election officials with paperwork and filing thousands of Freedom of Information Act requests in order to surface supposed instances of fraud. “I've had election officials telling me that in an office where there's one or two workers, they literally were satisfying public records requests from 9 to 5 every day, and then it's 5 o'clock and they would shift to their normal election duties,” says Tammy Patrick, CEO of the National Association of Election Officials. “And that's untenable.”
In Washington state, elections officials received so many FOIA requests about the state’s voter registration database following the 2020 presidential election that the legislature had to change the law, rerouting these requests to the Secretary of State’s office to relieve the burden on local elections workers.
“Our county auditors came in and testified as to how much time having to respond to public records requests was taking,” says Democratic state senator Patty Kuderer, who cosponsored the legislation. “It can cost a lot of money to process those requests. And some of these smaller counties do not have the manpower to handle them. You could easily overwhelm some of our smaller counties.”
Now, experts and analysts worry that with generative AI, election deniers could mass-produce FOIA requests at an even greater rate, burying in paperwork the election workers legally obligated to reply and gumming up the electoral process. In a critical election year, when elections workers face increasing threats and systems are more strained than ever, experts who spoke to WIRED shared concerns that governments are unprepared to defend against election deniers and that generative AI companies lack the guardrails necessary to prevent their systems from being abused by people looking to slow down election workers.
Chatbots like OpenAI’s ChatGPT and Microsoft’s Copilot can easily generate FOIA requests, even down to referencing state-level laws. This could make it easier than ever for people to flood local elections officials with requests, and harder for those officials to ensure elections run smoothly, says Zeve Sanderson, director of New York University’s Center for Social Media and Politics.
“We know that FOIA requests have been used in bad faith previously in a number of different contexts, not just elections, and that [large language models] are really good at doing stuff like writing FOIAs,” says Sanderson. “At times, the point of the records requests themselves seems to have been that they require work to respond to. If someone is working to respond to a records request, they're not working to do other things like administering an election.”
WIRED was able to easily generate FOIA requests for a number of battleground states, specifically requesting information on voter fraud, using Meta’s Llama 2, OpenAI’s ChatGPT, and Microsoft’s Copilot. In the request generated by Copilot, the text asks about voter fraud during the 2020 election, even though WIRED provided only a generic prompt and didn’t ask for anything related to 2020. The text also includes the specific email and mailing addresses to which the FOIA requests could be sent.