CIPHER BRIEF REPORTING — The former head of the U.S. Cybersecurity and Infrastructure Security Agency (CISA) could scarcely be more clear in his evaluation of America’s upcoming 2024 elections: They are not only awash with threats, said Chris Krebs during a recent Cyber Initiatives Group Summit, but they are also likely to exhibit new and distinct characteristics that set them apart from past elections.
“This is going to be about as sporty of an election from a threats perspective as we've seen,” Krebs noted. "Even more so than 2016 and 2020.”
And yet, he posited, “it's going to burn really hot, but fast ... almost like Ebola.” Likening the spread of disinformation to a pandemic, Krebs described coming threats as dissimilar from “COVID, which burns longer and broadly." "This," he added, "will burn out really fast ... and not make the jump from one ideologically aligned community to the next.”
Still, the approaching hydra of cyber menaces, expected to be fueled by artificial intelligence, includes not only critical infrastructure strikes, he added, but also disinformation campaigns intent on sowing discord and undermining confidence in the nation's electoral process.
“The technologies have advanced enough that you can see totally manipulated messages dropped into a society," noted Sue Gordon, former Principal Deputy Director of National Intelligence. The danger therefore becomes a public that may find itself in a circumstance in which it almost simultaneously "trusts nothing and trusts everything."
“That lack of discrimination creates fertile ground,” she added.
Adding to those concerns is a coming change to certification guidelines for voting machines, about which election officials have already expressed their concerns. In a March letter, the National Association of State Election Directors expressed “serious concerns that false information will mischaracterize the consequences” of the changes, set to take effect on November 15, roughly one year ahead of the election. “All their public communications must be unambiguous.”
Those changes, formally agreed upon by U.S. Election Assistance Commission officials back in 2021, are meant to both increase accessibility and strengthen the cybersecurity of the machines and systems themselves. And yet, according to a report unsealed last month as part of a controversial court case in Georgia over identified weaknesses in its Dominion Voting machines software, officials there said they will nonetheless delay upgrades until after 2024. CISA, last year, published an advisory that urged officials to address those risks “as soon as possible,” while a redacted court-provided version of a report on ImageCast X voting machines recently offered a particularly damning account of the integrity of those systems. “No grand conspiracies would be necessary to commit large-scale fraud, but rather only moderate technical skills of the kind that attackers who are likely to target Georgia’s elections already possess,” the report noted.
Meanwhile, as concerns over voting machines mount, CISA’s election security effort is set to receive new leadership.
Kim Wyman, who was selected for the role in October 2021 after nearly a decade as Washington’s secretary of state, is scheduled to step down at the end of July. Cait Conley, a senior advisor to the CISA director, who previously served as executive director of the Defending Digital Democracy Project at Harvard University’s Belfer Center, has been tapped to take her place.
The task is considerable.
In an interview at the Aspen Institute, CISA Director Jen Easterly said ransomware attacks on election infrastructure were among the agency's top concerns, alongside physical security, insider threats, and threats of foreign influence and disinformation.
Still, the growing and broader emphasis on generative artificial intelligence looms large, with the tech's influence made clearer recently by a series of events, ranging from seemingly falsified images of former President Donald Trump and former Chief Medical Advisor Anthony Fauci, locked in an embrace, to phony imagery of the aftermath of a fabricated attack against the Pentagon.
A.J. Nash, a cybersecurity intelligence executive at the firm ZeroFox, said he worries about an avalanche of malign digital content, including deepfakes and audio falsification, powered by the emerging tech. “We’re not prepared for this," he said. "To me, the big leap forward is the audio and video capabilities that have emerged. When you can do that on a large scale, and distribute it on social platforms, it’s going to have a major impact.”
As Krebs noted, that pivotal change centers on something called “synthetic media,” or media that has been at least partially engineered using artificial intelligence — a phenomenon in which algorithm-produced media can take on new qualities outside the confines of developers' initial frameworks. “We have very little control over how AI is being deployed on the web and how people are building products,” Maggie Appleton, a product designer at the AI research lab Ought, told NBC. Such content, she and others note, can be produced quickly and with relatively little effort and cost, by a wide cast of characters ranging from legitimate campaign workers to more mainstream threat actors.
All of it constitutes a growing task list for CISA ahead of Election Day.
In fact, when U.S. election systems were classified as “critical infrastructure” in 2017, Easterly told podcaster Kara Swisher, CISA took on a gamut of new responsibilities. “Part of CISA does things like the Office for Bombing Prevention, school safety, chemical security," she noted. "These are not the entities that make the headlines, but they're critical, particularly to our stakeholders around the country and in particular to election officials.”
Easterly recalled that CISA worked with these officials in the run-up to last fall’s mid-term elections, conducting physical security assessments of polling places and offering training on how to de-escalate “potentially inflammatory situations of people coming into a polling place or an election site.”
With election campaigns now unfolding in earnest, Easterly said CISA has not identified specific threats for 2024, but noted, “quite frankly, I think we will.” In January, she met with secretaries of state and state election directors, emphasizing the need for “our field forces [to go] out there to do cybersecurity assessments, to do physical security assessments, and to ensure that resources are available.” Of particular concern, she noted, were China’s emerging capabilities in the realm of disinformation. Chinese doctrine, Easterly added, has an intentional focus on “cognitive domain operations” through which Beijing seeks to influence the American public, which can “make things even more complicated.” But, she continued, “I want to emphasize the reason that we focus on foreign influence disinformation is because we hear from state and local election officials [that] it is a major concern of theirs.”
Calls to address the phenomenon from a regulatory perspective, however, are growing, with Biden administration officials tackling an array of associated issues, including directing federal agencies to root out bias in artificial intelligence technologies.
And yet complicating security efforts, experts note, is the lack of comprehensive and agreed-upon guardrails. That deficiency, at least in part, prompted the Brennan Center for Justice to propose a menu of potential actions to improve detection accuracy and fortify against malicious phishing tools, especially during election season.
“Elections are particularly vulnerable to AI-driven disinformation,” the center noted in its analysis.
Meanwhile, the countdown clock to Election Day is ticking.
The Cipher Brief's Cyber Editor Ken Hughes contributed to this report.
Read more expert-driven national security insights, perspective and analysis in The Cipher Brief