As the Kremlin wages an unyielding disinformation campaign against the United States and its European allies, Washington is still reeling from Moscow’s interference in the 2016 election and only just beginning to tackle this major national security threat.
Almost a year after election day, there are some signs of life in U.S.-led efforts to counter disinformation. While allies across the Atlantic have long faced off against this Russian weapon, the 2016 presidential election — filled with revelations about the hack of the Democratic National Committee (DNC), the use of Facebook advertisements, and Russian troll armies on social media — opened the eyes of many in Washington to Russian President Vladimir Putin’s preferred tactics.
“When we look at the U.S. election, the size and the risk appetite was probably something that was of surprise — but as far as the methods and interest to actually attempt to sway the opinions, that is nothing new, at least in Europe,” Jānis Sārts, director at the NATO Strategic Communications Centre of Excellence, told The Cipher Brief.
By using “fake news,” social media manipulation and amplification through bots, targeted advertising, weaponized stolen information, and other information warfare tactics, Russia has sought to sow discord, exploit societal divisions, and make an impact on the U.S. political scene.
A U.S. intelligence community assessment found that Putin himself ordered a cyber-enabled influence campaign aimed at interfering in the United States election and boosting then-candidate Donald Trump’s presidential chances.
“I suspect that we will not see any let up from the Russians. They have not suffered for their aggressive efforts to influence U.S. politics,” said John Sipher, who retired in 2014 after a 28-year career in the CIA’s National Clandestine Service.
The Kremlin-backed activities connected to the U.S. election and dezinformatsiya — Russian for disinformation, or the spreading of false or misleading information to sow discord and undermine faith in institutions of power — are currently under investigation in the U.S. by a special counsel and several congressional committees.
But Moscow’s influence efforts have extended far beyond the 2016 election, experts and officials warn.
This year, France, Germany, and the Netherlands all saw attempts by Kremlin-linked actors to influence their elections, while Russia also tried its hand — poorly, according to U.S. Army Europe commander Lieutenant General Ben Hodges — at influence operations tied to military exercises like Zapad 17.
Along with media originating from Russian state-connected outlets like RT and Sputnik, the Kremlin uses cyber capabilities including bots and human sources to amplify influence themes. Notably, Russia has tested anti-immigrant narratives with the aim of driving a wedge among European Union members and within the states themselves.
As for the U.S. information space, Moscow never left. Kremlin-associated users, trolls, and bots continue to be active on social media platforms targeting users in the U.S., stirring the pot on issues ranging from the white nationalist violence in Charlottesville, Va., to NFL athletes taking a knee during the national anthem.
“‘Fake news’ is the fixation of the day, but there is a sort of misconception that this is the only thing there is. Fake news is just one element in a much wider picture,” Sārts noted.
Some of the techniques on display are ripped right from the KGB handbook of the ‘70s and ‘80s — placing false stories or disseminating faked documents were common Soviet tactics — but many of the methods that the Kremlin has wielded most successfully today stem from Western advances in technology, advertising, and media.
“The most effective ones come from the marketing practices that our companies use. You look at the technology, that’s also our technology. Both things actually emanate from our systems, but we’ve not been able to put them together — somebody else has,” Sārts said.
“People sometimes forget that the message with regard to Russian propaganda is not the most important thing,” added Steve Hall, a former senior CIA officer who retired in 2015 and spent much of his career overseeing intelligence operations in the countries of the former Soviet Union and the former Warsaw Pact. “It’s actually just the disruption of fact. And in that, they’ve been extremely successful.”
The U.S. has been slow to take on this challenge, current and former government officials told The Cipher Brief.
"People keep saying that social media companies need to do more — let’s get the U.S. government to do more first before you start beating up industry. Because Facebook has a lot more people and money tied up into policing their accounts than the U.S. government has put against this effort,” a former official in the Obama administration told The Cipher Brief. The official spoke on condition of anonymity to discuss the U.S. government’s response.
The mandate of the State Department’s Global Engagement Center expanded this year, adding a counter-state mission to its initial focus on counter-terrorist messaging. The GEC, which is responsible for countering foreign state and non-state propaganda and disinformation that targets the U.S. and its interests, has been designated by Congress as the lead across the U.S. government.
“Congress put us in the lead. The staff is solely focused on this, and then you enlighten and empower others. Some of it will go wrong, that's inevitable — but this will be the central task of this office. Building a new structure for a new reality is always a challenge in a bureaucracy,” a State Department official told The Cipher Brief. The official was authorized to speak but did not agree to be named.
Beyond the GEC’s State Department budget, Congress made a significant chunk of Department of Defense funding available to the center this year under the National Defense Authorization Act for Fiscal Year 2017.
Only in late summer did Secretary of State Rex Tillerson sign off on a request for that funding, asking for $40 million of the $60 million available to the center from the Pentagon. The Department of Defense is currently reviewing the request, and the center hopes it will go through this year.
"Communications is contentious stuff, and countering propaganda is doubly contentious,” the State Department official said. “Clearly, we've not cracked the code on this. We're in a stage right now working to build up plans and identify expertise. We're gaming out what types of expertise we need in a resource-constrained environment.”
The center will also have another funding source to tap into thanks to the $250 million for fiscal years 2018 and 2019 appropriated for the new “Countering Russian Influence Fund” in the sanctions legislation Congress passed in August. That money could be used not only for countering disinformation, but also for combating corruption, protecting crucial infrastructure, and providing assistance to members of the EU and NATO.
While a lack of funding may have been the problem before, the GEC now faces the opposite issue: a large pot of money without the infrastructure and capacity to handle it. The counter-ISIS and counterterrorism mission is well established, with the center coordinating across the government, building partnerships with NGOs, and testing messaging experiments, but the counter-state effort is still in its very early stages.
“A lot of what we're doing initially is trying to answer questions like, should we put out counter-narratives?” the State Department official said in the sanctioned interview. “There seems to be more of an appetite for inoculation, education, and awareness raising than tit-for-tat responses. It seems that once a narrative has been set, it's really hard to dislodge it. So we may want to instead focus on exposing the mechanisms.”
The center currently has around 70 employees, a number that will likely grow as it takes on its expanding mission to combat disinformation and propaganda from Russia, China, Iran, and other state actors.
“One of our eventual jobs will be to really drill down on the actual impact of all of this. The challenge is identifying where it really impacts our national security. We don't have established standards right now,” the State official noted.
A key part of the center’s future work will also be building a network of NGOs, think tanks, and experts in frontline states like the Baltics and Ukraine — much like it has with groups that counter violent extremism — and providing grant money to their initiatives.
As for practical advice to the U.S., experts say both the technical and narrative strands of disinformation must be considered.
According to Sārts, working on ways to block bot systems from amplifying “fake news” would be useful, but most resources should be dedicated to being “proactive on our own goals, narratives, ideas, values, which I think is more effective than spending the resources in countering.”
The U.S. should leverage its intelligence partnerships with allies to ground its efforts going forward, said Daniel Hoffman, a former CIA chief of station. “For the U.S., first it would be important to engage with our intelligence liaison partners to track what they are learning from Russian efforts to mount covert influence cyber operations including best cyber counter measure practices,” he said.
The best counter-disinformation strategy for the U.S. is still under debate at the GEC. But it is clear the U.S. has to step up its game as the Kremlin — and other adversaries — seek to exploit divisions within society through influence and disinformation operations.
"We're only just beginning to think critically of systems, algorithms, and people who are serving us all of this propaganda and disinformation. But we're not going to have the luxury of just studying this,” the State Department official said. “It’s like the 1940s and 1950s institution building of the U.S. national security apparatus. This office is one of those experiments.”
Mackenzie Weinger is a national security reporter at The Cipher Brief. Follow her on Twitter @mweinger.