As more details emerge about Kremlin influence and disinformation operations aimed at sowing discord and exploiting divisions in the United States, congressional investigators are taking a closer look at the role top technology companies played in the 2016 election.
The House and Senate Intelligence Committees are meeting in closed briefings with representatives from Facebook, Twitter, and Google to learn more about Russian efforts on social media platforms. Experts told The Cipher Brief that Kremlin-backed disinformation and influence operations go far beyond any one election, targeting vulnerabilities and taking advantage of the U.S.’s free and open cyberspace to exert influence and play on weaknesses in the information sphere.
Daniel Hoffman, a former CIA chief of station, said these Russian operations tap into vulnerabilities, technical and otherwise, to manipulate and amplify divisive themes.
"President Putin holds a black belt in judo, a key principle of which is to use an opponent’s strength against them. Our core strength as a country derives from the First Amendment, freedom of the press, liberty, and our democratic institutions. We are inherently vulnerable to cyber influence operations including disinformation which can gain traction in our free and open cyberspace,” Hoffman, a Cipher Brief expert, said.
Hoffman, whose assignments included a tour of duty in the former Soviet Union, pointed out that there is a long history of disinformation campaigns waged from Moscow.
“The Soviets used to direct their sources to write covert influence newspaper articles, but in today’s interconnected cyber space, the Kremlin can exploit the internet’s instantaneous, asymmetric force multiplier to its benefit,” he said.
Facebook, Twitter, and Google have also been asked to testify publicly before the Senate Intelligence Committee on November 1 to face questions about Kremlin-backed efforts on their platforms, while the House committee expects to hear from the companies in October.
"In the coming month, we will hold an open hearing with representatives from tech companies in order to better understand how Russia used online tools and platforms to sow discord in and influence our election,” Reps. Mike Conaway (R-TX) and Adam Schiff (D-CA), who are leading the committee’s investigation on Russian interference in the 2016 election, said in a joint statement.
Earlier this month, for instance, Facebook said about $100,000 was spent on roughly 3,000 ads between June 2015 and May 2017, linked to about 470 inauthentic accounts likely operated out of Russia.
After meeting with congressional investigators on Thursday, Twitter released some initial findings of its internal investigation. It identified about 200 accounts believed to be associated with the Russian-linked accounts that Facebook flagged for purchasing ads to stoke tensions during the 2016 election, and disclosed that Russian state-owned RT spent $274,100 on U.S. ads in 2016. Sen. Mark Warner, the vice chairman of the Senate Select Committee on Intelligence, called Twitter’s presentation “deeply disappointing” and said it showed an “enormous lack of understanding” about the seriousness of the issue.
John Sipher, who retired in 2014 after a 28-year career in the CIA’s National Clandestine Service, said that “the Russians have been experts at deception and disinformation for decades, but now have the tools to truly weaponize their efforts” with social media.
“The important thing to remember is that their goal has remained consistent— it is to sow discord and weaken the bonds between the U.S. and its allies,” Sipher, a Cipher Brief expert, said. “It does this by confusing issues, making it harder for Americans to discern truth from fiction so that they throw up their hands and assume that all information is suspect. In so doing, they weaken the bonds necessary to maintain a democratic state.”
Russian President Vladimir Putin ordered a cyber and influence campaign aimed at interfering in the U.S. election and boosting then-candidate Donald Trump’s chances, according to a declassified report by U.S. intelligence agencies on Russian hacking and efforts to meddle in the 2016 election.
The assessment detailed Moscow’s multifaceted operation, described as a blend of covert intelligence operations and overt efforts by “government agencies, state-funded media, third-party intermediaries, and paid social media users or ‘trolls.’”
But Russian influence and disinformation operations extend beyond elections. Whether it is amplifying divisions in the wake of the white nationalist rally and violence in Charlottesville, Virginia, or pushing anti-immigration stories, Kremlin-associated users, trolls, and bots continue to be active on social media platforms targeting users in the U.S.
Most recently, as Oklahoma Republican Sen. James Lankford pointed out during a Senate Homeland Security Committee hearing on Wednesday, “we watched, even this weekend, the Russians and their troll farms, their internet folks, start hashtagging out 'take a knee' and also hashtagging out 'boycott the NFL.’”
Laura Rosenberger, director of the Alliance for Securing Democracy and a senior fellow at The German Marshall Fund of the United States, said it is critical for Americans to understand that Kremlin disinformation operations are about boosting narratives and messages that could lead to chaos or to undermining faith in institutions. These influence campaigns are about exploiting divisions that already exist within a society, whether those are religious, racial, political, or other pre-existing vulnerabilities — and much less about propagating a pro-Russian point of view.
"This is not about ideology. This is about supporting extremist views, exploiting divisions, and sowing chaos. I think a lot of people when they hear about Russian influence operations, they think what we’re talking about is always about Russia. Really, most of what we see is sowing and exploiting of divisions and trying to take advantage of that. Most of the stories, the content, has nothing to do with Russia. That makes it even harder to recognize where it’s coming from,” she said.
As Jānis Sārts, director at the NATO Strategic Communications Centre of Excellence, pointed out, “the most effective Russian operations are when they use vulnerabilities.”
“In some cases, it’s minorities, migration, corruption, immigration, social inequality. That is another thing where one has to be able, at the point that the moment of influence starts to happen, to separate what is a real debate within the society and that of the hostile influence who tries to hijack the debate and drive it to cause a particular effect,” he said.
The Russian tactics seen during the U.S. election, and since, are nothing new to Europe, added Sārts, whose center essentially operates as a think tank focused on strategic communications.
“Most of that has been happening around Europe for quite some time,” he said. “So when we look at the U.S. election, the only issue is the size and the risk appetite that was probably something that was of surprise — but as far as the methods and interest to actually attempt to sway opinions, that is nothing new, at least in Europe.”
Rosenberger’s group, meanwhile, recently unveiled its online dashboard, Hamilton68, designed to track Russian influence operations on Twitter.
“What the Russians have been doing is trying to amplify or push messages, in many cases without people realizing the agenda behind that or who is even pushing that message. The idea with Hamilton68 is to show that this is what the Kremlin wants Americans to be talking about and thinking about, and help people decide based on that information how they want to assess those messages,” she said.
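Hamilton68’s account list and scoring methodology are not fully public, so the sketch below is only a hypothetical illustration of the general idea behind such a dashboard: aggregating which hashtags a monitored set of accounts is pushing hardest. The account names, sample data, and helper functions here are placeholders of this article’s devising, not anything drawn from the actual tool.

```python
from collections import Counter
from typing import Iterable

# Placeholder watchlist; the real dashboard's monitored accounts are not public.
MONITORED_ACCOUNTS = {"account_a", "account_b", "account_c"}

def extract_hashtags(text: str) -> list[str]:
    """Pull lowercase hashtags out of a tweet's text."""
    return [word.lower().strip("#.,!?") for word in text.split() if word.startswith("#")]

def trending_hashtags(tweets: Iterable[tuple[str, str]], top_n: int = 10) -> list[tuple[str, int]]:
    """Count hashtag frequency across tweets posted by monitored accounts only."""
    counts: Counter = Counter()
    for author, text in tweets:
        if author in MONITORED_ACCOUNTS:
            counts.update(extract_hashtags(text))
    return counts.most_common(top_n)

if __name__ == "__main__":
    sample = [
        ("account_a", "Everyone should #BoycottNFL right now"),
        ("account_b", "#BoycottNFL is trending for a reason"),
        ("account_c", "#TakeAKnee protests divide the country"),
        ("unrelated_user", "#BoycottNFL"),  # ignored: not in the monitored set
    ]
    print(trending_hashtags(sample))
```

The point of the sketch is the design choice Rosenberger describes: rather than judging individual posts, the dashboard surfaces what a known network is collectively amplifying, so readers can weigh the message knowing who is pushing it.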
Rep. Will Hurd, a Texas Republican who serves on the House Intelligence Committee, told The Cipher Brief that “the tactics, techniques, and procedures of disinformation have been the same since Alexander the Great,” but the tools used to execute this have shifted in this media and tech environment.
“In today’s world, that’s social media. It’s existing platforms that we all use, whether it’s Facebook, Twitter, LinkedIn, you name it. Or, if it’s leveraging messaging in existing press — so planting stories and things like that. Those are the tools that they’re using. We’ve got to be mindful of it, and that’s why we should always be suspicious of anything that comes from the Kremlin or is talking about the Kremlin. And if you don’t know the qualification of the person you get the information from, think twice,” he said.
Mackenzie Weinger is a national security reporter at The Cipher Brief. Follow her on Twitter @mweinger.