In the summer of 2016, then-Director of National Intelligence Lt. Gen. James Clapper (Ret.) began to detect a worrisome trend. It was an election year, and he was seeing an uptick in Russian misinformation and disinformation activity that went well beyond the usual playbook. Russia had been known to use disinformation ahead of previous U.S. elections, so the activity wasn’t shocking at first; the DNI and other leaders within the Intelligence Community expected a certain amount of ambient Russian activity. But not on the scale Clapper was seeing play out.
“Russia has a long history of interfering in their own and others’ elections, but never on a scale, aggressiveness, or breadth of scope as they did in our 2016 election,” Clapper told The Cipher Brief last week.
Clapper wrote extensively about what he saw in his 2018 book, Facts and Fears: Hard Truths from a Life in Intelligence, including how Russian hackers began employing tactics around social media and how quickly they were spreading across the accounts of US persons, which introduced a problem for the IC. “The reason we were slow on the uptake in that is that there is a reticence about monitoring U.S. communications, even if they’re publicly available,” Clapper told us.
The details of Edward Snowden’s 2013 theft and leaking of intelligence secrets were still fresh in his mind, along with a clear understanding of how messages about what the government was actually doing, and under what authority, could easily be lost in the media noise.
“After being burnt by Snowden on that sort of thing, I wasn’t as aggressive as I should have been in pushing the community to pay attention to what was going on in social media,” said Clapper.
What Clapper detected in 2016, and what persists today, has been fueled by another growing phenomenon that a recent RAND study calls ‘truth decay’: the diminishing role that facts and analysis have played in America – not just over the course of one presidential administration – but over the last two decades. RAND defines ‘truth decay’ as an increasing disagreement over facts that blurs the line between opinion and fact, with opinion now seeming to carry more influence than fact in American society. The result has been a decline of trust in what were long considered ‘trusted sources’.
“There is a phenomenon where we disregard facts, empirical data, and objective analysis, and this is only proliferated by social media where people live in different reality bubbles,” Clapper told us. “This problem is not going to be able to be solved by a whole of government approach, it needs a whole of society effort.”
What would the result of that whole-of-society effort look like? Clapper has an idea about that. “The basic format may be modeled after intelligence community products. The PDB [Presidential Daily Brief] could be one format. That would be a one-page article and would certainly be useful. I also think more in-depth products like the NIE [National Intelligence Estimate] would provide more depth. Overall, I would model these products after what the community produces now and there are a variety of formats and templates that could be used for that.”
Let’s talk about it. Read the Background Brief below and then join us Wednesday, February 24 at 1:30p as we get a briefing from Lt. Gen. Clapper on why something like this is needed in today’s intelligence community and what’s at risk if we don’t get this right. Members receive registration links via email. Not a member? That’s an easy fix.
Background Brief: A Key Overview of Misinformation and Disinformation
Misinformation is the communication of false or misleading information without a specific intention to deceive.
Disinformation is a specific type of misinformation in which false information is intentionally disseminated in a deliberate attempt to deceive. Simply put, disinformation is misinformation spread with intent to deceive, and that distinction is crucial to understanding the two overarching terms.
- A 2018 study by researchers at MIT found that “false news” spreads faster than real news on social media. The study, which focused on Twitter, highlights a rising concern: misinformation, by its very nature, is more susceptible to virality. MIT
- False information posted on Twitter was found to be 70 percent more likely to be retweeted and shared than true information. MIT Sloan
- Technology has enabled foreign actors to covertly sow division within democratic institutions across the globe.
2020 U.S. Presidential election:
- Russia’s efforts to influence the 2020 U.S. presidential election were less successful than in 2016, due in part to the coordination between the FBI and social media companies like Facebook and Twitter in identifying and removing foreign-based disinformation accounts. (NBC)
- Leading up to the 2020 U.S. presidential election, Russia’s Internet Research Agency recruited American freelance journalists to write articles for a Russian-owned news site, called Peace Data, that were designed to divide Democratic voters. (Washington Post)
- Russia amplified claims of voter fraud through mail-in voting in an effort to cast doubt on the legitimacy of the 2020 election. (FPRI)
Recent U.S. government approaches to disinformation:
- 2017: Russia’s Sputnik News led a disinformation attack that caused trouble for U.S. forces in Germany. This prompted the U.S. Army’s Europe Command to create a dedicated team, known as the Mis/Dis Tiger Team, to combat the spread of disinformation. C4ISR
- 2019: DoD had been steadily revamping its information operations capabilities given the rise in misinformation since 2016, but the U.S. government’s traditional view of conflict has made it difficult to combat misinformation campaigns that target the U.S. civilian population. C4ISR
- 2020: The U.S. and its allies worked to message audiences in Iraq and Syria, building relationships in the region and reaching out to journalists. The effort concluded that while social media disseminated information faster, news outlets reached more people. That finding shaped the approach in 2020, with some suggesting that troops on the ground be given cameras and reliable WiFi so they could upload content to counter disinformation efforts. C4ISR
U.S. tech firms in combating misinformation:
- March 2020: The World Health Organization described the pandemic as an information crisis as much as a health crisis, saying that disinformation about COVID-19 was spreading faster than the virus itself.
- Sept 2020: Twitter publicly announced that it would label and remove posts that spread disinformation.
- Oct 2020: Twitter began altering its misinformation labels and attempting to address violators in a more timely manner. Reuters
- Nov 2020: Facebook, Twitter, and Google joined the non-profit Full Fact in an effort to better combat misinformation about COVID-19 and the conspiracies surrounding it. The Guardian
- Jan 2021: The Google News Initiative launched a project against misinformation about COVID-19 vaccines. A large part of the program is dedicated to fact checking and disseminating information to groups commonly targeted by misinformation campaigns. Reuters
Cipher Brief Interns Maxx Annunziata, Alexis Laszlo and Brian Hoffarth contributed research for this piece.
Join us for a private briefing on Misinformation with Former Director of National Intelligence Lt. Gen. James Clapper (Ret.) on Wednesday, February 24. Cipher Brief Members receive registration links via email.