The Human Algorithm: Why Disinformation Outruns Truth and What It Means for Our Future

EXPERT PERSPECTIVE — In recent years, the national conversation about disinformation has often focused on bot networks, foreign operatives, and algorithmic manipulation at industrial scale. Those concerns are valid, and I spent years inside CIA studying them with a level of urgency that matched the stakes. But an equally important story is playing out at the human level. It’s a story that requires us to look more closely at how our own instincts, emotions, and digital habits shape the spread of information.

This story reveals something both sobering and empowering: falsehood moves faster than truth not merely because of the technologies that transmit it, but because of the psychology that receives it. That insight is no longer just the intuition of intelligence officers or behavioral scientists. It is backed by hard data.


In 2018, MIT researchers Soroush Vosoughi, Deb Roy, and Sinan Aral published a groundbreaking study in Science titled “The Spread of True and False News Online.” It remains one of the most comprehensive analyses ever conducted on how information travels across social platforms.

The team examined more than 126,000 stories shared by 3 million people over a ten-year period. Their findings were striking. False news traveled farther, faster, and more deeply than true news. In many cases, falsehood reached its first 1,500 viewers six times faster than factual reporting. The most viral false stories routinely reached between 1,000 and 100,000 people, whereas true stories rarely exceeded a thousand.

One of the most important revelations was that humans, not bots, drove the difference. People were more likely to share false news because the content felt fresh, surprising, emotionally charged, or identity-affirming in ways that factual news often does not. That human tendency is becoming a national security concern.

For years, psychologists have studied how novelty, emotion, and identity shape what we pay attention to and what we choose to share. The MIT researchers echoed this in their work, but a broader body of research across behavioral science reinforces the point.

People gravitate toward what feels unexpected. Novel information captures our attention more effectively than familiar facts, which means sensational or fabricated claims often win the first click.

Emotion adds a powerful accelerant. A 2017 study published in the Proceedings of the National Academy of Sciences showed that messages evoking strong moral outrage travel through social networks more rapidly than neutral content. Fear, disgust, anger, and shock create a sense of urgency and a feeling that something must be shared quickly.

And identity plays a subtle but significant role. Sharing something provocative can signal that we are well informed, particularly vigilant, or aligned with our community’s worldview. This makes falsehoods that flatter identity or affirm preexisting fears particularly powerful.

Taken together, these forces form what some have called the “human algorithm”: a set of cognitive patterns that adversaries have learned to exploit with increasing sophistication.
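To make the pattern concrete, the sketch below scores a piece of content on those three pulls and combines them into a rough likelihood of an impulsive share. It is a toy illustration only; the field names, weights, and scoring are hypothetical and are not drawn from the MIT study or any of the research cited here.

from dataclasses import dataclass

@dataclass
class Content:
    novelty: float               # 0-1: how unexpected the claim feels
    emotional_charge: float      # 0-1: outrage, fear, or shock it evokes
    identity_affirmation: float  # 0-1: how strongly it flatters the reader's worldview

def share_propensity(item: Content) -> float:
    """Combine the three pulls into a rough 0-1 likelihood of an impulsive share."""
    # Novelty wins the first click, emotion adds urgency, and identity
    # affirmation lowers the motivation to verify before passing it on.
    score = 0.4 * item.novelty + 0.4 * item.emotional_charge + 0.2 * item.identity_affirmation
    return min(1.0, score)

# A fabricated outrage-bait claim versus a careful factual report.
false_story = Content(novelty=0.9, emotional_charge=0.9, identity_affirmation=0.8)
true_story = Content(novelty=0.3, emotional_charge=0.2, identity_affirmation=0.3)

print(share_propensity(false_story))  # ~0.88: likely to be shared on impulse
print(share_propensity(true_story))   # ~0.26: more likely to be scrolled past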


During my years leading digital innovation at CIA, we saw adversaries expand their strategy beyond penetrating networks to manipulating the people on those networks. They studied our attention patterns as closely as they once studied our perimeter defenses.

Foreign intelligence services and digital influence operators learned to seed narratives that evoke outrage, stoke division, or create the perception of insider knowledge. They understood that emotion could outpace verification, and that speed alone could make a falsehood feel believable through sheer familiarity.

In the current landscape, AI makes all of this easier and faster. Deepfake video, synthetic personas, and automated content generation allow small teams to produce large volumes of emotionally charged material at unprecedented scale. Recent assessments from Microsoft’s 2025 Digital Defense Report document how adversarial state actors (including China, Russia, and Iran) now rely heavily on AI-assisted influence operations designed to deepen polarization, erode trust, and destabilize public confidence in the U.S.

This tactic does not require the audience to believe a false story. Often, it simply aims to leave them unsure of what truth looks like. And that uncertainty itself is a strategic vulnerability.

If unexamined emotion can accelerate falsehood, then a thoughtful and well-organized response can help ensure factual information arrives with greater clarity and speed.

One approach involves increasing what communication researchers sometimes call “truth velocity”: getting accurate information into public circulation quickly, through trusted voices, and in language that resonates rather than lectures. This does not mean replicating the manipulative emotional triggers that fuel disinformation. It means delivering truth in ways that feel human, timely, and relevant.

Another approach involves small, practical interventions that reduce the impulse to share dubious content without thinking. Research by Gordon Pennycook and David Rand has shown that brief accuracy prompts (small moments that ask users to consider whether a headline seems true) meaningfully reduce the spread of false content. Similarly, cognitive scientist Stephan Lewandowsky has demonstrated the value of clear context, careful labeling, and straightforward corrections to counter the powerful pull of emotionally charged misinformation.
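As a concrete illustration of where such a prompt sits, the sketch below pauses a share to ask the accuracy question before the post goes out. The function names and flow are hypothetical, intended only to show the shape of the nudge, not any platform’s actual interface or the researchers’ own materials.

def accuracy_prompt(headline: str) -> str:
    """Ask the user to judge the headline's accuracy before sharing."""
    return input(
        f'Before you share, does this headline seem accurate to you?\n  "{headline}"\n(yes / no / unsure): '
    ).strip().lower()

def share_with_nudge(headline: str, post) -> None:
    """Insert a brief accuracy reflection into the sharing flow without blocking it."""
    judgment = accuracy_prompt(headline)
    if judgment != "yes":
        # The intervention is the pause itself; the user remains free to share.
        print("Worth a second look: consider checking the source before posting.")
    post(headline)

# Example usage with a stand-in posting function.
share_with_nudge("Shocking claim spreads overnight", post=print)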


Organizations can also help their teams understand how cognitive blind spots influence their perceptions. When people know how novelty, emotion, and identity shape their reactions, they become less susceptible to stories crafted to exploit those instincts. And when leaders encourage a culture of thoughtful engagement, where colleagues pause before sharing, investigate the source, and notice when a story seems designed to provoke, the result is a ripple effect of sounder judgment.

In an environment where information moves at speed, even a brief moment of reflection can slow the spread of a damaging narrative.

A core part of this challenge involves reclaiming the mental space where discernment happens, what I refer to as Mind Sovereignty™. This concept is rooted in a simple practice: notice when a piece of information is trying to provoke an emotional reaction, and give yourself a moment to evaluate it instead.

Mind Sovereignty™ is not about retreating from the world or becoming disengaged. It is about navigating a noisy information ecosystem with clarity and steadiness, even when that ecosystem is designed to pull us off balance. It is about protecting our ability to think clearly before emotion rushes ahead of evidence.

This inner steadiness, in some ways, becomes a public good. It strengthens not just individuals, but the communities, organizations, and democratic systems they inhabit.

In the intelligence world, I always believed that truth is resilient, but it cannot defend itself. It relies on leaders, communicators, technologists, and, more broadly, all of us who choose to treat information with care and intention. Falsehood may enjoy the advantage of speed, but truth gains power through the quality of the minds that carry it.

As we develop new technologies and confront new threats, one question matters more than ever: how do we strengthen the human algorithm so that truth has a fighting chance?

All statements of fact, opinion, or analysis expressed are those of the author and do not reflect the official positions or views of the U.S. Government. Nothing in the contents should be construed as asserting or implying U.S. Government authentication of information or endorsement of the author's views.

