
The AI Threats to Elections You Should (and Shouldn’t) Worry About

SUBSCRIBER+ EXCLUSIVE REPORTING — With artificial intelligence getting smarter every day, cybersecurity experts, election officials and voters have been fretting about the possibility that malicious actors — at home or abroad — might use these automated tools to plunge the 2024 U.S. election into chaos.

Intelligence officials recently warned lawmakers that Russia and China are using AI to sow division in the U.S. Nearly half of Americans believe that AI-generated content will interfere with this year’s election process. And as residents of more than 50 countries head to the polls this year, elections across Europe and Asia have already been rocked by AI.


Related Articles

The Houthi Balancing Act After Israel’s Attack on Iran

OPINION — How will the Houthis respond to the devastating Israeli strikes on Iran? This moment could prove decisive for both the Iran-led axis and [...]
The Attack that Knocked Back Iran’s Nuclear Program

EXPERT INTERVIEW — Israeli airstrikes against targets associated with Iran’s nuclear program early Friday have seemingly dealt a devastating blow as [...]

Dead Drop: June 13

SHOULD GREENLAND AND PANAMA BE (EVEN MORE) NERVOUS: Defense Secretary Pete Hegseth was engaged in a war of words this week with lawmakers during a [...]
Report for Friday, June 13, 2025

9:07 AM ET, Friday, June 13 [...]

China Wants Our Hearts. Literally.

OPINION — China is pre-positioning itself on U.S. networks for disruptive and destructive attacks against our critical infrastructure. In the past [...]