Leveraging Uncertainty for Strategic Cyber Deterrence

By Gentry Lane

Gentry Lane is the CEO & founder of ANOVA Intelligence, a cyber national security software company. She is also a Fellow at the Potomac Institute for Policy Studies and a Visiting Fellow at the National Security Institute at George Mason University’s Antonin Scalia Law School. She is a recognized subject matter expert on cyberconflict strategy and advises members of Congress, NATO, and U.S. defense and intelligence agencies.

If the right to reasonable privacy is a nonnegotiable, weight-bearing pillar of democracy and omniscient surveillance is the hallmark of authoritarianism … is there an in-between?


PRIVATE SECTOR PERSPECTIVE — Legislators are calling for mandatory disclosure of cybersecurity events in previously unregulated industries. On the surface, this seems like a reasonable way for defense and intelligence agencies to acquire more data on adversary activity in the civilian sector: with more data on hand, more actionable intelligence can be generated. But that is true only under certain conditions. In reality, this type of data acquisition and synthesis is complicated, because the input data must be uniform and pristine or the resulting intelligence will not be accurate. Legislators who expect under-resourced security teams with disparate discovery and verification protocols to produce timely, untainted, unified data show a naïve understanding of what it takes to turn raw data into viable intelligence. Pristine data obtained via required-disclosure regulation is an unreasonable expectation, and it will yield unviable intelligence.

The alternative is surveillance: automated, first-party collection of raw data for synthesis into actionable intelligence. When authorities acquire data directly via first-party collection methods, data integrity and signal fidelity are more likely to remain intact, resulting in more viable, accurate, actionable intelligence. But Americans have a hypocritical relationship with surveillance. While it is considered highly objectionable for heavily regulated government authorities to conduct domestic surveillance for the prosocial purpose of national security, there is little objection to the largely unregulated private sector conducting granular, persistent digital surveillance for the purpose of promulgating consumerism.

In America, surveillance is situationally acceptable. Privacy is most valued when there is a perceived risk that illicit behavior will be discovered and/or when culpability is present. Privacy is not valued when potential culpability is not a factor and any measure of law compliance or risk of arrest is absent.

Surveillance, in some form, is fundamentally essential for cyber national security. Defense and intelligence agencies require timely insight into advanced persistent threat (APT) activity within the inviolable homeland to uphold their security missions. But even with express consent, omniscient surveillance is impossible at national scale, and even more untenable given the exponentially expanding cyber domain attack surface and automation of APT aggression. There are simply too few eyes for the scope, scale and frequency of security events.




A similar conundrum was addressed in the work of the 18th-century philosopher and social reformer Jeremy Bentham, who sought a way to surveil and manage Britain’s booming prison inmate population. Bentham was the architect of the Panopticon, a prison structure that permits a single guard to observe thousands of inmates at once. Its novel architectural design afforded visual surveillance of prisoners at a previously unattainable scale, and it was reliably effective at deterring unruly behavior in a delinquent population.

The key to prisoner compliance in Bentham’s Panopticon was not omniscient surveillance, but uncertainty. Even though prisoners plainly understood that a guard could not watch every individual at all times, the uncertainty over when and where observation occurred was the deterrent that created a state of orderly compliance.

The parallels between Bentham’s Panopticon and current cyberconflict are notable. In the Panopticon, the guard’s objective was not omniscient surveillance; the purpose of surveillance was to assure persistent, sustainable, orderly incarceration. In other words, business as usual. Similarly, defense and intelligence agencies require insight into private space, not for omniscient surveillance or for law enforcement purposes, but to obtain the APT behavior data required to assure a state of persistent, sustainable national security. In other words, to assure business as usual in this dual-use domain.




Another striking similarity between Panopticon prisoners and APTs is the relative indifference of those under surveillance toward credible threats of punishment. The threat of punishment carries little weight both for those already serving a punitive sentence and for those beyond the reach of the American legal system. Despite this indifference, panoptic surveillance and its inherent uncertainty deterred noncompliant behavior without any explicit threat of further punishment.

At present, APTs are successfully exploiting the size, scale and relative anonymity afforded by the cyber domain. They know hardly anyone is watching, and they know where the watchers reside. They reliably evade federal authorities’ surveillance and data-collection methods and take refuge in endpoints, the most intimate area of the attack surface, precisely because they know that U.S. government surveillance is absent there. The current U.S. cyber warfighting force does not now, and could never, scale to the level required for omniscient surveillance. It is Bentham’s conundrum: there are simply too few eyes for the scope and frequency of aggression.

This raises the question: what would a panoptic construct look like in cyberspace, a domain where forensic surveillance is available but traditional visibility is not? It would certainly require visibility into domestic endpoints, the safe harbor where APTs reside free from observation. Privacy-preserving artificial intelligence and machine learning (PPAI/ML) provides part of the solution. PPAI/ML employs techniques that assure privacy at every level of analysis. Unlike anonymization, which can be reverse-engineered to assign attribution, PPAI/ML is truly indifferent to the identity and digital content of those being monitored.
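To make the idea concrete, the following is a minimal sketch, not the author’s design or any specific product’s method, of one technique commonly grouped under the PPAI/ML umbrella: differential privacy. Anomaly flags from many endpoints are aggregated and published only with calibrated noise, so analysts see the trend while no single endpoint’s behavior can be inferred. All names, thresholds, and the simulated telemetry below are illustrative assumptions.

```python
"""Illustrative sketch: differentially private aggregation of endpoint anomaly flags."""
import random


def laplace_noise(scale: float) -> float:
    """Draw Laplace(0, scale) noise as the difference of two exponential draws."""
    rate = 1.0 / scale
    return random.expovariate(rate) - random.expovariate(rate)


def private_anomaly_count(per_endpoint_flags: list[bool], epsilon: float = 0.5) -> float:
    """Return a differentially private count of endpoints flagging APT-like behavior.

    Each endpoint contributes at most 1 to the count (sensitivity = 1), so Laplace
    noise with scale 1/epsilon bounds what the published statistic can reveal about
    any individual endpoint, while the aggregate trend remains usable.
    """
    true_count = sum(1 for flag in per_endpoint_flags if flag)
    return true_count + laplace_noise(scale=1.0 / epsilon)


if __name__ == "__main__":
    # Simulated telemetry: True = endpoint reported anomalous behavior, False = quiet.
    flags = [random.random() < 0.02 for _ in range(10_000)]
    print(f"Noisy aggregate: {private_anomaly_count(flags):.1f}")
```

Federated learning, secure multiparty computation, and homomorphic encryption are other techniques commonly described as privacy-preserving; the article does not specify which it has in mind.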

Panoptic surveillance requires demonstrable proof, and persistent accountability, that any measure of law compliance or risk of arrest is absent. This does not mean blanket immunity for criminal behavior by the party under surveillance, but rather public accountability and affirmation of the surveilling authority’s indifference toward criminal activity, supported by a lack of enforcement.

Assuming a panoptic construct is implemented at scale, how would defense and intelligence agencies leverage the ensuing uncertainty to assure “business as usual”? The uncertainty only has its intended effect if APTs understand that they are under surveillance. The paucity of traditional visibility inherent to the cyber domain requires easily verifiable proof of panoptic presence (i.e., if you can’t physically see the jailer, how can you know they’re at their post?).
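The article does not prescribe how such proof would be furnished. As one hedged illustration, a surveilling authority could periodically publish a signed “monitoring beacon” that anyone, including an adversary, can verify against a published public key, without the beacon revealing what, if anything, was observed. The sketch below assumes Ed25519 signatures via the third-party Python cryptography package; the statement format and hourly epoch scheme are invented for illustration.

```python
"""Minimal sketch of a publicly verifiable 'monitoring beacon' (illustrative only)."""
import time

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# The surveilling authority holds the private key; the public key is published
# so anyone can check the beacon without learning anything about what is observed.
authority_key = Ed25519PrivateKey.generate()
published_public_key = authority_key.public_key()


def issue_beacon(epoch: int) -> tuple[bytes, bytes]:
    """Sign a statement asserting that monitoring was active during the given epoch."""
    statement = f"monitoring-active|epoch={epoch}".encode()
    return statement, authority_key.sign(statement)


def verify_beacon(statement: bytes, signature: bytes) -> bool:
    """Return True if the beacon was genuinely issued by the holder of the private key."""
    try:
        published_public_key.verify(signature, statement)
        return True
    except InvalidSignature:
        return False


if __name__ == "__main__":
    # One beacon per hour: an APT operator can confirm the watcher is "at their post"
    # but learns nothing about what, where, or whom is actually being observed.
    stmt, sig = issue_beacon(epoch=int(time.time()) // 3600)
    print("beacon verifies:", verify_beacon(stmt, sig))
```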




Leveraging uncertainty (about mission success, about detection and attribution, and about how much the United States is willing to tolerate) as a deterrent is underemployed in current cyber national security strategy. For APTs, the destabilizing effect of uncertainty is a novel variable in the cost-benefit calculus of cyber operations.

For American companies under cyber assault, panoptic surveillance is a middle option, to the right of authoritarian omniscient surveillance and to the left of ineffective, self-reported mandatory disclosure. Finding the balance between a highly desirable end (a cessation or significant de-escalation of cyber domain hostilities) and a necessary but odious means (surveillance) is complicated. But because diminished reasonable privacy is acceptable when certain conditions are met, and because uncertainty retains its deterrent power over a population that is indifferent to the threat of punishment, panoptic cyber domain surveillance merits further consideration.
