Corporate boards can no longer afford to leave cyber to the IT team. They need to be actively engaged in cybersecurity now, and they need to know how to engage at a strategic level. Particularly at a time when many companies are looking to either acquire or build a cyber component, board members need to collectively focus on understanding cybersecurity in private equity and M&A.
The Cipher Brief recently turned to Rick Ledgett to share his perspective on these issues in a virtual gathering of experts from across the national security space for this members-only briefing.
Ledgett served as deputy director of the National Security Agency from January 2014 until his retirement in April of 2017, culminating a nearly 40-year career in cryptology at NSA and in the U.S. Army. He was the first national intelligence manager for cyber at the Office of the Director of National Intelligence and directed NSA's 24/7 Cyber Threat Operations Center.
The print version of this briefing has been edited for length and clarity.
The Cipher Brief: Can you tell us a little bit about what you’ve been learning since working with private-sector boards? What are the trend lines and new requirements for serving well?
Ledgett: This is a really important topic for me. When I talk to boards of directors and officers of corporations, it's top of mind for them. I'm on the board of M&T Bank as an independent non-executive director and a trustee of the Institute for Defense Analyses. In my roles with both of those boards, I have spent a lot of time talking about these things. One of the questions we used to get a lot was, why should boards care about cyber? You used to just give it to the IT team, and they'd take care of it. The reason you need to care is that it's your fiduciary responsibility to the corporation, and you're also potentially liable.
There has been a spate of lawsuits brought against directors and officers of corporations as a result of major breaches, alleging malfeasance in some cases, or a lack of the proper oversight that boards are supposed to exercise over corporate activities. The lawsuits typically fall into a couple of classes. There are shareholder derivative lawsuits, and without getting too technical, that's when a shareholder brings a suit against directors or officers on behalf of the corporation over something the corporation could have done or should have done but didn't do for various reasons.
The other type is the class action lawsuit. In cases like the Yahoo breach, lawsuits were brought and then dismissed against everybody except the CEO, who was alleged to have known there were problems and not to have addressed them.
There have been suits brought against Target because of their breach, Home Depot, Wyndham Hotels, Equifax, and most recently, LabCorp. LabCorp is interesting because they had two breaches, the first of which was a third-party breach. It was a supplier to LabCorp, and the suit is trying to hold LabCorp liable for the actions of a third party. That's a very big deal because most companies have lots of third-party relationships. I know my bank has 5,000 or so third-party relationships. Understanding the cybersecurity posture of all of those parties is a huge job, and a really important one. You need to care because you're potentially liable, and while there is directors and officers (D&O) insurance, there are limitations to it; it's going to get more expensive and it might not cover everything. So, you really do need to pay attention in the boardroom.
The current COVID-19 pandemic is making that more important. Why? Because a lot of people are working from home, and when you work from home, you're much more vulnerable because you are typically operating outside the corporate infrastructure. If you're really lucky, your company has the ability to VPN in remotely using a virtual desktop infrastructure and operate just as if you were on the corporate network, but many corporations don't have that capability. So, you might or might not have a corporate laptop. You might be using a personal device and connecting into the corporate network, or in some cases, connecting directly to cloud-based services from your home network. That opens up a host of vulnerabilities that cybersecurity staff really need to be aware of and stay on top of.
Because of the limitations of what you can do in a work-from-home scenario on potentially untrusted infrastructure, you can't do everything you would want to do in terms of monitoring or controls, but there are compensating measures you can take, like insisting on multi-factor authentication, insisting on robust auditing, or in some cases, for a transfer of funds, requiring two or three levels of approval. So, you sacrifice agility to make it possible to operate more securely.
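To make that last control concrete, here is a minimal sketch of how a multi-level approval requirement for fund transfers might be expressed in code. Everything in it - the names, the dollar threshold, the approval counts - is invented for illustration and does not describe any bank's actual system.

```python
# Hypothetical illustration of a multi-level approval control for fund transfers.
# Names, thresholds, and rules are invented for this sketch only.
from dataclasses import dataclass, field

@dataclass
class TransferRequest:
    amount: float
    requester: str
    approvals: set = field(default_factory=set)

    def approve(self, approver: str) -> None:
        # An approver cannot approve their own request; duplicate approvals don't count.
        if approver != self.requester:
            self.approvals.add(approver)

    def required_approvals(self) -> int:
        # Larger transfers demand more independent sign-offs (made-up threshold).
        return 3 if self.amount >= 1_000_000 else 2

    def can_execute(self) -> bool:
        return len(self.approvals) >= self.required_approvals()

# A $2M transfer needs three distinct approvers before it can execute.
request = TransferRequest(amount=2_000_000, requester="alice")
for approver in ("bob", "carol", "dave"):
    request.approve(approver)
print(request.can_execute())  # True
```

The point of the sketch is the trade-off Ledgett describes: each added approval step costs agility but makes an unauthorized transfer from a compromised home setup much harder to push through.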
The other issue with work from home that we're seeing is the impact of distractions. When you're sitting at home, you've got kids, dogs, a lawn that needs to be mowed, or someone ringing the doorbell. You have distractions that you don't have at work. And because you're at home, psychologically it feels different and you might not be as alert and forward-leaning with your cybersecurity posture as you would be in an office space. That's an important thing to keep in mind. We're recommending that folks remind their employees more often than usual not to click on links and to be careful who they're sharing information with while they're working from home.
Then, of course, we've seen a lot of press about COVID-themed lures to get people to click. They might offer the latest numbers, or the latest information on drug testing, if you click a link. Those campaigns are driven by criminals who pivoted very quickly, within a day or two of the pandemic becoming widespread.
Whatever their goal, whether it's extorting money through ransomware, stealing PII, or stealing intelligence information, they're going to use whatever topics they can to get people to click, so they can steal credentials and gain access. It's no surprise that they're using COVID-19.
So, what do you do about this if you're a director?
In general, directors should ask questions and insist on coherent answers in a language they understand. What sometimes happens is that people will talk to their CSO or their CIO, and they'll get a bunch of details about the NIST Framework. That's important for a certain group of people, but it's probably not important for directors. Directors need to know how to characterize the risk, just as they look at other risks: legislative risk, regulatory risk, or, in the case of banks, credit risk.
You need to look at cybersecurity in terms of its risk factors. It's not one big blob that says cybersecurity risk, and it's not a hundred different things that say we're at risk with our firewalls and at risk with our intrusion prevention systems. It's looking at the risk of unauthorized, undetected use of stolen credentials. What's our risk of insider threat? What's our risk of losing control of our assets and not being able to recover? What's our risk of being the victim of a ransomware attack?
There are a lot of things underneath those that aggregate up to them, and depending on your organization, it can be hard to get others to think that way. That's where a risk committee can be important. Having someone who understands cybersecurity on the risk committee can really help frame that alongside the other risks. Then, when it goes to the full board, you've got a reasonable, coherent discussion in a language board members can understand that relates to things they're used to talking about.
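As a purely illustrative sketch of that aggregation - the category names, findings, and scores below are invented, not drawn from any real risk register - many control-level findings can be rolled up into the handful of named risks a board actually discusses:

```python
# Hypothetical sketch of rolling control-level findings up into board-level
# risk categories. Categories and severity scores are invented for illustration.
from collections import defaultdict

# Each finding: (board-level risk category, severity score 1-5)
findings = [
    ("stolen credentials used undetected", 4),
    ("stolen credentials used undetected", 3),
    ("insider threat", 2),
    ("ransomware / loss of recoverability", 5),
    ("third-party compromise", 4),
]

def roll_up(findings):
    """Aggregate many control-level findings into a short, board-readable summary."""
    summary = defaultdict(lambda: {"count": 0, "max_severity": 0})
    for category, severity in findings:
        summary[category]["count"] += 1
        summary[category]["max_severity"] = max(summary[category]["max_severity"], severity)
    return dict(summary)

for category, stats in roll_up(findings).items():
    print(f"{category}: {stats['count']} open findings, worst severity {stats['max_severity']}/5")
```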
It's also important to realize absolute security is impossible. If a high-end nation state adversary or a high-end criminal group wants to get into your network, they're going to be successful given enough time and effort. So, how do you detect when that happens? I always credit Dmitri Alperovitch with this model when I use it: the 1:10:60 model. In one minute, I want to be able to detect that there is an unauthorized user on my network. In 10 minutes, I want to be able to isolate that activity, and in 60 minutes, I want it off my network. That's completely impossible. No one in the world can do that now, not anybody, but it's a great goal and it's a great way to frame it when you're talking to boards.
It's easy for them to understand, and then you can nest things like the NIST framework underneath it and communicate what you're doing - the programs you're putting into place - to support the one-minute detection, the 10-minute isolation, or the 60-minutes-to-get-them-off-the-network goal. It gives you a spot on the wall to aim at.
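As an illustration only - the function and field names are hypothetical, not any vendor's telemetry - the 1:10:60 model boils down to comparing three incident timestamps against the detect, isolate, and evict targets:

```python
# Hypothetical illustration of measuring an incident against the 1:10:60 model:
# detect within 1 minute, isolate within 10, evict within 60, measured from intrusion.
from datetime import datetime, timedelta

def meets_1_10_60(intrusion: datetime, detected: datetime,
                  isolated: datetime, evicted: datetime) -> dict:
    """Return pass/fail for each 1:10:60 target, measured from first intrusion."""
    return {
        "detect_within_1_min":   detected - intrusion <= timedelta(minutes=1),
        "isolate_within_10_min": isolated - intrusion <= timedelta(minutes=10),
        "evict_within_60_min":   evicted - intrusion <= timedelta(minutes=60),
    }

# Example: detection took 3 minutes, so the first target is missed but the others are met.
t0 = datetime(2020, 4, 1, 9, 0)
print(meets_1_10_60(t0,
                    detected=t0 + timedelta(minutes=3),
                    isolated=t0 + timedelta(minutes=9),
                    evicted=t0 + timedelta(minutes=45)))
```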
Another thing boards should insist on is having a cybersecurity plan and executing it. Practice, practice, practice. You need to practice it at the tactical level with the people charged with implementing it. You practice it at the management team level so management gets used to the language and the decisions they have to make, and you practice it at the board level. Do all three of those things on a regular rotation. It's really important, because you may have a cybersecurity plan, but if no one's looked at it in a year or two, it's not going to be very helpful to you. Practicing lets you make decisions in a way you can't easily do under pressure with people demanding answers right now. You can think about things, staff them, come up with reasonable answers, vet them, and then say, "Okay, for option A in our plan, we've pre-made this decision and now I've got additional time." If you're in a crisis, what's the thing you need more of that you can't get? More time. The way you get more time is by practicing, and then you compress the timeline on decision making where you can, so you can spend your time on the unforeseen things that arise in a crisis.
The penultimate thing is understanding your third parties and holding them accountable. Third-party risk management is a big issue in the financial sector. How do you understand their vulnerabilities? How do you hold them accountable?
The final thing has to do with contracts. When you write contracts with providers or other suppliers, include breach reporting requirements. You may have heard of a cyber event reported a while back called Cloud Hopper. That was Chinese Ministry of State Security operators who compromised several large managed service providers. One of them was HP Enterprise, and it came out in the reporting that HPE knew about the breach and didn't tell any of their clients. In fact, the people on staff were told, "Don't tell clients about that." That's just unbelievably bad behavior, and you should write into your contracts that providers like that are required to notify you within X number of days of a breach of their systems.
The Cipher Brief: Here's a question from General David Petraeus. "Thank you for your decades of exceptional service at NSA and ODNI, Rick. Do you have a generic model in plain English that lays out the elements you believe should be present in any comprehensive, integrated, and operated cybersecurity solution? And if so, can you share it with us?"
Ledgett: First off, thank you, and same back to you, General Petraeus, for your decades of service. I think the NIST framework nails it. The gross categories are identify, protect, detect, respond, and recover. There's a lot of stuff within those, and lots of different frameworks take them, massage them, and call them different names, but those are the core things you really need to be able to do.
At a macro level, it's having an understanding of the business and what you need to protect. If you are running a production line with PLCs and automated processes in it, that's different than if you're storing a bunch of people's PII and want to protect that. Or your key asset may be your intellectual property or your client list. All of those require a different protection profile. Understanding your business, having a cybersecurity plan that addresses the technical aspects of it, and then having a culture that supports cybersecurity - and that includes culture in the workforce, culture in the management team, and culture at the board level - is key.
The Cipher Brief: Ambassador Joe DeTrani asks, "How well is the government doing in briefing in a timely way those US companies that are being actively targeted by a foreign entity? And is there an offensive option for these US companies to respond to such attacks?"
Ledgett: The briefings are typically done when the government discovers that there is targeting going on. It's usually the Department of Homeland Security or the FBI that tells them. In the case of my old organization, when we would identify that through our intelligence sources, we would wash it through DHS or FBI to strip out the source, but basically tell them, "Hey, this is going on. If you need specifics, look in your logs at this IP address at this time and you'll see the activity that we're talking about." That happens. I think there's perhaps a scaling issue, where the number of folks available to do that might not be sufficient for the number of things that are detected. I'm pretty confident that the worst ones - the most significant ones - do get relayed to the company in one form or another. I've done some of that myself in my previous life.
As for the second question, attacks from the commercial sector are a really, really bad idea for a few reasons. The idea comes up periodically, and there's been legislation introduced for it, but there are three key points that need to be made. The first is that attribution is often wrong when it's done without the full array of intelligence sources and methods. Something will look like it comes from one place, but it's actually coming from somewhere else.
For example, there was the attack on the French TV station after they published some things that were uncomplimentary to President Putin. A group calling itself the Cyber Caliphate supposedly did it, but it was actually the Russian GRU. For a long time in the public sphere, the attack was attributed to this new Cyber Caliphate group. They got the end point wrong.
The second point is that if a private company picks a fight with a nation state adversary, they will lose, and they'll lose badly. They'll lose not just in cyberspace; they can lose in other ways as well. Look at the Russians and what they did in the UK. They sent assassins to poison people on UK soil. If you're some random US company that decides 'I'm going to hack back,' what makes you think they won't do something like that to you if you make them mad enough? You're going to lose that fight every single time.
And the third point is that some nations view offensive cyber activity as an act of war. So, I don't want a US private company potentially provoking an armed conflict with another nation. They won't care that it's company X; they'll say the attack came from the US and that the US did it.
The Cipher Brief: Larry Pfeiffer from the Hayden Center at George Mason University asks, "What are your thoughts on integrating security across IT, personnel and physical? And does that afford greater protection?"
Ledgett: My bank does that, and I think it's brilliant. To really protect the bank, your information, and your corporate information, physical security, IT security, and personnel security need to be treated in a unitary fashion. It does no good to have the best cybersecurity controls, software, and processes on your system if there's a server behind an unlocked door that someone can open and stick a thumb drive into, or pick the server up and walk off with it. You need to integrate those things. It's a best practice.
The Cipher Brief: Leslie Ireland asks, "Hello, Rick. What would be the benefit of greater information sharing between the government and the private sector on trends, intentions, and capabilities, in addition to specific events?" She adds that "private corporations likely are seeing activity that the US government may not be privy to, so greater sharing could be a win-win."
Ledgett: I agree. I'm on the National Infrastructure Advisory Council, and in December we released a recommendation to the president to do just that in the critical infrastructure sectors. We said, start with electrical power, telecommunications, and the financial sector, and have an information sharing center that does more than the former DHS NCCIC was supposed to do - one staffed with high-end talent from the private sector who are cleared and given access to information, so they can say, "This is important to the private sector because of this."
Because quite frankly, when I was operating on the government side and we had threats directed against critical infrastructure, we didn't know whether this particular operational capability was relevant because it was directed against a specific piece of IT gear. Well, do the banks or do the telcos use that particular system? I don't know. I don't know what their architecture looks like. But there are people who do, so we marry them up with that information and let them do it. Most of the attacks on the US happen on private sector networks, so greater visibility there helps to mitigate those attacks.
And if I can knit things together, there's this idea of collective defense that Keith Alexander talks about that I think is exactly right, where you say, "I see an attack on one network in this sector. Everybody in the sector should know about that right away." And then cross sector knowledge as well, because if they're doing it here, eventually they may do it elsewhere.
The Cipher Brief: We spoke recently with General Alexander about collective defense; we'll be publishing that interview in The Cipher Brief as well. It's a great point. One of the things DHS has tried to do is be that connecting point for business, particularly businesses that work in critical infrastructure. Is the government set up the right way to deal with this, given the authorities, the capabilities, and the ability to really connect with private sector companies and build enough trust for an exchange of information? Or are there things we should tweak?
Ledgett: The NIAC works with DHS and is under the DHS umbrella in its CIPAC role. What you're describing is exactly what we're trying to do in order to make that exchange possible. The other thing the NIAC recommended is a regulatory authority - something like what the Nuclear Regulatory Commission has over nuclear power plants - without taking anything away from the SEC, the FCC, and the other sector regulatory authorities. There are gaps in their cybersecurity coverage, and depending on where those gaps are, the new authority would either operate within those gaps or provide cybersecurity expertise to the SEC, the FCC, and the others to fill them in, so they can provide coherent and useful cyber guidance.
The Cipher Brief: Let me just ask you a final question on a slightly different issue, which is, with more people staying at home in this time of great stress on personal and economic and political levels, are we sitting ducks for disinformation campaigns right now?
Ledgett: That's a great question. I think we could be, but we could also take this as an opportunity to get better at not falling for disinformation. Disinformation is such a complex topic, and it's so hard to protect people's minds from information, especially information they self-select as meaning something. A better job by the social media companies of tagging and identifying suspect information would help. They've started doing that, and I think it's a good thing. This is a place where the government needs to not get involved, because there's a slippery slope between tagging information as invalid and censorship, and we don't want that, certainly not in this country. But I do think there's a role for people and local governments to play in helping broaden your mind a little bit. Don't just watch one cable news channel, whether it's MSNBC or Fox, it doesn't matter, and don't just subscribe to one feed. Look at stuff that makes you uncomfortable, read it, and think about it. Then apply the plausibility test: does this sound plausible, or does this sound crazy?
The Cipher Brief: Thank you for this excellent briefing and for taking the time to answer all of our questions.
Ledgett: Thank you. It's always a pleasure to talk with you.