Ex-CIA officer Gus Hunt: Shifting towards data-centric security
Recent technological advances and the resulting growth of cyberspace have created new threats to our safety and security. To overcome these, we must build on our experience as explorers while putting safety and security first. Encryption has become the go-to method for protecting our digital lives, and its importance has grown dramatically as ever more of those lives have moved online over the last year.
In the latest episode of under CTRL, we sit down (virtually) to chat with prominent cybersecurity expert and former CIA executive Gus Hunt, who takes us through some of his personal milestones in both cybersecurity and space technology.
Driven by the desire to keep abreast of the latest trends and having held a variety of exciting positions – from aerospace engineer to technology leader at both the CIA and Accenture – Gus has had the rare opportunity to drive the adoption of new technologies in both the public and private sectors.
With this illustrious résumé in mind, we were sure that our interview would take us on a journey – and we were not let down.
Take a look at an overview of the topics we discussed below:
- The main trends in the new commercialized space race driven by wide-scale automation and the use of artificial intelligence
- The parallels between methods used by intelligence services and commercial data exploitation for decoding patterns and behaviors
- Gus’s view of the invalidated EU-US Privacy Shield and why offering additional privacy protections to European companies operating in the US could solve this dilemma
- Gus’s stance on the backdoor debate around allowing law enforcement lawful access to encrypted data
- And finally, we wouldn't want to miss out on getting some insights into the motivations of whistleblowers
Want to hear more about game-changing digital innovation? Check out the last episode of under CTRL about digital projects at Cologne Bonn Airport – and stay tuned for more podcast episodes on Spotify. You can also stay connected with all things Tresorit through Twitter and LinkedIn.
Paul: Welcome to "under CTRL"! I'm your host Paul Bartlett, and on today's episode we have Gus Hunt, former CIA executive. Gus spent a good part of his career with the CIA, ultimately as its CTO, and then worked with Accenture as Managing Director for Federal Services Cybersecurity. We will talk about what it is exactly that intelligence agencies do not know about the public. Can you really be off the grid? Also, we will discuss the potential downsides of a backdoor on encrypted services. Hey, Gus! Welcome to the show!
Gus: Hey, Paul! Thank you! I'm delighted to be here.
Paul: Yeah. Good to see you! And as we mentioned just before we started going: I just see all these wonderful pictures on your wall about where you've been around the world. It's kind of making us feel a little bit travelsick, as it were. Not homesick, but travelsick.
Gus: Yeah, hopefully, we'll get to do it again soon. So...
Paul: That's- indeed. We want to be able to get that travel out there again. So... Gus, it's just- you've got a fascinating story and background. I just touched on some opportunities to look into, how you started off your career. One of the things that is close to my heart is this, you know, fascination with space. And I know that you've got also some affiliation with that in your early years. Been talking to my son about that quite a lot, about the future is bright. Because, you know, we're going to go off this planet. And we're going to do wonderful things out in space. So how could you just- maybe just give us a little introduction about yourself, and about your career. That'd be great.
Gus: Sure! Yeah, no, happy to do so. So as you mentioned, you know, I started my career back in 1979, working for Rockwell International, their Shuttle Orbiter Division. You know. So my first real job- full-time real job out of grad school was to work there. And it's always kind of been a passion of mine. I grew up through the 60s, right? When- with, you know, Kennedy's, you know, race to the moon, and his speeches, and things like that, rousing. And it had a big effect on me, as a child growing up, maybe you want to be an engineer, maybe you want to learn the sciences. You know, it really- you know, when we think about STEM today, impetuses like the- you know, the Polar Programme and things like that really get people excited about what's going on. And so one of the great things that I think space and space exploration has always done is ignite imaginations. And then this- and igniting the imagination in the young people tends to stick with them throughout their lives. And so that's- so, you know, that's how I got started, right? And doing all work in space. Mostly started at manned space flight systems, and then moved over to satellites and orbital transfer vehicles, and all those things like that. You know, along the way, working in the space side of the business. You know, the... space itself is, as you know, the- being viewed as kind of the new high ground, right? You know. Across the globe. And it is- it's such an essential thing for us to be able to have- be able to leverage and take advantage of for many things, right? So think of Global Positioning Satellites and, while they may have started as a military technology, today they are essential to the goods and services across the globe, right? Now airlines fly with them, farmers plant with them, you know, we track, you know, animal herds with it, right? 
And so these technologies take root, then they become part of our overall economic, and cultural, and what-not framework, to the good of many, many, many people. And that's- that I think is- as we're looking at space as we go to the future, those are the things that I get excited about, and see happening yet again, with this new resurgence now through all the commercial launch systems and satellite builders and things like that. So...
Paul: Yeah. Yeah. And where did you graduate from? What was the- did you go to-
Gus: So I went to Vanderbilt University.
Gus: I graduated there in 1979, with my Masters in Civil Structural Engineering.
Paul: Right. Okay, and then your career took another direction completely, and you got involved-
Gus: Yeah, I started working for the CIA in 1985. And I started there as an analyst. And then worked on a variety of issues and everything from- in fact, I went back to do- work in space systems, right?
Gus: Because there- I worked counter-terrorism, counter-proliferation. And- you know, and those things. And then I moved over to the technology side of the house to help build and modernize, you know, the technology base that we operate from, much as all companies need to modernize their technology base. And that ultimately led to my becoming Chief Technology Officer, where I fundamentally brought in, you know, the cloud. You know, brought in the cloud, brought in, you know, AI at scale with IBM Watson, and, you know, those things like that. So it was a terrific place to work. Lots of energy, lots of creativity, lots of- lots of opportunity. So... you know, I didn't plan to be there 28 years, but when I look back on it, it was like over in a flash. So...
Paul: Yeah! Don't you scare me! When you say that, I feel like that's partly happening to me as well, that the time is starting to creep up on me.
Paul: So.. I mean, what was the- just curious, before we move on to the other questions...
Paul: What got you into the CIA? I mean, you were really in such a fascinating field already, and what was the draw for the CIA? Because it was kind of the edge of what you were doing? It was another step forward that draw you into that?
Gus: I think that's probably a great way to describe it, yeah. It was- I- it was... I had an opportunity to do an interview, and talking with the people there, I said, "Wow! This is just absolutely fascinating!" And the work that they're doing... I've always been very much, you know, interested in what are the National Security aspects of the stuff are, even in the space business and things like that. So it really played to what I wanted to be able to do, but also just the number of opportunities that were there, and the- as I said, the opportunity plays- and the creativity, and the opportunity to do something, you know, that could really make a difference all played well with me. So...
Paul: Yeah. And as we were saying just a bit earlier on, I mean, this is all around a time when obviously the governments were invested in the Space Race, right?
Paul: So that's probably- there was a lot of money. Now it's very much turned the tables. We were just saying that earlier. So, before we move on to other things, let's just stick with the space topic for a minute.
Gus: Okay, sure.
Paul: Because I think that's quite fascinating, it's that... back then, yes, of course, satellites. There was the Space Race going on, and then we saw some difficulties that obviously NASA went through. I also remember that in my time, with Challenger and things.
Paul: What do you think about things now? I mean, do you still have that enthusiasm and excitement for space? And do you see major advances now in the commercialization of space, to what it was back then?
Gus: So I- yeah, just a couple of things. Just as you mentioned the NASA Challenger. I was at Rockwell when the first space shuttle went up.
Gus: And watched the last space shuttle go up here a few years ago, you know. So, I went through the whole cycle of that. But when you look at space, I think space is evolving in the way that the rest of, you know, lagging behind a bit. But evolving in the rest of the way that I think the technology evolved in the rest of the commercial space. Right? So, you know, as we had mentioned earlier, you know, in the 60s, the government ran everything, right? And technology was driven by government. Because the cost of investing and driving these things forward, you know, was really a government thing. In the 70s and into the early 80s, it was a big business thing, right? And so big business (inconclusive). And starting in the late 80s into the 90s and into today, really it's driven by the commercial space. And the innovations that have evolved there have been absolutely legendary. And that opportunity space, that opportunity at play I think is what's emerging in the space business, which is the ability for others to think extraordinarily creatively about how- what capabilities can we deliver that space can deliver better than we can deliver terrestrially, you know? How do we begin to apply the technologies and the innovations that are on our terrestrial environment, and move them into space? Just (inconclusive) look at computing, right? Back when I started at Rockwell International, right, working on the Shuttle Program, you know, the most advanced computer, I don't know if- I don't know how old you are, Paul, but the most advanced computer was- there was Z80s. I don't know if you remember the Z80 computer processor.
Gus: And then we got the Motorola 68000, you know. And now, today, look at where we are! And look at how much capacity you can cram into a very small package. And so that's where you're seeing microsats and small sats taking over in the environment and things like that. Because you can make them do things and cluster them, and you can do- you can cover the world much more effectively and other things like that than they could do before. So all this, I think, is at a nascent- but at a nascent point, where the application of commercial innovation and ideas is now- it's not just the government moving this. This is the commercial world can now think about how to do things in a new and different way. And once you can move technology into the hands of a much broader group of people, innovation just grows explosively.
Paul: Yeah. Yeah, yeah. And that's it. And I think the case what we're seeing with the likes of SpaceX and, I think, New Horizons...
Gus: Yeah, absolutely.
Paul: ...and Virgin Galactic as well, is that- we're literally on the cusp, or on the fringe of sending people- more people into space, as space tourism, right?
Gus: Yeah. Yeah. Yeah.
Paul: Because we've made it cheaper.
Gus: Well, you know... it's- what excites me is, you know, sending people in space is cool, and there are things that people can- only people can do in space. But I think the real explosion is going to be in- similar to automation, as it's happening here.
Gus: That's occurring, where... we can get technology and intelligence into space, (inconclusive) take advantage of things. So don't get- humans are exciting into space, and that's always going to be true. And perhaps colonization of the moon, or Mars, or whatever it's going to be is great. But the groundwork is going to be laid, I think, through intelligence and automation, much like you're seeing in the other aspects of what's going on. In social media, in the cloud, and AI, and those things like that. So, yeah.
Paul: Yeah. Yeah. So on that note then, let's come back down to earth. Before we get too carried away in the space race and... a little bit closer to home. And, of course, one of the things around the CIA and always fascinating is the amount of data that you collect, the things that go on. And we've all been mesmerized by Hollywood, and sometimes romanced about what the CIA does.
Paul: But, you know, the reality of it is probably very, very different. But one of the questions I wanted to ask you was: Is it- with today's technology out there, where we're just all on our phones, we're all connected some way or another. I mean, is there still a need to go further? To- are there people that are basically still on the grid, or off the grid? You know, for tracking people. I remember in the late 2000s, after Osama Bin Laden, you know, drones became very much more... into the mainstream media.
Paul: About what they were doing, and how they were tracking that. Because, basically, these people were untraceable or untrackable. So what's your perspective now on the kind of technologies that you see? And we see, of course, a tsunami of social media platforms, and news, and fake news, and things like that. So maybe you can just give me a little perspective about what you're seeing, and what you think about that?
Gus: What's been happening- and it is- the amount of information that is being shared by everybody all around the planet... I'm not sure people actually grasp what actually has happened here in the space. So when you look at the- as you mentioned, the social media platforms, okay? And what people are willing to share on that. When you- you go to the grocery store, and you use your discount card, right? And all that data gets collected. The fact that your credit card company knows more about you than you- most people even realize, right? Which is the fact that not only do they know what you've done, but they know what you're going to do, because if you buy an airline ticket or, you know, hotel, car reservation, things like that, right? If you drive on the highway, right? The toll roads, right? They know where you are. And just simply carrying a cell phone, because cell phones have, you know, the- particularly here in the US, it's mandatory, E911, which allows them to find you in case of emergency, right? So it's very difficult, using any form of technology today, to be truly off the grid. If you're going to be off the grid, you have to move to Alaska, move to... Siberia, move to something. Right? You know, have your own well, your own power. You can't buy power from anybody else. You know- you can't buy water, you can't buy power, you got to produce all of your own of that. You've got to have your own, you know, hunting. You've got to pay everything in cash. And even then, it's probably not 100% capable being truly, you know, off the grid. I- you know, a lot of people will try to do that, but it's just increasingly difficult to do. And like everything else in society, this is a double-edged sword, right? So the ability for, you know, the businesses to actually begin to tailor things that meet your specific needs, is one of the big goals that they have with all this data, right? So they can say, "What does Paul need? What does he want? 
How do we help, you know, Paul?" Now, granted "help" meaning to sell you something, but still...
Gus: You know, you'd rather have something that meets your needs than having to figure out, you know, get something that only partially gets you there. Right? Or something along those lines. So the objectives, I think, are really, really good, if you really think about it. And can be extraordinarily beneficial, from everything- from finding you effective health services, right? More- you know, effective- just, you know, effective services across the board, if you will. You know, that's great! Okay. But the bad, the dark side of this, of course, is that people steal this information and they use it in nefarious ways to take advantage of you, to steal your identity, to do what- you know. And that's just- that's the dual edge that we have. And the most important thing, I think, for people to realize is to think about what information they have out there. Do you really- do I really need to post everything I do on my social media account page? Right? Okay. Or should I show a little restraint sometimes, to make sure that I thought about these things, as opposed to just, you know, rapidly tossing it all out there? Those sorts of things. Because it's not necessarily in your interest to share absolutely everything in your life with anybody and everybody that wants to get on and take a look at your page. You know, and things like that. So... so, that's the problem, I think. And so the advantages are not- are terrific, but not understanding what you're sharing is there. And I think there's a real key responsibility for us as a society and us as citizens in our society to understand what it means to share information, and our information, so that we have thought about it. Right? And this is something that you have to teach kids in school, starting from the earliest days. And it's something that I talked to my kids about. About, you know, "Don't post-", you know, "Don't go out and post something really dumb out there, 'cause the internet is the elephant that never forgets." 
Right? It- you know, once it's out there, it's out there forever. And even under GDPR and the Right To Be Forgotten, you know, as people talked about. How sure are you that that's been forgotten? Right? And where is it? And how far did it spread beyond- you know. So while the social media platform may have deleted it, if somebody else has already downloaded it to their personal computer at home... you know, what do you do about that? Right?
Gus: So. It's out of your control. So those are the types of things that we do. And I think we have to teach this, and I think people need to think about it. So as much as they get taught, you know, "look both ways to cross the street," we need to teach them to look both ways before they post, you know, information out. Okay? Yeah.
Paul: Yeah. Understood. And I think- I mean, with your background with the Central Intelligence Agency, that's an organization that feeds off of intelligence, tries to gather as much intelligence as possible about users. And it's probably become a lot easier these days, especially, you know, going through the 2000s.
Paul: After the September 11th attacks, you know, the amount of information that we could collect, gather from these different let's say malicious organizations like ISIL and Al-Qaeda. But to a certain extent, they'll be using encrypted services, or trying to go under the radar, and making it obviously very difficult to track them down. But I just wanted to understand from your perspective, without obviously going too much into the actual methods of what the CIA do, but everybody is always looking for patterns. And that's a commercial aspect of now-
Paul: A pattern of usage or a pattern of behavior, which you're leaving a digital trail at, right?
Gus: Yeah. Right.
Paul: And I think that's something that's happening now. Is- you know, I mean, is that something that you just see where- yeah, it was a practice within your organization, but now everybody is kind of capitalizing it for their own commercial gain, on top of that, in companies?
Gus: So I would say that in- two things, you know. One is that I think there's a pretty profound misunderstanding about what intelligence can and does do. And here in the US at least, there's some really very, very strict laws about what information is allowed to be collected and not. Right? So those are strictly followed across the board. And there's no general giant vacuum cleaner out there that's going after, you know, all these things. Despite all the movies and everything else that you see that's going on. But the overall objective has remained unchanged since these organizations, you know, were founded, right? Which is that the objective is: How do you look after the equities of our National Security? And how do you ensure that you understand what's happening in the world, so that you basically- it's- whether it's National Security intelligence or it's business intelligence, your objective is to understand what your competitors, what your adversaries, even what your partners are doing and engaged in, so that you can take appropriate steps that are meaningful and matter, and address issues before they actually become a crisis, would really be the goal. As opposed to having to deal with a cr- you know, getting what the DOD says: "Get way Left of Bang!" Right? So if I can stop things from happening before they become a, you know, shooting war or anything else like that, that's really the objective. So what's happening? Why is it happening? Understanding these things. And who is involved? The people... So, you're right, it's all about patterns. But the objective is to really make sure that these things are understood, and we can deal with them in an effective manner. So.
Paul: Yeah. And I think one of the things that I- probably disrupts that flow, when you think about that, and it was a big- let's say very much in the press last year, a big thing around these separation of the EU- or the EU-US ruling, the invalidation of the Privacy Shield.
Paul: So what was your perspective on that? I'd be keen to get that insight from you.
Paul: With what's going on.
Gus: Well, so as I understand it, okay, that- the Privacy Shield was invalidated as a result of the fact that the EU, the court, felt that under US policies and law, that law enforcement and National Security had - I'll use the word - too much access to data without sufficient control. Right? That was one. And that- then second, that there was no ability to have a recourse if somebody felt that that had happened to them. Right? 'Cause one of the rules of the Privacy Shield was to have a- you had to have ability to have a recourse to address your concerns, your grievances or whatever it's going to be. And those were the two issues that came out. I don't know how that's going to get resolved, right? Because I don't see the current policies around National Security changing very much. But there may be mechanisms by which you can provide additional assurances on top of these things that could help meet the needs, in order for commercial business in particular, to be able to share information between their European operations and their US operations or whoever it's going to be, right?
Gus: 'Cause that was really the objective of the Privacy Shield, was this ability for business to- at these multi-national businesses, to be able to integrate their information stuff, so they could be a more effective, you know, player in the market space, right? That's my understanding of, you know, what the Privacy Shield was about. And so I think that what will have to happen is- is that- the US Department of Commerce, who leads this, and the European courts are going to have to sit down and say, "What would be an effective set of additional protections that could be added on top that would make sense for us to be able to move back and re-establish the Privacy Shield as an effective mechanism to meet the obligations of the- European citizens and European law?" So...
Paul: Yeah. Do you- on that point, do you think that the US need something like GDPR on a federal level? Rather than- because there's a lot of protection on state levels.
Gus: Oh, absolutely!
Gus: Yeah, absolutely!
Paul: And they're all different, and I know that.
Gus: Yeah. No, that's the point. The last thing we need are 51, 52, you know, if you include D.C. and Puerto Rico, you know?
Gus: So 52 different privacy laws all operating within the confines of the United States. That would be- I think that would be an unmitigated disaster here.
Gus: Nobody can operate a business effectively that way. So, yes. And so I think the good thing about the California Action and then the others that are in place, that that's going to drive that to occur here. And I think that forms a good basis. I also think the GDPR forms a very interesting basis for them to take a look at, you know, how- you know, how these laws can work, and how to make them effective within the environment here in the US. And I think eventually you'll see it. I'm not going to say it's going to happen this year or next year, but I think within five years at the most, you'll see- I'm hoping in less than three, you'll see some form of a US national, you know, privacy protection, you know, act come into fruition. So.
Paul: Yeah. Yeah. I mean, I just recalled when that kind of took place, and it was- the ruling had passed. Then, suddenly, there was some changes about accessing US companies' web pages.
Gus: Yeah, yeah.
Paul: And there was a restriction saying, "We cannot let you access it." So even for the customer, at the end of the day, who was mostly online, like myself and most people, you know, being able to access certain web pages was no longer available to them. You know, long term, it can be detrimental. So sticking, well, with the- with what we just said around GDPR... and what's going on in the US with regards to different laws... I mean, we've seen this 230 bill coming through. And we've seen the likes of Mark Zuckerberg being dragged in front of the Senate.
Paul: And, you know, talking about these things. But there's like- there was another bill, last year, which was kind of covertly going on, and kind of overshadowed by everything else, which is the bill for access to- like a backdoor access brought in. How do you feel about that? I mean, you advise businesses...
Paul: ...as well as being part on the government side. So you can probably see two aspects. So it's a double-edged sword again, but you can see it from both sides.
Paul: You know, why law enforcement and National Security need it, but also why businesses need- and individuals need to be protected. So what do you think about the law? I was just- (inconclusive) there's the Lawful Access to Encrypted Data Act. That's what it's called.
Gus: Yeah, Lawful Acc- yeah. So... by the way, I don't know if you remember, but this is the second round of this in the US.
Gus: There was a bill back in the- was it early 90s? I guess it was early, mid-90s, when they were trying to enforce a- what was it called? It was called the V- not the V-chip. That's the TV thing. What was- it was called the cl- not the clipper. What's the thing? Anyhow. But I can't- senior moment. Sorry about that!
Gus: But, yeah. They were trying to enforce a single encryption standard chip across all of the US that had a backdoor in it, so that they could make sure that law enforcement had appropriate access to information. And that failed back then as well, too. Right? So I'll start by saying that: Yes, I see both sides. I understand the concerns, particularly on the law enforcement side and things like that. But I'll also tell you that I think putting a backdoor in- mandatory backdoor into encryption is not a good idea. Right? I just- this technology is out of the bag. And any well-resourced cri- I don't know, criminal, network, you know, adversary, whoever it's going to be, knowing that the US systems that have these things in it, are all- they're just going to find an alternative, right? Which doesn't have a backdoor- or a- you know, putatively doesn't have a backdoor in it. And so what happens is that then the backdoor actually becomes, in my book, a weakness that could potentially be exploited to the detriment of business and law-abiding citizens and things like that. Right? Who need encryption to protect themselves. They need encrypted- strong encryption to protect the data. Right? Going back to our discussion on the Privacy Shield, right? So, you know, the fascinating thing about, you know, the, you know, sharing data, of course, is- or storing data is that if it's encrypted, it doesn't really matter where it's stored, because it's- what matters is where it's processed, right? 'Cause when it's encrypted, it's just white noise. And so those- the ability to do that with confidence, I think, will bring things along, and maybe address some of the issues, you know, that we were just talking about previously. But inserting a mandatory backdoor - who controls it? Who has access to it? How do they get access to it? What's the core processes? I mean, there's a whole lot of issues here that are just not known. And then what happens if the key manages to get leaked some place?
Okay? What do you do? You know, so there's just a bunch of issues around putting in a backdoor that just- in my book, just don't make sufficient sense. But I think the- I think given that encryption is ubiquitously available on a global scale - it's not a technology that is solely controlled within the United States, the United States' borders - that we're going to have to look for alternative mechanisms to ensure that law enforcement and others can get their jobs done effectively.
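An editorial aside on Gus's point that encrypted data "is just white noise": a toy stream cipher in a few lines of Python shows both why ciphertext is useless without the key and why, as he notes later, almost any grad student can roll their own encryption. This is an illustrative sketch only – the construction is deliberately naive and must not be used for real security:

```python
# Toy stream cipher: keystream blocks are SHA-256(key || block_index),
# XORed with the plaintext. Illustrative only -- real systems should use
# a vetted, authenticated cipher, and this scheme is unsafe if a key is
# ever reused for two messages.
import hashlib
import secrets

def keystream(key: bytes, length: int) -> bytes:
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def xor_cipher(key: bytes, data: bytes) -> bytes:
    # XOR is symmetric: the same call both encrypts and decrypts.
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

key = secrets.token_bytes(32)
message = b"meet at the usual place"
ciphertext = xor_cipher(key, message)

# Without the key, the ciphertext is statistically indistinguishable
# from random bytes ("white noise") -- so where it is stored matters
# far less than where it is decrypted and processed.
assert ciphertext != message
assert xor_cipher(key, ciphertext) == message  # right key recovers plaintext
```

The same property cuts both ways: whoever holds the key – including any escrow or backdoor key – can turn the noise back into plaintext, so a single leaked backdoor key would expose everything ever encrypted under the scheme.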
Paul: Yeah. Yeah, and I just thought about that today. I think it was either this morning or yesterday, I'd just seen the biggest one- the biggest mafia trials in history going ahead now in Italy.
Paul: And I'm sure that was a campaign to collect as much information as possible. And yet, as you mentioned before, the- encryption's been around, well, forever.
Gus: Well, since Roman times, actually.
Paul: Yeah. Yeah! But they've still managed- they feel that they still collected enough evidence from the sources that they have, and the information that they have, you know, but still with encryption in place, to make a prosecution.
Gus: Yeah. Yeah.
Paul: So... you know, the case for encryption - especially with this case that's going ahead in the US right now - is still really valid. And it's good, because we're advocates of encryption, but it's good to see both perspectives of that. I mean, we understand why law enforcement needs, you know, better- more information, better information, with some of the hideous crimes that have happened. But... it is a double-edged sword, where you potentially open it up for also those malicious actors to take advantage of that as well.
Gus: Yeah. Right. Exactly. And then just imagine, of course, the complexity if you're a business. Let's see... in this country, I can use this encryption, which doesn't have a backdoor. But over here, I have to have a backdoor over here. Have a different encryption that I have to use in a third country over here. It has a backdoor, but it's a different model. So... all this is really- I think weakens the overall ability for us to protect data, and protect our citizens really effectively from malicious, you know, use and things like that. So it's- I just don't see that the backdoor itself is a good idea. But I do strongly support doing additional investments and things like that. Right? And then finally, you should know, you know, assuming quantum computing takes off, right, you know, that the ability for encryption, your standard, traditional encryption, to remain valid, you know, people begin to already worry about that. And while we will have quantum encryption algorithms that quantum computers can't break in any sort of real time - they already exist now, right? You know, type of thing. The fact is, you know, making those transitions and getting that stuff pulled together- how do you put a backdoor into a quantum algorithm? Can you put a backdoor into a quantum algorithm? Interesting question. I don't know the answer to it, you know, type of thing. So the technology itself is racing along. It is not- if it was a technology that was solely the purview of the United States, yeah, maybe they could make something work effectively here. But when you have a global access to technology like- technology like encryption. In fact, almost any grad student in cybersecurity and mathematics can write their own encryption software, right?
Gus: What do you do? Right? You know. So...
Gus: Yeah, so it's just, you know: Great idea. I understand. But let's look for alternatives. Okay?
Paul: Yeah. It's good to hear you say that. I just wanted to jump back, as we talk about encryption. And we're a cloud service with full end-to-end encryption. Not here to plug it, of course... but you mentioned, when we were chatting earlier, that you were responsible for introducing the cloud into the Central Intelligence Agency, and making that shift. Maybe you can tell us what you can about the decision to do that. What were the benefits? Because one of the conversations that I still have with customers, ongoing, all the time, is: Can they really - even still today, can they really trust the cloud?
Paul: Can they really trust putting their information with a third-party provider? Is- you know, are we going to get the levels of service that we need? Is it not going to go missing? You're not going to take the service off?
Paul: All of those questions are valid. But what's your feeling and experience and insight with that?
Gus: Yeah. Well, so all of those questions you just asked are also questions they need to ask if they're running their own data center. Okay?
Gus: And so the advantage of the cloud was- you know, there are a couple of things. One was that the cost of running your own systems, and the ability to react effectively to global events, is highly constrained in your own infrastructure environment, right? So one of the beauties of the cloud was the fact that if I need capacity on demand - that was one of the goals - I can scale up and scale down, and only pay for what I use at that moment in time, right? So I now have on-demand processing that allows me to effectively deal with and react to something that's happening in the world. Okay? That was one. Two was the fact that the velocity of innovation within the cloud environment is just extraordinary, and it's really important for companies to understand how they can really take advantage of that. And as the cloud has matured, the ability for people just to use those services and bring them to bear is almost instantaneous today. Right? I don't have to have whole teams of software developers coding me my own, you know, big data analytics platform. I can just leverage the big data analytics platform that's there. Okay? Third: to the cloud providers, security is an extraordinarily important capability. Because simply, if they were to have any form of a breach or data loss that was the result of something systemic throughout their entire environment - in other words, somebody could get in underneath the control point, and then come up and attack anybody anywhere, and steal their data and things like that - that would be an existential, end-of-business event for a lot of these folks. Right? People would just flee like mad to get off there. So they focus extraordinarily on it, and their security practices, from our analyses when I was looking at it, were really, really good. And they continue to be excellent and getting better.
And they invest enormously to avoid an event like we just described. I'm not saying it can't possibly happen, but it's theoretical. And then, the fear over your data... what most people don't realize is that in the cloud, you have control of your data. Right? You encrypt it, you hang on to the keys - you hang on to your own HSMs - and you just make sure that you control your data. And then finally, the cloud is not the be-all/end-all answer. Right? It's always going to be some form of a hybrid model. And if there are things that are so super sensitive that you're uncomfortable putting them in the cloud, keep them in your data center. But the vast majority of your operations and your capability are likely well-suited to operate and run in the cloud for all the previous reasons. You know? Agility, velocity, capacity on demand. All those things. So those were all the considerations. The considerations when I was looking at it back when I worked at the agency are the same considerations a business needs to look at today as they move on ahead. Okay?
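Gus's point that you keep control of your data in the cloud - encrypt it yourself and hang on to the keys - can be sketched in a few lines. Below is a minimal, purely illustrative Python sketch: the XOR-keystream cipher is a stand-in (a real deployment would use AES-GCM from a vetted library, with the key held in your own HSM or KMS), but the property it demonstrates is the real one: the provider only ever stores the sealed blob, which is useless without the key.

```python
import hashlib
import hmac
import secrets

def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a keystream from key + nonce via SHA-256 in counter mode.
    Purely illustrative -- a real system would use AES-GCM from a vetted library."""
    out, counter = b"", 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    """Seal plaintext client-side; the result is all the cloud ever stores."""
    nonce = secrets.token_bytes(16)
    ciphertext = bytes(a ^ b for a, b in zip(plaintext, _keystream(key, nonce, len(plaintext))))
    tag = hmac.new(key, nonce + ciphertext, hashlib.sha256).digest()  # integrity check
    return nonce + ciphertext + tag

def decrypt(key: bytes, blob: bytes) -> bytes:
    """Reject anything tampered with or sealed under a different key, then unseal."""
    nonce, ciphertext, tag = blob[:16], blob[16:-32], blob[-32:]
    expected = hmac.new(key, nonce + ciphertext, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("integrity check failed")
    return bytes(a ^ b for a, b in zip(ciphertext, _keystream(key, nonce, len(ciphertext))))

# The key never leaves the client; the provider only ever sees the blob.
key = secrets.token_bytes(32)
blob = encrypt(key, b"quarterly financials")
assert decrypt(key, blob) == b"quarterly financials"
```

An intruder who exfiltrates the blob - or a provider subpoenaed for it - holds ciphertext plus an integrity tag, nothing more; decryption with any other key raises the integrity error.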
Paul: Yeah. And as you mentioned earlier about space and the government investment in that, and how it's now turned into commercial companies - you made the point that basically anybody with cloud can literally jump on, start up their own business, and, you know, do very well for themselves.
Gus: Yeah! Right.
Paul: Without having to procure any of their own infrastructure. Do you think now, with the COVID situation - of course, we're seeing a massive change because of it, but I think there are also some great possibilities to come out of it - businesses can restart afresh, renew, and bring technology into their business processes a lot faster by using cloud?
Gus: Yes! Absolutely. Again, it doesn't matter what it is. The fact is that now you can get into the cloud and begin development, and do it very quickly and very cheaply, rather than having to stand up an entire environment of your own, right? So what's the famous saying? Start fast, fail fast, succeed faster. Right? Okay. And basically, that's what it allows you to do. You can do very rapid innovation, very rapid trial and error, come to an outcome, and then continue to mature and grow that, you know, as time goes on, extremely quickly. And as it matures and becomes a more mature thing, maybe you want to go back and re-assess: okay, do I keep it in the cloud here, or do I move that to my own data center? Right? But in my own data center, I've got to get capacity, I've got to- it's a long haul to be able to do these things. All these roadblocks are in my way. And so, you know, that innovation speed is just something that is, I think, so key - the key attribute that the cloud brought and enabled in what's going on in technology today.
Paul: Yeah, and I think, as you mentioned earlier, I mean, for those people that are aspiring to start their own businesses or even transition over... you know, I spoke to a few pharma companies recently that we've onboarded as customers, and they're exclusively cloud.
Paul: You know, they've got no infrastructure of their own whatsoever. They've got labs, but everything's in the cloud, and they're dispersed around the globe. They work in different laboratories around the world, and they're collaborating. And they're making the best use of the cloud environment, as you said. And if they need to scale up, they can scale up. If they need to scale back down, they can scale down. So that flexibility is there.
Gus: Yeah. It allows businesses to focus on what's core to the business, and not so much on all these ancillary things on the side. So... right. So... yeah.
Paul: I just wanted to come on, because we'd obviously been talking about security on-premise, and some of the reasons why the CIA decided to adopt the cloud. And I think when we think about, let's say, malicious actors, as we call them - maybe there are different terminologies for them, whether they're a state, whether they're individuals - we always think of them as being external to the organization. Somebody from the outside that's trying to get into our environment, trying to, you know, take information from us. But of course, there is always that threat of the person inside, right?
Gus: Yeah. The insider threat, yeah.
Paul: So the insider threat. And one of the things that has come up in the discussion was about the whistleblower incidents that we've had over the years. And what's your opinion and your thoughts around that? I mean, of course, a lot of information came out. Some of it was very, very highly sensitive. But how do you see that from being a former CIA Cybersecurity Director in these things?
Gus: Yeah, so I'll tell you. Two things. One, let's start with what's important, which is the Whistleblower Act in the US, and the ability for whistleblowers to call foul, get listened to, and be followed through via the IG processes. That is absolutely essential. Right? Because it helps to prevent malfeasance on the part of organizations, whether intentional or by well-meaning individuals that have just sort of veered off the path, or whatever it's going to be. So the whistleblower, and the Whistleblower Act itself, and whistleblower protections, are absolutely critical. Okay? But the references you're making, like the Vault 7 things, that wasn't a whistleblower. That was somebody who was intentionally, I think, trying to do damage. Because you don't need to steal all the information and post it out for everybody to see in order to blow the whistle, okay? That's just not how you go about doing things. Right? And so for those people, that's tantamount to somebody going into, you know, Coca-Cola, stealing their secret formula and posting it outside, because they said, "Well, it's unfair that Coke has a market advantage, and I don't think that's fair. It's anti-competitive, so we're going to post their formula." Right? If you think Coke has done something wrong, there are ways you can deal with that. Okay? But you don't have to damage the organization itself in order to do these things. And that's what the stealing and theft of information had done. Right? It was an unnecessary action in order to blow the whistle. And I think it was done for other reasons, not for the putative claims of being a whistleblower. So.
Paul: Yeah. And I think, you know, we talk about these high profile incidents, but you've consulted for companies when you worked with Accenture, and now you're doing your own consultancy work as well. There is always a need to protect that information internally.
Gus: Oh, absolutely!
Paul: And the measures that you put in. So the lessons learnt transfer into business, and even into your own personal life as well - about two-factor authentication, about password protections. All of these different things need to be facilitated. And what I always raise with my customers as well is: "Look, you know, don't think about the external threat all the time. You need to take precautions and measures against the internal threat - how you can classify data, who can get access to it, you know, what the leakage of data could be, what the consequences could be." So, I mean, that threat really does exist there. And choosing your technologies is important.
Gus: Yeah, it really is. I'll tell you that from a cybersecurity perspective, you know. My view is that the adversary, the bad guy, they only want two things. Right? Okay. They want your data. They want to steal it, they want to destroy it. You know, ransomware.
Gus: They want to- whatever it's going to be, you know, that they want to do with it. They want the data. Okay. And they want control. Right? So: I want to implant something into your environment that allows me to take over your networks and your systems when it's to my advantage to do so. Right? That's what the adversaries really want. Okay? Those two sets of actions. And so I really think, as companies and governments and everybody else think about it, we need to think in terms of what I call "data-centric security". In other words: think about how I am going to protect the data first, and work my way out in the scheme of security protections. Not the moat-and-castle model that had typically emerged from a few years ago, where I try to prevent people from getting in. Because once they get in - I call it a "soft, chewy middle" - they can run amok and do everything that they want once they've finally gotten in there. Right? So you have to think about how you build your systems for the future so that they're hardened from the data out. All these other mechanisms to prevent people from getting in are extremely important. But if I haven't got the data hardened on the inside, once they get in... you know. All the major data losses have resulted from that type of environment, and how that was built. So that's what people have to begin to think about. Okay? Data-centric security is where they need to really focus their energies, I think.
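One common way to implement "hardening from the data out" is envelope encryption: every record gets its own data key, and only a wrapped (encrypted) copy of that key is stored next to the ciphertext, so an intruder who reaches the store finds nothing but sealed envelopes. A toy Python sketch follows - the XOR-keystream cipher, record contents, and names are invented for illustration; a real system would use AES-GCM, with the master key held in an HSM or KMS.

```python
import hashlib
import secrets

def _xor_seal(key: bytes, nonce: bytes, data: bytes) -> bytes:
    """Toy symmetric cipher: XOR against a SHA-256 counter-mode keystream.
    Illustrative only -- a real system would use AES-GCM."""
    stream, counter = b"", 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

MASTER_KEY = secrets.token_bytes(32)  # in real life: held in an HSM or KMS, never in code

def seal_record(plaintext: bytes) -> dict:
    """Envelope encryption: each record gets its own data key; only the
    wrapped (encrypted) copy of that key is stored next to the ciphertext."""
    data_key = secrets.token_bytes(32)
    nonce = secrets.token_bytes(16)
    return {
        "nonce": nonce,
        "wrapped_key": _xor_seal(MASTER_KEY, nonce, data_key),
        "ciphertext": _xor_seal(data_key, nonce, plaintext),
    }

def open_record(record: dict) -> bytes:
    """Only a caller who can reach the master key can unwrap the data key."""
    data_key = _xor_seal(MASTER_KEY, record["nonce"], record["wrapped_key"])
    return _xor_seal(data_key, record["nonce"], record["ciphertext"])

record = seal_record(b"patient-42: lab results")
assert open_record(record) == b"patient-42: lab results"
```

The design point is blast radius: breaching the perimeter yields per-record ciphertext, and even recovering one data key exposes one record, not the whole store.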
Paul: Yeah. Yeah. I read an article, I think yesterday or the day before, that the Central Bank of New Zealand had a breach through a third-party file hosting application. And then, potentially, the blame game starts: Was it human error? Was it actually a patch? It's a minefield, right? Who knows?
Gus: Yeah, exactly. Yeah.
Paul: But when these things happen-
Gus: But imagine having hardened your data, so that if that were to occur, it would be much more difficult for anybody to take advantage of the information itself. Right? So if the only data that I can exfiltrate from an organization is encrypted - 'cause that's the only way I can get it out of the organization - okay, great. What use is it, when I don't have the keys and other things like that? You know what I'm saying? So, you know, there is no absolute security. Absolute security is absolutely impossible - it's one of my other favorite sayings. But the reality is that when you think about it, and you design, understanding that the adversary's objective is your data, I think that changes a lot of the approaches that people need to take, and how you build and design systems for the future. I mean, ultimately, this is a lot of what zero trust is all about. Right? It says: "Okay, how can I minimize Paul's access to only these systems and only this amount of data?" So if you were a bad guy, inside or outside, the only damage you can do is here. Okay? You can't do anything else over there. And so those types of models have got to mature and get much tougher. And going back to our encryption discussion: strong encryption is a critical part of data protection. Absolutely, without fail. Okay? Fact. All data should be stored encrypted, and moved encrypted, at all times. Right? So...
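The zero-trust idea Gus describes - minimize each identity's access so that a compromised account, inside or outside, can only do damage "here" - boils down to deny-by-default authorization. A minimal sketch in Python (the users, resources, and grants are made-up examples):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Grant:
    """One explicitly allowed (resource, action) pair."""
    resource: str
    action: str  # e.g. "read" or "write"

# Deny-by-default policy: an identity can touch only what is explicitly
# granted, so a compromised account - insider or outsider - is contained.
POLICY = {
    "paul": {Grant("reports/q4", "read")},
    "etl-service": {Grant("raw-events", "read"), Grant("warehouse", "write")},
}

def is_allowed(user: str, resource: str, action: str) -> bool:
    """Everything not explicitly granted is denied, including unknown users."""
    return Grant(resource, action) in POLICY.get(user, set())

assert is_allowed("paul", "reports/q4", "read")
assert not is_allowed("paul", "reports/q4", "write")      # no write grant
assert not is_allowed("attacker", "warehouse", "write")   # unknown user: denied
```

The design choice worth noting is the direction of the default: the moat-and-castle model denies at the perimeter and allows inside, while zero trust denies everywhere and enumerates the allowed pairs explicitly.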
Paul: Yeah, yeah. And I do know a lot of companies try and put extra measures and controls on data for classification... I mean, it's been around for a long time, but...
Paul: You know, it's a bit of a mix and match.
Gus: It's hard.
Paul: It's hard. It's not easy. You know: how do we classify certain documents so that people can access them without interrupting their workflows? So they don't need to go and get permissions for it, or wait for it to be released, or whatever that may be - these internal processes. How can we make it streamlined and smooth, but still, you know, label the level of security we want for certain levels of classification?
Paul: So... it's certainly a challenge for us, going forward as well. Trying to understand how we can help in that area. But, yeah, it's an interesting field. And I think, just to wrap up, 'cause I think we got like 5 to 10 minutes left.
Gus: Yeah, sure.
Paul: But how have you seen the COVID situation evolve? I know everyone's talking about it, and, okay, there are negative sides. But as we mentioned earlier, there are positive sides coming out of it as well. What's your own feeling about the situation? And how has it impacted you? And what do you look for in the future, in 2021? I mean, you're doing your own consultancy thing. So... how is that working for you?
Gus: Yeah, so I would say that the impacts of COVID have been profound. You know this, right? Everybody does. I think that it's been particularly difficult on healthcare workers, as we know. That's the direct impact. But it's also really, really hard on families with children at home. Right? When they can't go to school, and they can't go to the park, and they can't do this, and they can't do that. You know, it really makes it very difficult for people to both look after the kids in their house and do their jobs, 'cause they can't go to work, and so they all have to work from home, and things like that. So I think that it has been really tough on them. I think there will be some really long-lasting effects from this. You've seen the articles - the Wall Street Journal had a big section on it a couple of weeks ago, talking about: what's the future of commercial real estate? Right?
Gus: So, you know, and: What's the future of work as the result of COVID? So many people have found that: I can do this at home. Like you and I are chatting here on the- you know, on Zoom right now. That- do they really need to be in the office to get their work done? And what has happened is that remote work was already taking off. You know, particularly here in the US, right?
Gus: And I'm assuming in Europe and other places in the world, too. And all this did was accelerate what was a 5-year trend, and compress that down into a one- to two-year trend, right? It just said, "Alright, this is real. And how do we make this work? And how do we- how does technology enable this, you know, to happen?" And so, you'll see some big effects, I think, on real estate, which means you'll have big effects on roads, big effects on, you know, parking. You know. Do we really need all those things like that? You know, those are the things that are going to have some really long-lasting outcomes. I'm hoping we'll get back to school in person everywhere. Simply because I'm not sure remote learning works as well as people would like it to. I think learning is much more effective in a classroom environment. In fact, I think there's a lot of evidence to support that. You know, that perspective. You know, and those things. And then, for me, as we talked earlier before we got kicked off, I'm actually looking forward to being able to travel again. Right?
Gus: Go some place. But travel will be affected. When I was growing up, you had to travel with your passport and your little yellow shot record, right? So whenever you went anywhere around the globe, you had to present your shot record as well as your passport to the immigration authorities, so they could validate that you'd had all the immunizations required by the country you were coming into. And I think you'll see that coming back. In fact, I know that even in Europe, they're already talking about it, right? Instituting that you have to carry proof of whether or not you've been immunized - you know, got your COVID vaccine and that kind of thing. And I think you'll see that just come back to the fore. People want the confidence that when you are traveling, you have all the appropriate and necessary vaccinations. And then that will begin to open up travel. Right? I wish the immunizations were going faster - that would, I think, build a lot of confidence across the board. I'm very worried that too many people are fearful of the immunizations, because, you know, bad news sells much better than no news - no news being: everything is great. The fact that four people, out of how many millions on the entire planet, have had an allergic reaction... but all I hear about is, you know, allergic reactions and things like that. So I'm hopeful that we can get vaccines distributed and people immunized. And then I'm also hopeful that we can leverage this into immunization protection for a lot of other things that previously we were having difficulty dealing with as a society, to help us go from there. So...
Gus: Long-winded answer to your question. But, you know...
Paul: No! It was good! And I- because it gave me a bit of food for thought. To pick something out there, which is- it's been noticeable to me, is that we talked- I just mentioned earlier about pharma companies.
Paul: But I think what COVID's done- it's actually flushed out a lot of deficiencies in healthcare systems around the world. About the way that they handle data, and deal with data, and things like that. And I think here is a chance for them to advance or review the way that they can collaborate, potentially use cloud technology, for example.
Gus: Oh, right! Ideal for this, right? In fact, many of them were doing that precisely to do their analytics. Yeah, so... Yep. And in fact, that's where the power comes in: the intersection of these enormously valuable technologies with solving human problems, right?
Gus: Okay. That's the power of all this technology that's emerged now. Right? So.
Gus: That's important.
Paul: Fantastic. Okay, Gus, thank you very much for coming on the show today! I really appreciate it, you taking the time out.
Gus: Yeah. And Paul, thank you!
Paul: And that is all for today's episode of "under CTRL". You can find links to all our social platforms and to our guest in the episode description. If you like the show, make sure you subscribe and leave a review. Join me again in two weeks' time for the next episode.