Troy Hunt: password security is the name of the game
In the history of data breaches, the headline-making hacks – like the Adobe data breach with 150M compromised records, or the LinkedIn cyber-attack with 100M+ leaked credentials – are just the tip of the iceberg. Every day, huge amounts of data are replicated and circulated on the net, changing hands in online transactions across both the clean and dark web.
Whilst the latter has had a reputation for obscure password trading since the term ‘dark web’ was coined, the clean web is now becoming more and more of a danger. We’ll be elaborating on the reasons as we investigate the importance of password security in this podcast episode.
We’re delighted to have the honour of welcoming Have I Been Pwned founder Troy Hunt (and, in a special guest appearance, our CEO and co-founder Istvan Lam) to under CTRL this week. Enthusiastic traveler, blogger, keynote speaker – Troy wears all of these hats throughout our conversation as he walks us through the site’s origin story, mission statement and the data that powers the now iconic ‘Oh no – pwned!’ search result.
Here’s an overview of what to expect:
- The story of Collections #1 to #5, the aggregated lists of email address and password pairs from previous data breaches; and why Troy believes that credential stuffing is today’s biggest threat
- An explanation of encryption keys from Istvan Lam
- What Troy describes as the ‘perfect storm’ responsible for the rise in the number and severity of data breaches
- An overview of the lifecycle of data breaches and the rationale behind keeping them public or private
- Advice on whether to pay out ransoms, and where the responsibility for protecting customers’ data lies
If you’ve enjoyed listening to this episode, check out our previous episode with security as a service provider White Shark – and stay tuned for more under CTRL podcasts on Spotify. You can also stay connected with all things Tresorit through our official Twitter and LinkedIn channels.
Paul: Hi everyone! Welcome to the 15th episode of "under CTRL". I'm Paul Bartlett, and on today's special episode I'll talk with Troy Hunt, the founder of Have I Been Pwned, and also István Lám, CEO of Tresorit and a cryptography expert. We will discuss how the data breach aggregation site Have I Been Pwned gives insight into the magnitude of data breaches; how hackers trade passwords on both the dark and clean web; and what measures we can take to prevent personal data theft. Stay with me to uncover the answers in this episode. Good morning, Troy! Good morning, István!
Troy: Good day, guys! Good evening, I think I can call it here now.
Paul: Yeah, good evening! Certainly! Welcome to the show. And István: Good morning to you then! Welcome to the Tresorit podcast.
István: Thank you! Good morning!
Paul: Okay. Thanks a lot for joining the show this morning! It's great to have you both on here. And we certainly got some interesting topics to cover. I think first of all, what I'd like to do is obviously ask you both to give a short introduction about yourselves, and some background also. It's always interesting to hear the stories about how you got involved in security and software. Troy, over to you.
Troy: Alright. So... background. I started building software for the internet, let me see now, probably 1995, early days of the web. Went and had a whole bunch of dotcom work. Worked for a big multinational, figured that I don't like working for big multinationals. Realized a lot of people were writing really bad software for big multinationals. And then got really interested in security as a result of that. And went on to being an independent person. I used to travel around the world and do training and conference talks. Now I just do training and conference talks, and I run Have I Been Pwned as well.
Paul: Right. Okay. Great. István?
István: I've been involved in security from the cryptography side. My background: I'm a cryptography engineer by education, and my thesis at university was about cloud encryption. That later evolved and became a start-up called Tresorit. Now I am the CEO of Tresorit; I have been an engineer and am doing mostly business, but I'm still deeply involved in the crypto design to this day as well.
Paul: Okay. So we're going to cover some of those topics today. So first of all, as I've got you both on the show, I'm sure you've got a lot to add about the security aspect of what you've seen and what you've experienced, where the flaws are. And the first topic I wanted to kick off ourselves with is about Have You Been Pwned. So Troy, maybe you can tell us a little bit about this web page, and a little bit more why you brought it to light? What created it for you? What was the inspiration?
Troy: Well, the first thing I'm doing is, I'm just checking if anyone has registered haveyoubeenpwned.com, 'cause that's not the right website. Yes, I registered it, cool! And it redirects to haveibeenpwned.com, which is the correct website. Thank you.
Paul: Right, okay. Thank you.
Troy: And I knew there was a reason for squatting on all those domains. So... Have I Been Pwned - oh, I'm sitting on Have I Been Prawned as well, just in case.
Troy: Because that happens every now and then. And also Have I Been Porned, which I believe is a totally different thing. But anyway... that goes to this website, too.
Troy: So... Have I Been Pwned is data breach aggregat- I could keep going, there's so many of them. It's a data breach aggregation service that takes data that has been breached out of somewhere - usually data which then starts circulating around the web - aggregates it into one place and makes it searchable. And in simple terms, it's just designed for people to figure out where they've been pwned in a data breach.
Paul: Okay. Okay. And when did you start this? How long ago- was it- you know, how long has it been active?
Troy: Well, I'm actually just about to hit the seventh birthday.
Troy: So I think it was early December 2013.
Paul: And your motivations? What did you see when you first started with- was it something where you just saw something which was blatantly obvious, where you needed to bring this web page to fruition?
Troy: Well, the catalyst was the Adobe data breach. So I mean, leading up to that, there'd been a bunch of different data breaches.
Troy: I was writing blog posts on such insightful things as "Isn't it interesting how the same people who appear in multiple data breaches have the same password?" Which when you look- I mean, I thought it was interesting. I'd take things like the Gawker data breach and compare it to the Stratfor data breach and, you know, there's an amazing amount of crossover there. And... look, when Adobe hit, when it was like 150 million plus records, and of a more personal nature to me, it was two of my own records. It was both my work account and my personal account. And then the thing that really struck me with Adobe was: It was not an organization I knowingly had a relationship with. And the penny dropped later on that because I was a big Macromedia Dreamweaver user back in the day, I had a Macromedia account. And of course, Adobe acquired Macromedia, and my data then flowed to somewhere else that I'd never expected it to appear.
Paul: Right. That sounds familiar these days. So... is it a solution? Or is it just a source of information? The web page, when you look at it - I suppose you enter your credentials. From what I can see, there's a search bar there. And what can it tell you?
Troy: Well, you enter your email address. And what it will tell you, based on your email address, is the places that I have seen your data circulating.
Troy: Now, is it a solution? Depends how you look at it. There will be data breaches that are not in there, but you are probably in. Just by virtue of the fact that I can only index what I can obtain. And I can only obtain what we know of. I mean, there are so many different incidents that have happened that we simply don't know of.
Troy: So from an informational perspective, it's part of the picture. But I think the more important thing is: what do we want to happen as a result of this? So, for example, I would like people to change behaviors and use a password manager. Then this becomes part of the solution. This is why I've literally got the 1Password password manager sitting on the front page here. I'd like people to have that penny-drop moment, where they go, "Ah! My password has been exposed in..." - what's up here on the top somewhere, let's say the Chowbus data breach. Actually, I don't even know if they have passwords. Anyway - one of the websites that has passwords. "And now, as a result of that, I am more conscious of my security, and my solution is that I'm going to get a password manager and have strong unique passwords everywhere." So that's what I'd really like to see.
Paul: Okay. I'm just going to bring István in here, about passwords. From a cryptographer's perspective, what is it that you also see, or have seen in your time, around user behavior - the lack of security, or just the way that people tend to use common passwords? Where's the issue here?
István: There are two issues. One issue is - I think Troy knows it very well - that even if people choose strong passwords, they reuse them. As a human being, you cannot really remember hundreds of strong passwords. Just face it. And if you're reusing a password, with data breaches happening everywhere, only one system needs to be breached, and then all your accounts will be breached. Because that breach will most probably leak your password, and then it goes out to the world. The second very important aspect is the weakness of the passwords themselves: how many guesses you need to make, randomly, to find out that password. When it comes to encryption, you may have heard of 256 and 128 bit encryption. That basically refers to the number of possible keys used for encryption: two to the power of 128. But when it comes to passwords, the number of potential variations is way, way, way lower. It's so much lower that, for instance - let's say you have a human hair. That's the human-generated password, and the strong encryption key is the diameter of two million universes. So you have one universe, which is big compared to the human hair, but two million of them is even bigger. That's the ratio between the potentially good keys - it's keys, not passwords - and a human-generated password. So cryptographically speaking, a human password is weak. I mean, really, really weak.
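István's hair-and-universes analogy can be made concrete with a quick back-of-the-envelope calculation. This is a sketch assuming the *best* case for a human: a 10-character password chosen uniformly at random from the 95 printable ASCII characters; real human-chosen passwords are far less random than that.

```python
import math

# Best-case human password: 10 characters, each drawn at random
# from the 95 printable ASCII characters.
human_best_case = math.log2(95 ** 10)  # entropy in bits, roughly 65.7

aes_key = 128  # bits in an AES-128 key

# Every extra bit doubles the search space, so the gap between the
# two keyspaces is 2 to the power of the difference in bits.
gap = 2 ** (aes_key - human_best_case)
print(f"human password (best case): {human_best_case:.1f} bits")
print(f"AES-128 keyspace is about {gap:.1e} times larger")
```

Even under this generous assumption, a 128-bit key's search space is billions of billions of times larger than the password's, and a memorable password based on names and door numbers sits far below even that.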
Paul: Okay. Good. And so I suppose my question to both of you then, because I am just one of these frontline users, and I'm guilty of using the same password - or have been guilty of using the same password. Since I came to Tresorit, things have changed, and we use password managers and things like that. But in the past, I've been that person out there. And it's just easy to remember, right? It's probably a combination of things around me: door numbers, parents' names or kids' names, whatever that may be. So what is it that- and I think before we started the show, István, you mentioned the dark web that's out there. What is it that people are doing? How is it working? I want to try to understand more about what these hackers are up to. And maybe Troy can jump in here as well, because he also understands a little bit more about where these sources are coming from. What are they doing? What technologies are they using to break the passwords?
Troy: I think the thing that's most concerning at the moment - it's a sort of two-part answer - is... I'm less concerned about people brute-forcing individual passwords, and more concerned about the mass usage of credential stuffing lists with combinations of username and password. Credential stuffing is just something which is massively rampant at the moment, due to the mass ubiquity of username-password pairs from previous data breaches. I actually had to do a presentation only a few hours ago, where I showed Collections 1 through 5. In January last year, what was referred to as Collection 1 began circulating very broadly: 1.1 billion unique combinations of email address and password pairs, including my email address and a terrible password which I had used in the past.
Troy: Now inevitably, these were taken from data breaches where they were either stored in plain text, or weakly hashed and then cracked. Collection 1 was then followed by 2, 3, 4 and 5, and Collection 1's 1.1 billion records turned out to be only about 15% of the whole lot. So what concerns me most is that people take this massive list, and then, due to the prevalence of password reuse, they simply throw those credential pairs at authentication pages or authentication APIs behind apps, and see which ones work. And honestly, I chuckle a little bit when I hear people talking about the dark web. People are so worried about the dark web, and I sort of say to them, "You know what's even worse than the dark web? It's the clear web." Because so much of this data is sitting there on the clear web, really easily accessible. In fact, the talk earlier today showed a tweet with Collections 1 through 5, with links to Mega.nz accounts. Mega is a great file sharing platform, but it makes it very, very easy for someone to redistribute very large amounts of data really quickly. And I say the clear web is much worse than the dark web because of how quickly and prevalently data will spread once it's shared. So that's what I worry about most. That's what I think the biggest threat we're facing with passwords is at the moment.
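One practical defense against exactly these credential stuffing lists is Have I Been Pwned's public Pwned Passwords range API, which uses k-anonymity: you SHA-1 hash the password locally and send only the first five hex characters, so the password itself never leaves your machine. A minimal client-side sketch in Python (the sample response in the comments is illustrative; the actual network call is left out):

```python
import hashlib

def hash_split(password: str) -> tuple:
    """SHA-1 the password and split the hex digest for a k-anonymity
    range query: only the 5-character prefix is ever sent to the API."""
    digest = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    return digest[:5], digest[5:]

def breach_count(suffix: str, range_response: str) -> int:
    """Scan the 'SUFFIX:COUNT' lines returned by
    GET https://api.pwnedpasswords.com/range/<prefix> for our suffix."""
    for line in range_response.splitlines():
        candidate, _, count = line.partition(":")
        if candidate == suffix:
            return int(count)
    return 0  # suffix absent: password not in the corpus

prefix, suffix = hash_split("password")
print(prefix)  # 5BAA6 - the only thing that would be sent over the wire
```

The server returns hundreds of suffixes for any prefix, so it learns nothing about which password was checked; the match happens entirely on the client.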
István: Yeah, I mean, I think the important thing to highlight here: yes, you are talking about the clear web and the dark web. But what I realize when talking to people who are not in security is that they don't know about the ecosystem behind these breaches. There is a value to one's accounts. And Troy, you collected these databases and you use them for good. But there are people out there who want to use them for bad. And how much effort was it to get your databases? Like an hour of searching and nothing more?
Troy: It's easier now, 'cause I don't do anything. Like people just pop up and send it to me. So... in the very early days, like I had to go out and find data. Like the really, really early days, as in the first few weeks. And then after that, most of the time, people pop up. And what blows me away, like I'm looking over here at my email at the moment, there are so many inbound emails where people are like, "Here's a hundred thousand accounts." Here's someone- here: "Here's 200 GB worth of data on Mega." And in fact interestingly, that 200 GB worth of data, which came through only earlier today, is again all of these credential stuffing lists. So someone's like, "Here's 200 GB of email address and password pairs." Which is just- it still boggles my mind no matter how much I see it.
István: But what is the source? I mean, yes, you got it in the email. But what's your guess how that person, who sent it to you, got it? I'm pretty sure that's not the person who br- who hacked into the system.
Troy: Yeah. So my guess is that they have got a friend, a "friend" - I'm air-quoting for people who are listening - who's passed it on, or they have aggregated it themselves from different sources. One of the things that I actually find really fascinating - and I'd love someone who's a psychologist or something like that, someone with an insight into people's minds, to explain it - is how much data spreads just from people who want to hoard it. And I see this all the time, when people are like, you know, "Here's a directory listing of all of the data breaches I have." And you sort of go, "Well, why do you have this?" Very often it's either children or very young adults; they're not necessarily monetizing it, they're not necessarily doing anything with it. They collect it. In fact, sometimes I've asked people, and they're like, "Well, you know, you never know when one of my friends is going to be in there, and I want to target them or something like that." It's just fascinating, the way this data replicates. I've likened it in the past to kids with baseball cards - maybe kids in America with baseball cards, you know, swapping them with each other. But unlike a baseball card, where when you swap a card you no longer have the original one, when you swap data, it replicates, and you just get more and more copies of the same thing.
István: And you mention monetizing. What I think is important here is that nowadays, if you want to make someone's life harder, it's not like you need to hire a hacker. You can just buy the data. And there's not a big price tag on an account, right?
Troy: Yeah. And look, I mean, we have seen multiple sites very similar to Have I Been Pwned in nature, insofar as you enter an email address and then you get results for data breaches. But they have gone so much further, where they've said, "Hey, you know, you just search for, I don't know, email@example.com. There is a result. Hey! Like 3 dollars fifty cents or something like that, and then you'll get all of the data about firstname.lastname@example.org." So we had services like LeakedSource a few years ago. That got shut down. I kind of laughed when it did - well, I laughed for many reasons, but one of the reasons is that the guy who was running it... now, you would think that if you're running what is essentially an identity theft trading service, you'd try to fly under the radar. And of course, once the guy got caught and was all over the tech articles - the dude is like twenty-four years old, driving around in a bright green convertible Lamborghini. So if you're out there thinking you're going to run a massive identity theft trading scheme: don't drive around in a green Lamborghini. Worst idea. So there was LeakedSource. And only last year, I think, WeLeakInfo got taken down as well. And I think the key term in both of those being "leak" is probably a bit of an indication of their morals around how they handled this data.
Paul: Yeah. And I just wanted to come on to that, because you testified in front of Congress, I think three years ago, in 2017, where you said - where's the quote - that "we created the perfect storm". Could you elaborate on that a little more? What do you think has caused this perfect storm? (inaudible)
Troy: Yeah, well... it's really the factors, right? The different factors that contribute to the spread of data breaches. Now, some of them are very simple. There are more people on the internet - more people out there to begin with. There are more websites, so that's a problem as well. And then we're collecting lots of data that we never had digitized before. I've written in the past about things like kids' tracking watches - applications that will track your children as they walk around with a GPS-enabled watch. And in a very self-fulfilling prophecy, only last year, my daughter's tracking watch, which I only got in order to point out the flaws in it, had her account data leaked due to a vulnerability in their application. So we're collecting data like that. My washing machine is connected, so my washing cycles, which are not deeply personal to me, if I'm honest, are sitting somewhere in Samsung's cloud. There are other devices connected to the internet now that are deeply personal, and we won't go into those details here - use your imagination! So the point is, we're collecting all of these classes of data, and we're exposing it through a lot of different devices that just never had internet connectivity before. And then we've got cloud, which is fantastic, because you can do so much stuff with cloud, and you can do it so quickly and cheaply. And you can screw stuff up so quickly and cheaply as well. It's great for democratizing access to compute services, but it also makes it extremely easy to have data breaches. And then the last thing, to make it even worse, is that we want to interact more and more between different services and share more and more data. How many applications do you have that want to access your contacts?
Or get access to your inbox in order to connect you with friends on a social network? And then another layer on top of that is the whole social layer: look at people like, say, my son, who is eleven years old and getting much more online and interested in that sort of thing. He's never known a time where you don't extensively share personal information. So the social norms are changing as well. And that, to me, is what creates a perfect storm.
Paul: István? Do you have any feelings about that, too?
István: Sure. I mean, we are collecting so much data and putting it in one place, or a few places - these data centers. They're usually homogeneous systems, the same type of systems. And basically, how I see this: homogeneous systems are like collecting a lot of wood, putting it in one place, and pouring petrol on top of it. It will not light up by itself, for sure, but there's now a very big chance that at some point lightning will strike and light it up. Because it's fragile. Really fragile. Troy mentioned cloud - before that, different providers built their systems up in separate ways.
István: But when it comes to, you know, Facebook, with 2 billion users' data - yes, they have a huge budget for security, and they're trying to protect those 2 billion data records. But at the same time, they're good prey. They are a good target. Just imagine these large datasets: they're worth hacking, worth investing more into hacking. And beyond the accidental leaks and the opportunistic hacking that data breaches are usually associated with, there is willfulness in this, and big budgets behind these breaches - people attacking certain services just to get data, just to get that information. And they may not put it on the web, because they invested so much to get it out. So when we're talking about the known data breaches, this is really just the tip of the iceberg, because in the background there are big hacks happening all the time, and they're not on the news. The data is not uploaded to the clear web, or the dark web either. There's someone who wants that data, and no one wants to talk about it.
Paul: Yeah, I just want to bring up a point on that, because yesterday I read an article about what was, unfortunately, I think, a clinical data breach. It involved therapy patients, with records going back as far as when they were kids - information collected by hand but then uploaded into a database, obviously not in a very secure format. And now somebody's got this data and is starting to blackmail those individuals, asking for Bitcoin payments. But the interesting part for me was what you just mentioned: these breaches supposedly happened two or three years before. This data was being siphoned off. When we think about data breaches, we assume that it's something that happened quite recently, right? But in fact, it happened years before. The actual realization that there's been a data breach can take so long. Why is that? Why do you both think it takes so long for people to recognize that they've had a data breach? Or that they've lost information - especially personal information.
Troy: You've got to remember: this is not at all uncommon. The LinkedIn data breach happened in 2012 and took until 2016 to come to light. Same with Dropbox as well. MySpace was even longer than that - in fact, we're not even entirely sure when the MySpace one actually happened. We know it came out in about 2016. So this isn't unusual. I think the interesting question is, first of all: what is the rationale for keeping it undisclosed? And then: what is the rationale for then leaking it publicly? Now, in some cases, inevitably, the data within those breaches is being exploited. And it is more exploitable whilst the organization is unaware of the incident, and whilst the individuals are unaware of it. That LinkedIn incident had passwords stored as SHA1 hashes, and 99% of those have already been cracked. So that's a really rich dataset of usable credential pairs - not necessarily for LinkedIn, but for all sorts of other places. Now, I would imagine that whoever had that data... in fact, the guy has been picked up, which is good news; I think he got extradited recently. That does provide a long window of opportunity to exploit the users in there. So that would be a good explanation for keeping it private. The question then, in terms of why make it public: well, it was originally sold on the "dark web" - I'm air-quoting this time. It was monetized. So there was a reason to cash in there. Maybe it didn't provide enough value whilst being held privately anymore, and making it public managed to get the guys some Bitcoin. And then, of course, once it goes public, eventually it starts circulating, and the value of it goes down to zero, which is where it is now. So that could be an explanation for this particular situation here as well.
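Troy's point about the LinkedIn hashes being "already cracked" is worth unpacking: the breach used a fast hash with no salt, so an attacker can hash a wordlist once and look up every user at the same time, and every user who chose the same password produces the same hash. A minimal sketch of such a dictionary attack (the leaked table and wordlist here are invented for illustration):

```python
import hashlib

# Hypothetical leaked table of unsalted SHA-1 hashes, stored the way
# the LinkedIn breach stored them: one bare hash per account.
leaked_hashes = {
    hashlib.sha1(pw.encode()).hexdigest() for pw in ("linkedin", "123456")
}

# Dictionary attack: hash each candidate once and look it up.
# With no salt, one pass over the wordlist covers every account at once.
wordlist = ["password", "123456", "qwerty", "linkedin"]
cracked = [
    pw for pw in wordlist
    if hashlib.sha1(pw.encode()).hexdigest() in leaked_hashes
]
print(cracked)  # ['123456', 'linkedin']
```

A salted, deliberately slow scheme (bcrypt, scrypt, Argon2) defeats exactly this: the salt forces a separate attack per account, and the slowness makes each guess expensive.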
István: What I think is simply this - I can only guess - but someone who did the hacking sold it wholesale to someone else. And that someone else, as Troy mentioned, tried to sell it on. So the trading happens, keeping it private, keeping it behind closed doors. And at some point the people who had it wanted to monetize it on a wider scale. They were just hungry for more money.
István: And when that happened, it came out. It's a risk: if you try to sell it in bulk, you don't get as high a price. If you sell it in small pieces, you get a higher price for the data, but the chance that someone notices and notifies other people is higher. So they went down that road, most probably, here. And I think this underpins my previous statement that most data breaches are still uncovered - we don't know about them, because they don't come to light, precisely because of this sort of retail selling of data.
Troy: I mean, one thing that might be relevant in this case - and as I was saying before, I've got an inbox full of data breaches. A couple of different people have emailed me this one. So the one we're talking about, I believe, it's this Finnish psychotherapy center called Vastaamo.
Troy: Now, at the moment, this is available on a Tor hidden service, where the folks here are saying, "We have hacked the psychotherapy clinic Vastaamo and taken tens of thousands of patient records, including extremely sensitive session notes and social security numbers. We requested a small payment of 40 Bitcoins..." - which I've just checked, and it's about 524,000 US Dollars at the moment - and then, in brackets, "(nothing for a company with yearly revenues close to 20 million Euros). But the CEO has stopped responding to our emails. We are now starting to gradually release the patient records, 100 entries a day." And then there they are.
Troy: So inevitably, in this case, it appears to be financially motivated. Now, who knows how long they spent trying to get their 40 Bitcoin before the CEO stopped replying. And yeah, I guess now they're hoping for money that I don't think they're going to get if they just dump all the data either.
Paul: Yeah. But it's quite distressing for those patients involved, because they put a lot of trust in that organization to protect that data. And what I'm seeing here from the report is that, as you mentioned, it was 40 Bitcoin, which the company supposedly refused to pay. And then the attackers approached the individuals concerned, asking for 200 Euros - or the 180 Pounds equivalent - in Bitcoin to stop that information being... So it seems that we're very much in a cycle now where this blackmailing is more predominantly visible - whether it's ransomware attacks encrypting data, or malicious actors getting data and then releasing it.
Troy: And it's not unprecedented, alright. We saw this with Ashley Madison. With individuals who are members of that service being blackmailed. And it's- look, I mean, we know from Ashley Madison, too, that it led to suicide. So it's just- it's enormously sad and serious, and it's... you can only imagine that if those folks do get caught, it's probably not going to end up very well for them. You'd hope not anyway.
Paul: Yeah. And that leads me to ask: exposing data by design is something that you mentioned, but there's also a lack of accountability - you've mentioned this before, Troy. Where does that accountability lie? Is it with the organization, which should be responsible for making sure that information is secured? Should they pay the ransom? How do you feel about that? Maybe István as well.
Troy: Well, I think there are multiple ways of looking at it. So... what can the organization do? The data has already been leaked. It's a little bit of that old analogy: trying to remove your data from the web is like trying to remove pee from a pool. That's not going to happen. The other discussion, of course, is: do we pay ransoms? That's a discussion happening many, many times over now as it relates to ransomware. And it's getting particularly interesting, because we're now seeing ransomware crews like Evil Corp on the US sanctions list. If you pay them, you're going to violate US sanctions, and then you have to answer to the Treasury as to why you've violated sanctions. So is it even possible to pay ransoms? In the presentation I gave today, there was someone on the call from a healthcare organization who said, "Look, we literally, legally cannot pay the ransom, because we're using public funds, and there is a legal restriction on our ability to pay any ransom with public funds. So effectively, if our healthcare company gets hit - too bad. No matter how much the ROI makes sense, we can't pay it." So it is becoming a particularly fascinating area.
Paul: Yeah. István?
István: Well, it's certainly a question whether, for instance, the CEO of a company should pay those 40 Bitcoins from personal funds rather than from the company or organizational budget. Because at the end of the day, who is accountable is the organization and the management of that organization. They decided not to invest enough, or not to look into data breaches. And, you know, New York State introduced the data fiduciary concept in its regulation: they are data fiduciaries. They have a fiduciary duty to protect the data that someone handed over to them. And if they cannot do that, well, they're not doing their job. So for me: you need to protect information by all means, especially when it comes to very, very sensitive data like psychiatric cases. And paying a ransom to prevent a leak is a price you need to pay. But obviously the question is not just: can you pay the ransom? Because there's terrorist funding and money laundering, all these questions. It's also: okay, you paid - but what is the guarantee that they won't leak that information? What is the guarantee that they will keep their word? I mean, these are not respectable businessmen to begin with. So...
Paul: Yeah. Do you think that companies and organizations are in a very difficult place right now? Here in GDPR-land, they've got GDPR, so there's a potential risk of fines if they're not following certain procedures for dealing with data properly - storing data and handling data. And maybe in Australia - I don't know if you've got the same thing? I know that the US is starting to bring in certain data laws as well. And I think last year we had a bit of an issue for our organization in dealing with Australian companies, because there was some regulation change over there as well. How do you think regulation can help? Or is there basically not enough at the moment to support or prevent these data leaks from happening?
Troy: It's a very odd time where, for something so common and so universal - personal data - let's say my name and my birthday, and then, to make it interesting, my sexuality as well - it's all very personal to me.
Troy: And if I live in the UK, I get a very different level of protection than if I live in Australia. And then I get a very different level of protection if I live in China. Yet we're all using many of the same online systems. Less so the Chinese, but certainly the Brits and us. And I sort of lament the fact that I'll see a data breach - and let's say it doesn't matter where the breach is from. We'll see a data breach, and a whole bunch of European people will pop up who, for all intents and purposes, are equal to me. And they say, "This is it! I'm going to get all this protection from GDPR, because I'm European." And I'm like, "Well, what about me?" I have the same rights. So we have some protections in Australia. We got our Notifiable Data Breaches scheme a couple of years ago, which is our first iteration of mandatory disclosure laws. But there are fundamentally different levels of provision. Three things are fundamentally different to GDPR. Number one: rather than it being 72 hours before you have to report to a regulator, it's a month. Literally ten times longer. Number two: if your company had revenues of less than - I can't remember if it was 3 or 3.5 million dollars a year; basically a number that more than 90% of companies fall beneath - you don't have to report, because that might be hard work for the company. So the organization responsible for the data loss doesn't have to report it if it's anywhere near a normal-sized company. And number three: if the breach is unlikely to cause serious harm to the individual, you don't have to report it, regardless of your size. So you can be a great big multinational, but you have a simple form somewhere, and you say, "Well, it's just a form to comment on our cats; that's not potentially damaging. We don't have to disclose that."
Never mind that the passwords people are using there are the same ones they'll use on their social media and their other very personal things. So we have massive variance across different parts of the world. And then, I guess, the thing that makes GDPR particularly odd is the assumption that it should apply to organizations operating in other parts of the world, because they may have the data of EU data subjects in there. And that other part of the world is then also subject to all these other regulations, too. So I do fear that we're perhaps drowning a little bit in regulation, which, to date, doesn't seem to have had a fundamental impact on the data breach landscape anyway.
István: Well, there are a few regulations going in the opposite direction, like the Australian Assistance and Access Act. It's a recent one - or not that recent, two years ago. It's usually quoted as an "anti-encryption law", because it undermines strong encryption. And in the EU recently, there are rumours that people are pushing to ban strong encryption, or want a backdoor in encryption. So at the same time, there is a sort of lobbying to weaken encryption, which is the bedrock of strong security online. But I do agree, Troy, that we're drowning in regulation. Though I'm a big fan of GDPR - at least the principles it introduced. Yes, for the practicalities, how it's implemented and how it comes to life, we need to wait a few more years. But I do believe that putting the liability - and there was no real liability before - on companies, and basically on their shareholders in a way, because the fees or penalties are connected to revenue, which hurts the shareholders... then what you will see is more investment in prevention. More investment in cybersecurity. Because there is a bigger and bigger price tag on the damages of data breaches - in the form of penalties, but also in the form of lost trust, because you need to publish it. That will, you know, make security budgets bigger. And what is important to know is that when you do security, you need to invest in it. It won't happen without money. I mean, just pouring money in doesn't help either - you need to do it in a smart way, you need to build the security program up.
But if there is no budget and no attention inside an organization for such a crucial task as protecting customers' data, then more data breaches will happen.
Paul: So I think that leads to the point I wanted to come onto, around education. Obviously we're becoming more aware of what's happening out there, and people are starting to take it more seriously - maybe also here in GDPR-land, because of GDPR being implemented. But what direction do you both see going forward? What do you think some positive solutions could be, on top of what we already have? I mean, we see a lot of breaches. We've also got, as you mentioned, password managers to put our passwords in, so we're getting help with our passwords. And I see things like - last night I was watching a documentary about China and the US, and the race for AI, artificial intelligence, and facial recognition. So is there potentially more security further down the line, as AI develops, for example, to help us keep things secure? Or are there potentially more threats to our data out there, with quantum computing, for example?
Troy: I think it's one of those ones where it almost feels like I'm walking around the RSA conference - it's all AI and blockchain and quantum computing. I was at InfoSec in London in the middle of last year, and I remember seeing a stand that was selling "next-gen blockchain". What even is that? So I'm a little bit conscious there are a lot of buzzwords around this. Look, I think there are some roles for these technologies to play - inevitably not as much, to date at least, as we've seen prophesied. I think that, going back to your point about education... one of the challenges is: How do we educate several billion internet-connected people about online security? About things like passwords? And this is a massively hard challenge. Now, I've got a blog post I'm hopefully publishing in the next day or so about people being able to recognize URLs - and, effectively, how bad we all are at reading URLs and being able to figure out whether we can trust them or not. This is not something we can educate away. I can't tell from the URL whether it's bad or not, particularly once we have homoglyphs and homograph attacks and that sort of thing. The sort of thing we can practically do, which I think will help our privacy online, is just being more conscious about the information we share. An example I always give here is that something like your date of birth is knowledge-based authentication. This is what I went to Congress to talk about. It's static knowledge-based authentication as well - you can't change it. Yet it's regularly used as a piece of identity verification data. Don't hand that out to catforum.com. And I don't say that as a joke. Literally, if you go and register at catforum.com - I checked this, just because I wanted to pick the most inane site I could think of - catforum.com will ask you for your date of birth when you register.
Now, not providing that data removes the risk of the website losing that data. This is one of the principles of GDPR: data minimization. Let's not have data we don't need; let's not give organizations data we don't need to give them. I'd like to see organizations being more responsible about minimization, and then us as individuals also being more responsible about minimizing the data that we willingly provide.
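[Editor's note: Troy's point about homoglyphs can be made concrete in a few lines of Python. The domain below is hypothetical - it swaps the Latin "a" for the visually near-identical Cyrillic "а" (U+0430) - and encoding it to IDNA/punycode is one way to expose the substitution, since any label containing non-ASCII characters gains an "xn--" prefix.]

```python
# A hypothetical lookalike domain: the first "а" is the Cyrillic letter
# U+0430, not the Latin "a", so the two strings render almost identically
# in many fonts while being different sequences of code points.
spoofed = "\u0430pple.com"
genuine = "apple.com"

print(spoofed == genuine)       # False - they only *look* the same
print(spoofed.encode("idna"))   # the spoofed label gains an "xn--" prefix
print(genuine.encode("idna"))   # pure-ASCII labels pass through unchanged
```

Browsers use a similar idea: when a domain mixes scripts suspiciously, many of them fall back to displaying the raw punycode form so the lookalike is visible to the user.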
Paul: Because there is an attitude of maximization, as you've previously mentioned, right? Organizations want to maximize the data they can collect. And when we go online and, let's say, we want a book or some kind of information, sometimes the forms we have to fill out are pretty in-depth about what they require before we get what we want.
Troy: Well, when I announced that I was going to do that talk at Congress, I sort of crowdsourced some ideas, and someone gave me a really great statement, which I actually put in my testimony, and which relates to what you just said. The statement was that organizations look at our data as an asset, and they never look at it as a liability. Now, if it's an asset, they want as much of it as they can possibly get. If it's a liability, they want as little of it as they can possibly get. So we've really got to try and shift that mindset from asset to liability.
István: And I think that's a great analogy. But on education - and this is the reason why I really like the Have I Been Pwned website (I'm sorry for my accent, Troy) - that website makes it real. When I showed it to my wife, for instance, she typed her email address in and then: "Oh, I have been hacked. Why?" It makes it real. So I think it's important that people demand accountability. Yes, regulations put accountability on someone, but the very person whose data is at risk should demand higher standards. And that can only come from education. Because if people are not demanding that services invest in cybersecurity, if people will not drop a service immediately after they realize it's not doing a good job of protecting their data, then nothing will happen. So it all comes down to educating private individuals - the ones who just want to watch cat videos - to, yeah... use a fake birthday or something. Or ask yourself, "Why does this website want this from me?" And this type of realization is not necessarily education; it's more that people need to realize that they are at risk. That's a big and important way of prevention.
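[Editor's note: beyond the email search István mentions, Have I Been Pwned also offers a Pwned Passwords API built on k-anonymity: only the first five characters of a password's SHA-1 hash are sent to the server, which returns every breached-hash suffix sharing that prefix for local comparison. The sketch below only constructs the query - it makes no network call - and the password is, of course, a made-up example.]

```python
import hashlib

# Hypothetical password to check. Under the k-anonymity model, neither the
# password nor its full hash ever leaves your machine.
password = "P@ssw0rd"

# SHA-1 the password, then split the 40-character hex digest:
# the 5-character prefix is sent; the 35-character suffix stays local.
digest = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
prefix, suffix = digest[:5], digest[5:]

query_url = f"https://api.pwnedpasswords.com/range/{prefix}"
print(query_url)
# You'd fetch that URL and check locally whether `suffix`
# appears among the returned hash suffixes.
```

The server only ever learns a 5-character prefix shared by hundreds of unrelated hashes, so it cannot tell which password was being checked.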
Paul: Yeah. That might explain why some of the forms I get in my lead pipeline tend to have fictitious names on them - especially if they're CISOs, etc. I just wanted to finish off, 'cause we've got five minutes left before we have to wrap up the podcast, and I'm sure you've got other things you need to get on with your day. But the COVID situation, of course - it can't be escaped. What are the risks now, with people working remotely? If you're working for an organization from home, is there a case that people need to be more security-conscious? Is the risk higher, just because of this situation? Has COVID brought that awareness forward? I mean, I know a lot of people have been working remotely for quite a long time, but certainly in Europe we're a little bit more conservative - we have to go to the office. What do you see with the whole COVID situation?
Troy: I think it's a combination of things. The risks are higher by virtue of having more people doing work things outside the workplace, without the time to properly prepare for it. We've seen everything from Zoom meetings being Zoom-bombed - people inevitably listening in on things they shouldn't have - through to the risks of things just simply being different. And a good example of the latter: we had a short period there where our kids needed to work from home. My son's sitting in his bedroom on his laptop on one of the first days, and I was quite impressed to see all the other boys in his class - he's at a boys' school - also sitting there on their laptops. I thought it was all very organized. And then one of them was obviously sitting in the living room, and his dad walked behind him on a phone call, having a business meeting, with clearly audible information. And I'm like: all these kids, and who knows how many of their parents, are sitting there listening to this business call. So because of the way it changes how we work, it's not just a question of having more direct risks online, in terms of entering our credentials into a phishing page or something like that - it changes a whole bunch of other environmental factors as well. And I think there are all sorts of aspects of being a sudden, unplanned remote workforce that we're yet to see the full effect of.
István: Just another reaction from me on the dad walking around on a phone call. When I'm working from home, my working room is close to my baby's room, and my wife usually leaves the baby monitor turned on in there. When the baby is not there and the door is open, the baby monitor is basically broadcasting my discussions into the air. So I need to keep switching it off or, you know, closing the door, so as not to broadcast potentially confidential information when I'm in a meeting. That's the environment when you are at home - it can easily happen.
Paul: And with the COVID situation as well, I can already see some companies saying, "We're not going to go back to the office." There are some companies that have already come forward with a permanent work-from-home arrangement. Again, is that something where you feel that, hopefully, once we get over this COVID situation, people will be more educated about the security aspects - not just the people themselves, but the organizations as well? Because we're putting a lot of trust in people's own infrastructure, right? When I work from home, I've got to have the right wifi, I've got to have the right security settings. At Tresorit, we've gone through a security training seminar for working from home. But I think there are probably still a lot of cases out there where people just aren't aware of the risks in their own home that they could be confronted with.
Troy: Well, you know, I think forever is a long time as well.
Troy: So let's just see how that works out. I'm sure there was a phase in those heady dotcom times when people thought they'd be having free lunches - literally free lunches - forever. And, you know, water slides and whatever else they had in the Google offices. I think this is one of those things that's going to take quite a while to adapt to. Even if we just think of something simple, like unlocked laptops in a home - what is the risk that poses? But then again, we're also sort of moving the risk, aren't we? There were other risks we incurred when we transited back and forth and had those laptops in the car, or in the shops on the way to work. So maybe one way of looking at this is that there are aspects of security that may not necessarily be better or worse, but they will be different.
István: What I see is that it's pushing organizations to really rethink their IT systems, because a lot of enterprises currently look at their IT system as a fortress: they need to build walls around their information assets, around the people and the laptops, and try to protect those assets. And a few years ago - not because of COVID - this zero-trust type of thinking started: "Okay, try not to view your organization as a fortress, because if there is one infection, one malware, or one weak point and someone gets into that fortress, they can get everything they want. So try to build inner walls. Try to make sure you're securing the individual computers, you're authenticating the users, and you're not assuming that being on the network or on a physical site will protect everything." So this way of thinking, this type of architecture, came to life before COVID, and more and more organizations will be pushed into it because of, you know, single laptops in single homes. And, as Troy mentioned: how do you make sure that those laptops, carried home and back and forth, will not leak information if stolen or lost? These ideas were out there as a security architecture before - security professionals were always saying, "Don't trust your home network." Now there's just another motivation, another push.
Paul: Yeah, and - final thoughts on this - does this situation potentially bring an acceleration for organizations moving to the cloud? Because people are working remotely, and maybe their existing infrastructure can't cope with that very well. Is that something you can see as well - that companies will reorganize and take cloud adoption more seriously? 'Cause there were always those ones that had been sitting on the fence, thinking there's a risk with cloud services. What do you both see there, going forward? Final thoughts.
Troy: A little bit like the last answer: cloud doesn't necessarily make things less secure or more secure - it makes things differently secure. And the example I always like to remember is from years ago, when we started going into the cloud - let's go back, you know, five to ten years. I'd give the example that when we moved into the cloud, we very often had a much more focused, well-resourced team looking after the cloud things, as opposed to when we had the on-prem things. So that's a good example of where things change. It doesn't mean they won't go wrong; it just means that the security posture is different to what it was before. So I'd encourage people to look at each aspect of moving things to the cloud in terms of how security changes, as opposed to some sort of binary "is it better or worse".
István: What I see, based on my own experience, is that more and more companies are considering the cloud - and yes, with caution, as Troy mentioned: "Is it good for us or not?" But at least more and more banks are. European banks were not really considering the cloud as an alternative to their internal systems, and now they are! So what we can expect is that more companies will think about moving to the cloud. Not everything, for sure - because, as Troy mentioned, it's differently secure than an on-premise system, and there will be systems that will be kept on premise for a long time. But it will certainly create new and bigger challenges as we move on, as these conservative companies, sitting on big, big amounts of confidential data, pour that data into the cloud.
Paul: Okay. I think the hour's up. I really appreciate you guys coming on and recording this podcast with me today. Troy, I hope you have a successful and good evening down in Australia - I know you're going to continue writing your blog, and I'll be following it quite closely. István, thanks for coming in this morning - early this morning - for us. I wish you both all the best. Thanks very much for joining!
István: Thank you.
Troy: Thanks, guys!
Troy: See you later!
Paul: And that is all for today's episode of "under CTRL". You can find links to all our social platforms and to our guest in the episode description. If you like the show, make sure you subscribe and leave a review. Join me again in two weeks' time for the next episode.