How tech giants monopolize the digital space – an interview with Edward Shone
There is no better time to speak about privacy, freedom of speech and omnipresent data misuse than today, when a documentary like “The Social Dilemma” is shaking up the public by exposing the subtle mechanisms behind social platforms.
This is exactly what we do in the next episode of our privacy podcast “under CTRL” with our guest Edward Shone, PR and Communications Manager at ProtonMail. By working with companies that have been long-term advocates of privacy, we intend to shape this debate and raise awareness. In our discussion, we unveil some further details about how established tech players like Facebook, Google, Amazon and Apple dominate the online world and what the prospects for a private future are. Is this a utopia, or are we running towards a dystopian world of completely eroded privacy?
ProtonMail, the end-to-end-encrypted email and VPN provider, was founded in 2014 by the scientist Dr. Andy Yen and his colleagues at CERN. In light of the Snowden revelations, their vision was to create a product that respects privacy and is secure against cyberattacks: the safest email service with encrypted attachments. The fact that ProtonMail’s and Tresorit’s data protection services have a shared mission and that we both keep a close eye on the data market offered some common ground for our discussion.
Even though many of the provocative questions we raise remain unanswered - whether they refer to data misuse by tech giants and governments, manipulative techniques applied by social platforms or regulations lagging behind the tech evolution - one trend seems to intensify, as Ed highlights: “There is a shift in the consciousness at the moment. It emerged five, six years ago, but it is just gathering pace now - month on month, as people realize how their privacy and data have been abused by corporations and governments around the world.”
In our analysis of how the digital space has evolved and changed our entire lives, Ed reflects on the birth of the internet, the initial idea of a connected world, free exchange of information and “the grand hope of democratizing democracy”. But the same tool that can liberate communication can also be misused as a tool for manipulation – and even for the subjugation of people. The way China is building its surveillance and censorship apparatus in Hong Kong is just one example of what happens when technology gets abused by authoritarian regimes.
From extreme government practices, we move on to more sophisticated ways in which our data is being used by tech corporations. Their masterfully designed and continuously fine-tuned business model is turning people more and more into unaware victims of the complex advertising machine running behind these social platforms. As a consequence, the users who get these services seemingly for free are in fact paying an even higher price: their ever-shrinking online privacy.
Everyone can recall an aha moment when Facebook suddenly “guessed” some of their unuttered thoughts - things they hadn't even googled - by dropping an ad with pinpoint precision. With our guest, we conclude that with the sophistication of behavior-based advertising - as Ed aptly puts it - “the line between advertising and manipulation” is getting more and more blurred.
People have little knowledge of how complex algorithms build an ever more precise profile of them by cross-referencing all the data and traces they leave behind in the digital space. The abuse of social media culminates in its use for mass political manipulation. Just think of the US presidential election in 2016, or the Brexit vote the same year.
Despite these destructive data practices, our outlook on the future is still a positive one. Ed sees the responsibility of data protection and security companies like ProtonMail and Tresorit in offering services that help people “take back control over their entire digital lives”. He emphasizes, however, that the human element plays an essential role in this: “If consumers shift, the market will shift too.”
On the other hand, regulators have to do their homework, too. In this regard, the European data protection law (GDPR) has done pioneering work in legislating how personal information must be treated. Even if these initiatives evolve through “trial and error” and continuous self-correction, many nations are following the trend and adopting their own privacy laws.
Listen to the episode on Spotify or Apple Podcasts to learn more, and let us know what you think. Scroll down to read the transcript of the recording.
Paul: Hey everyone! Welcome to the eighth episode of "under CTRL". My name is Paul Bartlett, and on today's show is Edward Shone, the PR manager of ProtonMail, from Switzerland. We will discuss how ProtonMail is supporting freedom of speech, and how tech giants, such as Facebook, Google, Amazon and Apple monopolize the digital space. Hi, Ed! Thanks for joining us on the show today. I really appreciate you coming on and joining us, and giving us some time. So... I know that we're both- both organizations are advocates of privacy, right? So... and I... hopefully we're going to have some really good conversations today around that. And enlighten some of our listeners about what it means for us, for each organization. And- and come to the conclusion on where the future of privacy is going to go. So I want to hand over to you to start with. Give us a little bit of an introduction about yourself and what does ProtonMail do, what it stands for, you know, how was it created... So the floor is yours.
Ed: Hey. Well, thank you for having me. Pleasure to be here. So yes, my name is Edward. I work for ProtonMail. I look after PR communications over here. ProtonMail, for those who don't know, it's a... we're a company- it was founded in 2014 by a team of guys who met at CERN. At the... the (inaudible) for scientists. And we are primarily known as being an encrypted email provider. So end-to-end encryption, utterly secure. We can't read your emails and nobody else can either. Around 2017, we launched ProtonVPN as well. And we've continued developing a suite of products since then. But as far as us as a company, we, as I said... it was founded in 2014 by Andy Yen and his various colleagues at CERN in- in light of the Edward Snowden revelations.
Ed: They looked around and saw what was happening, saw these revelations about the way data is being collected or being corrupted or being abused, and thought, "Well, frankly, this isn't- this isn't quite right." And set about trying to... or think about ways they can help build an internet, which is, you know, protects privacy, respects- well, respects people's privacy, respects people's ownership of their own information, is secure against cyberattacks. And... yep, they started with email. And frankly, I think, it seems to have gone from strength to strength ever since. The... a lot of the early... Well, I should- I should say that the company wouldn't have got off the ground, frankly, if it wasn't for the amount of support they received from just people out there in the community. There was obviously an appetite from people around the world for a service like Proton. And hundreds of thousands of- of dollars were raised through crowdfunding in the space of just a few weeks, to help get the project off the ground. And ever since then, it's been an entirely sort of community-focused organization. We wouldn't exist without our users. And I think, the fact that the company has continued to develop and- and grow and, you know, improve in the way it has over the years, is entirely down to the fact that there is a bit of a change happening out there. I think it's the same thing that you guys are seeing as well, and the reason why we are growing in the same way. It's that there is a shift in the consciousness, I think, at the moment. And it's, you know, it really started to emerge, you know, five, six years ago, but it's just gathering pace now, month on month, as people realize how their privacy and their data is being abused by corporations and by governments around the world, be that for commercial surveillance or, you know, what do you call it? Espionage capitalism. Or if it's, you know, being misused by governments for- for their own purposes. It's- there is a- an- a... awareness of the need for services that bring us privacy. And that's what we're here to do.
Paul: Fantastic! And- and your role as the PR manager, of course... you mentioned that you've been with the company, was it four- four years, did you say? Or...?
Ed: Oh no, sorry, the company's been around four years. I've only actually been with the company for... actually, what's the date today? Eight- actually, today is my one-year anniversary at Proton, as it happens.
Paul: Oh, okay! Congratulations! Congratulations.
Ed: Thank you!
Paul: Good to have you on the one-year show. So...
Ed: Exactly! Thank you very much.
Paul: So your role basically is, obviously, to oversee all the media and communications. And- and you're obviously doing that with ProtonMail, right?
Ed: Yeah, exactly. So I- I was previously- actually, it's quite a- it's been quite a great journey for me, actually. I was previously working mostly in a corporate environment.
Ed: So I worked for a number of different companies back in the UK. I moved over to Switzerland for personal reasons, but was looking for something a bit different. I'd been sort of, you know, working for the man, as it were, for quite a while. And just looking for a role where I felt I could make a bit of a difference. And, you know, thankfully found- found Proton. It's- it's, honestly, quite nice to be working for an organization where you actually feel like you're doing something good in the world, you know?
Paul: Yeah. Yeah. And I can understand now- and we're totally aligned with that as well. So I got the question coming that- one of the things to- ProtonMail stands for is obviously privacy, but also freedom of speech, right? So... there's a lot of that going on in the world at the moment. Of- talk of freedom of speech being suppressed in certain countries and- and certain places around the world. So what is it that you do to support freedom of speech... and democracy?
Ed: Well, I think a good place to start is to sort of take a step back slightly. I think there- if you go back to sort of the birth of the internet, there was...
Ed: ... a feeling that this new sort of utterly egalitarian platform, world, sphere... whatever you want to- word you want to use, was going to create this amazing opportunity for the free flow of information, free flow of communication. Anyone could access any information or talk to anybody. It was going to democratize information and, by extension, there were grand hopes for democratizing democracy. You know, it's letting people have a greater say in their own lives, understand the world around them. But at the same time, whilst it gave people the tools to communicate, it also gave authoritarian regimes the- the tools to clamp down on the same thing. If people become too reliant on, you know, the internet to communicate, then it becomes in many ways easier to- to intercept that communication. So one of the- one of the great things about encrypted email and, by extension, about VPN is the fact that we can allow people to communicate privately. You know, because we use end-to-end encryption, it's only the sender and the receiver who can access what's actually in people's emails or their- their attachments. Even we can't gain access to what people are saying. So people can- our users can safely and confidently communicate without the fear of, you know, a regime, whoever that may be, intercepting their communications. Similarly with VPN. And we've seen- you mentioned- well, we've seen this happening a lot, an awful lot, in various parts of the world. As governments have become more sophisticated in the way that they monitor people's activity online, the way they collect information about people. We've seen a greater need for people to use VPNs to protect their- their privacy. And you hear there's sometimes a- a misconception that, you know, people are using VPNs because they have something to hide. But it's- in many, many parts of the world, that's not the case in the slightest.
People using VPNs because- not because they have something to hide, it's because people are trying to take something from them. And that- what they're trying to take is information about what they do, where they go, who they speak to, what they're interested in, what they're searching for online et cetera, et cetera. And so we really feel that there is a responsibility on companies like ourselves to help people just live in a- in a- in a private manner, and not have their every move monitored and watched, frankly.
Paul: Yeah. Yeah. And I- I tend to align with that as well, because the, you know, the mission of Tresorit itself is very much aligned with what you're doing: of giving people privacy and not allowing your documents to be seen or scanned or being even hand over- handed over with a subpoena, for example. Like a backdoor capability into getting access to those documents. So it's a touchy subject. I mean, there are some people out there that are pro and some that are not so pro towards it. Because, of course, there's- where there's these capabilities and technologies, as we both understand, there's potentially also criminal activity. So there's always the good side and the bad side, right? So...
Ed: It's true. But then again, you could say the same thing about locks on your house, couldn't you?
Ed: And there's- I- you could say, if somebody hasn't got something to hide, if, you know, you're doing something illegal- you're not doing anything illegal, why should you lock your front door? Well, this is my house. You know? It's- and it's fair...
Paul: Yeah, good analogy.
Ed: What- when you mention about, you know, subpoenas and things, it's- it's another interesting area that's, I think, going to become more and more relevant as we go forward. I think, actually, companies like Tresorit and companies like Proton are actually quite important in this. I think after, you know, the revelations with Snowden in 2013 and the wiretaps and all that sort of stuff that came out about the NSA, obviously there's a huge public outcry against that. And- and all these programs were quite rightly dismantled. But instead, what we're seeing is a shift in that, you know, governments don't really need to wiretap their populace anymore. Because people like Facebook and Google and Yahoo and Twitter and all these giant tech companies, because they exist and are collecting data on people on a day-to-day basis, all it takes is a court order from any country to claim this data on their users. And as long as people are using things like, you know, Gmail or Google Drive... both competing- they probably still compete against us. This information will be readily available. We did some research about- we published it a couple of months ago? Three months ago? Something like that. Looking at the number of requests that are coming from governments around the world to, I think it was just the big four, it was Facebook, Twitter, Yahoo, Google. I think that was it. And it's- it's mind-blowing. It's- the last six years, five hundred per cent increase in data claims by the US government from Google. It's- well, right across the board. If you look at all members of the Fourteen Eyes community, they're all- the number of requests has just skyrocketed. All of them. It's because they don't need to wiretap anymore. They just need to...
Ed: ...you know, borrow the data from companies that aren't looking after their users' data.
Paul: Collect it and merge it. Yeah.
Ed: Yeah, exactly. Exactly.
Paul: Yeah. And to- on that point as well, you mentioned obviously the US government. And we know some of the things that- that have happened in the past over there, about the way that they collect data on the populace. Let's just reflect on recent times, when- what's happened in China, for example. With Hong Kong. The fear that's coming out of there now, I mean, is it- with the new- with the law that's just been passed, is there that fear as well? I mean, we- it's been known that the Chinese government is always monitoring and collecting information on the population. What do you make of that? What is the- the whole situation in China and Hong Kong?
Ed: Oh, it's... it's quite scary, really, isn't it? It's... it's- I- I don't know what to make of it. It's quite a- quite a broad question. I mean, I had- I probably can't name names right now, because it's not publicly- publicly- it's not in the public- in the public sphere just yet, but I had the good fortune to sit down with a- an activist from Hong Kong quite recently. Just to discuss the situation and get to understand it better. And it- quite frankly, it's- the situation is really quite terrifying. They- or in- Hong Kong in particular is- it's- it's at the forefront of a clash between, you know, freedom of speech and democracy and authoritarianism. Is- the situation is getting steadily eroded. Sorry, the rights and privileges of the people of Hong Kong are getting steadily eroded. And it's happening in two ways: that's on the one- on the one front, there's, you know, people and cameras out in the streets, keeping an eye on who's waving flags and who's, you know, shouting slogans and things. And on the other hand, there are armies of people and- and highly sophisticated technology, keeping an eye on what people are saying, who they're saying it to, what they're sharing, what- what websites they're accessing, et cetera, et cetera. And it's- and this is the- Proton was one of the companies that saw a significant spike in- in users.
Ed: On the day the new security laws were announced and the day after- I think, over those two days, we saw a three thousand per cent increase in sign-ups. And- and- yeah, we- and you can't blame them, frankly.
Ed: You see- it's a- if I were sat in Hong Kong, if I were a Hong Konger and I was looking at these laws coming in... and they're quite cleverly opaque.
Ed: Opaque? If that is the right word. In that- you know, they can be interpreted in many different ways. They're quite loosely worded. And if I was to look at those, and then look at the way that the law has been implemented in mainland China, I'd be- I'd be terrified. I think it's- I'd be scared for two reasons: number one being that's, you know, they- the government is- well, there has been a track record. Sorry! I should- I don't want to be too inflammatory, but there is a potential for, you know, freedom of speech to be eroded, and the access to information to be pulled away. If you look at the App Store in- in China, for example. There's, you know, three thousand different apps that have been blocked by Apple, presumably at the behest of the Chinese government. The next worst is the US with just one thousand, and most countries have sort of two or three hundred. You know, obviously there are legitimate- there are reasons for blocking apps. You know, you're not going to let a sort of- an ISIS recruiting app appear on the App Store. But, you know, three thousand... like CNN, for example. Even- even Fox, CNN and Fox and BBC and all of them. They're all gone. But so... anyway. Going back to the original point: I think it's- it seems like a- it's a scary situation. And, I suppose, going back to what I said previously about the opportunities that the internet offers, there is no greater tool for the subjugation of a people, if- if not properly used. And I think there is a responsibility on all tech companies in the area, in Hong Kong, to sort of stand up and- and stand by the rights of the people of Hong Kong. I think it's what we've done, it's what a number of other companies have done. But, you know, for the time being: unless people stick around and- and offer tools to the people of Hong Kong to protect their privacy, to protect their freedom of speech, then the erosion of these rights will just happen faster and faster.
Paul: So just moving away from that, I wanted to come into the- the point of what's the- you mentioned Amazon and Apple earlier on. And a lot- and I think Google, of course, they've got a lot of free services. And that's the- that's the attraction, right? So the- you can get a free Gmail account, you get a certain amount of storage... What's the issue from your perspective about these free services? I mean, a lot of people out there, they go for features and functionality more so over security. And it kind of draws people in. And I give you an example from my- we've got an Xbox at home. And the other day, the privacy notice popped up. And I took the time to read through it. And literally, it- I was in shock about the amount of access, the information that I'm going to be sharing, potentially, through that Xbox, or my son even, through that Xbox, with- with them. And what they can do with that information. So... yeah. Is this- is this the problem? With- with free services, that you basically- you don't really have an- an opt in or an opt out clause. Because what I was reading, it was more or less, you want to carry on using this Xbox, then you got to- you're going to have to opt in to these- these terms and conditions.
Ed: Well, I think that's- I think you hit the nail on the head, really. It's a- it's a lack of choice. And like you say, and to be fair to the likes of Google and everyone else, I mean, their products are great. I mean, I- I signed up for Gmail like, what? Ten, fifteen years ago, whenever it came out. And it's really easy, it's simple, it looks great and bla bla bla. And yeah, fair play. But there's- I can't remember who it was who says it- who said it originally, but there's that phrase, you know, "If you're not paying for a product, you are the product."
Ed: And at the end of the day, with Facebook and Google and these free-to-use services, the only reason that they're able to give you these lovely, pretty, user-friendly products for free is because they're making their money elsewhere. And they're making their money out of you. You are the product. If you're- I suppose, in many cases, if you- I don't know, you buy a- you buy a phone, and you pay a service fee for that phone every month, and you think to yourself, you use it and you are the customer of that- of- I'm with Sunrise. The phone people. I'm a customer of Sunrise, I pay them a fee each month, and I receive a service. And so in the same way, you still think of yourself as a customer of Google, or a customer of Facebook, when you have an account. But you're not. The- their customers are advertisers. And the only reason- this has been said a million times, I don't think it's really news to anyone, but they- the reason why Facebook and Google and these free-to-use platforms continue to make vast sums of money is because they have created the ultimate data resource for advertisers. It's, to be fair- to be fair, this is a genius business model. It really is. I was- like a piece of research I was playing around with the other day, which really fascinated me, where - if you look at the advertising revenue of Facebook over the last I think it was five years, and then look at the user numbers - the increase in revenue is two, three times faster than the increase in user numbers. And if you were to think of Facebook as an advertising company, which it essentially is, if you were to compare it to a billboard company, for example, you would have thought that, you know, the more billboards you- a company has, the more adverts they can place, thus the more money they can generate. But Facebook haven't been getting more billboards. The user being the billboard.
They've just been getting more creative in the way that they use this data, to find more ways of advertising in more different venues outside their own platform. It's- but they get- this- at the end of the day, this is the problem. It's that it may seem like a... innocuous thing, you know, "I don't mind them knowing that I like the White Stripes and English rugby, and therefore can target White Stripes CDs and rugby tickets at me." That might seem like a no-brainer. "I don't really care. I can use a free service and, you know, I just ignore the ads." But we've seen in the last few years how- what might have started as just some slightly clever pop-up ads, has developed into full-blown manipulation. You know, it's- we've seen it happen politically. We've seen it happen with convincing people they want to buy products they've never even heard of, or don't care about, or never normally do. It's- once you have enough data on somebody, you can get them to do pretty much anything. And that's the scary thing from my perspective. It's- yeah. So they- you asked what the problem is with free services. It's, you know, a little bit of data might not sound like a terrible thing, but I've been using Gmail, or I- I had- I've had a Gmail account for fifteen years. I only really use it as my burner account now. I just- you know, if I have to get cinema tickets, then they go- the promotions get sent to my Gmail account, and I ignore it. But fifteen years worth of data, if I was using it day in, day out... that's incredible, isn't it? It's- it's- it's a vast amount of information. So yeah, I guess that's the problem. And that is what we and yourselves are trying to change.
Paul: Yeah. I mean, maybe I- it took me a real while to realize. And I'm on Instagram, like a lot of other people as well. Not on Facebook, but... one day, I just saw these ads popping up. And I thought, "I've just had a conversation about that, but I never looked for it. I never searched for it." And I was taken aback by the fact that- yeah, when I accepted the terms and conditions, did it say that it could listen to my, you know, like basically put on the recorder or voice speaker in it, and then start listening to- to my voice? Because nobody really checks the terms of service, or the terms and conditions to which you use these applications. It's- okay, it's cool. Especially for young people it's cool. "Let's download it, let's get it on the phone." The next minute, you've got a load of targeted ads about the things that you've been talking about, which you never ever searched for. You know, if you remember, when you search for something, then you potentially would get a lot of targeted ads. Now suddenly, it goes one step further. And then, could it go one step further even more? When you put your location services on, it's like, "Where are you in- in the area? And is something relevant to you?" So it's just mind-blowing, I've got to say.
Ed: It's quite- it's quite amazing. Actually I'll tell you the one that scared me the most. And I still- I think it's got to be a combination of location data and voice recordings or something. But... I think it was three years ago, I was visiting my brother in Bangkok. On the day that my- my wife and I were due to fly back, we went to one of those big street markets. Just because it was on the way to the airport, and we'd heard they were kind of fun. I wasn't going to look for anything in particular, hadn't googled anything in particular. We just, you know, heard, word of mouth, it was there. Got a taxi, off we went. While we're there, I saw- I bought a sort of canvas bag thing. Like a sort of holdall type- type of thing. It wasn't branded, it was- it wasn't even a fake brand. It wasn't even like an Adidos, or something. It was- it was- it was nothing. But I thought it was kind of cool. Bought it, put it into my other bag, because I was going straight to the airport. Got on the plane. We changed flights in either Abu Dhabi or Dubai, can't remember which. And had a sort of two-hour layover, so I turned on my phone. And I saw a pop-up advert on Facebook for the exact same bag. I mean, I hadn't googled it, I paid in- I paid with cash. So I hadn't- it can't even be like transaction data. Absolutely bizarre. And I don't know if like they had known I was in the market, and there were sort of three sellers of this bag in that market or... I don't know. But absolutely bizarre. Scared the hell out of me.
Paul: Yeah. Yeah, and in fact and to that point is like- is- when you mentioned that, I just- last night, I was in IKEA. And they always ask you to put in your post code. And then I started to become more and more suspicious about how that data is all coming together. I mean, it's coming from different sources, but if somebody wants to see, you know, my purchasing behavior, my- my credit card going in for when I'm purchasing goods. My data location from different providers... It's bringing all of that stuff together and building up a whole profile on this. And the- and I think for me now, it's starting to sink in. It's starting to really sink in about the information that you're sharing constantly. Like you're constantly on. And that to the point, with my son as well, he is constantly on. And how much- what information do they collect about him now, and what is it going to be like for him in the future with them collecting all of this information? Because he's born into technology, you know? We- I was one of the ones that watched it evolve from- from nothing to something. And along with the internet as well. But for him, he's just like: it's always been there. You know? Technology has always been there, online has always been there.
Ed: It's quite amazing, isn't it?
Ed: I mean, I always sort of think of myself as being unfortunate that- I think Facebook came out when I was halfway through secondary school? Something like that. So I was sort of one of the early adopters. I've- I don't use it anymore. I don't post on it, nothing like that. I've sort of removed all- frankly, it was about a year ago, I sort of realized how much stuff is on there, from posting statuses when you're sixteen years old and stuff.
Ed: It's- I think it's ridiculous! But if you've had this- at what- is it seven that people are allowed to sign up to Facebook? Or something like that? There's- there is an age limit. I can't recall what it is.
Ed: But if you're- you're on, yeah, Facebook or Instagram or whatever, from the age of seven. By the time you're thirty, they'll- they'll know everything about you. Absolutely everything. And, you know, like you said at the very start, you don't- if you want to, you know, be at the party, you haven't really got any choice. You can't say no to the data collection. Either, you know, you stay out in the cold and, you know, don't get on Instagram, or you- not that- I suppose that's not the end of the world, there are worse punishments. But it's- yeah, you have no real choice. It's quite- it's quite amazing.
Paul: Yeah. And yeah, just moving on from that, I suppose what it is now, is because we got this monopolies, and they're difficult to break, like Facebook, and you got your Apples and your Amazons, of course, when you're shopping online. I mean, is this- is it time? Do you think there should be some kind of breakup of these monopolies now? When you see these big giants in technology. I mean, is this something that really needs to be taken seriously? I mean, the EU's kind of attempted to address the problem, or the issue, of large tech.
Paul: But where does...
Ed: I mean, yeah. There's- I mean, like you say, the EU is taking steps. We're starting to see rumblings in- in the US. You know, Congress are doing the hearings. There's the rumors that the DOJ are going to make an announcement on Google any day now. Although it looks like it might actually be focused on a very specific part of Google's business, which kind of defeats the point. But anyway. But I think that- well, it's not for me to say should it be broken up. And- and I don't necessarily think, actually, I just don't know. I don't- it's not for me to say if they should be broken up. But I think there definitely is an argument for better regulation. I mean, the energy sector, the pharmaceuticals industry, legal profession... I- every industry or sector, which touches everyone's life, is regulated to one degree or another. But, you know, there is comparatively little for tech. And, yeah, I think there is- there is a- there are things that could be done. I mean, we've been calling for things like regulation of the way that App Store owners operate. So primarily- I suppose primarily we're talking about Apple and Google here. But if- realistically, if you take just say the App Store, Apple's side of it. They're home to two of the primary operating systems in the world, iOS and macOS. Basically, they control access to, you know, half of the internet, really. They can dictate who- I mean, I'm exaggerating slightly and talking in- in pejorative terms, but they can dictate, you know, who can be on the platform, what the apps can do. And they can, well, maybe charge incredibly punitive fees, which can run smaller companies out of business. They- their own apps are on the same App Stores as well, so there's very little regulation over making sure that they treat them, their own apps, in the same way as other people's apps. 
Like I mentioned earlier on, there are cases of App Store owners censoring certain apps in certain parts of the world to maintain market access to those parts of the world. So yeah, there just has to be a long, hard look at the way these companies are operating, to make sure it's actually working right for users. And, like you say, to bring it back to the privacy point: at the moment, particularly Apple charges a thirty per cent fee for transactions that go through the App Store, including in-app purchases. If you're a Google or a Facebook - you know, you have your apps in the App Store, but you don't rely on financial transactions, you make your money elsewhere - then these sorts of things aren't going to affect you in the slightest. If you're a company like ours - both our businesses - where revenue does come from subscription fees or whatever it might be, and we're doing that to make sure we respect people's privacy and don't make our money off their data, then we're the ones that are more detrimentally affected by things like App Store fees.
Ed: The longer these fees exist and are applied as unevenly as they are, the more they're going to maintain the status quo, where the biggest companies abuse people's data and those who want to protect data struggle. It's a big, big issue, and I think there's going to be quite a lot of soul-searching in various corners of the world before we find a concrete solution. And there's also an element of people power in these things. It's almost a sort of chicken and egg. Is the regulation going to change before people demand it? Are people going to demand it, and therefore the status quo changes? Or are people going to vote with their feet, and therefore the situation changes when it comes to the rules governing App Stores and big tech and monopolies and that sort of stuff? It's difficult to say. But time will tell.
Paul: Yeah, certainly. When I was thinking about the breakup of these companies, what I wanted to get to is that they're having such an influence on society, right? And maybe "breakup" was the wrong word, but certainly, like you say, a control or a mechanism - a regulation that says what's acceptable to do and what's not. Because it seems like you can do anything with this, right? You can post what you want, you can say what you want - this is freedom of speech, of course. But there are also other things, as we mentioned before we started this podcast, around The Social Dilemma. That's been around for a long time, of course. But when I see that documentary and think about the amount of ad manipulation we were talking about earlier, and how many people are on YouTube - I think it's over a billion people, or even more than that.
Ed: It's incredible, isn't it?
Paul: But yeah, I mean, these algorithms are just working in the background, studying each behavior and dropping these ads in. And it's not just any old ad now. So I was thinking more along the lines of, like you say, not breaking them up, but having more regulation and control over them.
Ed: Yeah, I agree. There's a growing tendency to talk about this in terms of "money bad", "business bad", et cetera. And I don't think that's necessarily the case. It's perfectly possible for a company to make money, to operate and be successful. To be fair, if you take Apple, for example, they got to the top because they produced a long series of very good products. People wanted them, and they worked. Brilliant. Fair enough. I don't think there's really an issue with a company being successful. The issue is how responsibly they act once they are successful. And I don't think it's unfair to say that if a company acts irresponsibly, then there is an onus on regulators, or whoever it might be, to try and rectify the situation in some way.
Paul: Yeah. So do you think there'll ever be a situation for non-behavioral-based ads, where ads are just based on a keyword search rather than on tracking people's behavior? Because if you take YouTube, for example, they drop the ads in, and if you want to get rid of the ads, you've got to pay for Premium now. Yeah?
Paul: So you can pay - and this is what a lot of apps are building out. They track your behavior, then they drop the ads in, and if you want to get rid of them, you've got to pay for the privilege. So do you think we'll move towards a situation where maybe that becomes regulated?
Ed: Well, possibly. It's a good question, actually. I haven't really thought about it that much, but it's an interesting point. When you say "non-behavior-based ads", I assume you mean ads that aren't based on data about previous activity, et cetera. Just, you know, a bit like a billboard.
Paul: Yeah. Yeah. Yeah.
Paul: Yeah, I mean, or a keyword search. It would just be based on what you search for, rather than behavioral monitoring of what you're doing and where you're going, as you just mentioned about your bag.
Ed: Yeah, exactly. Okay, so... it's a difficult one. It's quite a broad question, really. There is nothing wrong with the advertising industry, as it were - well, actually, that's a very broad statement, I'll take that back. The concept of advertising in general: companies have products and services, and they're more than welcome to advertise them. That's fine. But, I guess, the question is where the line is between advertising and manipulation.
Ed: And if an advert or a campaign or whatever is based on harvesting data from as many different points as you can to find the best way of manipulating a person into buying that product, then that's a bit creepy. That's not great.
Ed: Then again, at the same time, if people are fully educated, they know what they're doing, and they're aware that by using the service their data will be harvested for advertising or whatever else, is it wrong to tell somebody that they can't use that service? I don't know. It's a good question, really. I guess the world we're trying to build is one where people just have control over their data. I think that's the end goal - and I don't want to sound too much like, you know, the Brexit campaign, but, yeah, take back control and everything.
Ed: But I guess that's the key point. I think a lot of people feel like they no longer have control over their data - and their data is online, their data is their identity, so they don't have control over themselves. If people can use services like yours and ours to make a decision about who they share their data with, then I suppose that's okay. If you say to somebody, "I don't mind you having this data," and it's all above board, then that's a different question, isn't it? But I guess the problem with behavior-based ads, as we have seen and see them now, is that often people don't realize what's going on.
Ed: Or they might think, "Okay, they know from my Facebook likes that I like, you know, Kentucky Fried Chicken and Barbie dolls," or whatever. Who knows. But they don't realize that that's actually been cross-referenced against billions of Google searches and YouTube searches and time spent on individual videos and how long you lingered over that picture of the person you went to high school with and haven't seen for fifteen years. Again, that's the thing I have an issue with: when people don't know. That's the scary thing.
Paul: Yeah. And I would go one step further than that. Because, as I mentioned, my son's got his Xbox - it goes even into the world of gaming. When they're (inconclusive), they're playing games, they're purchasing in-game products. And then these in-game products are being advertised again while he's watching YouTube. I've witnessed it myself: they follow these gaming channels, and it seems that this information is coming out of one system and getting put into another. And they've got everything they need to do their best, or try their best, to sell you - or sell young people - that product. And that's the bit that's getting a bit scary for me. Yeah.
Ed: Also, the genie is kind of out of the bottle, isn't it? Once these data collection resources are built, they sort of have a life of their own, to a degree. And the possibilities are endless. When you see interviews with - I've forgotten his name now - the chap who was in charge of monetizing Facebook, who pushed forward the whole ads side, (inconclusive), you almost get the impression that in the early days it was, "Okay, cool. So we can basically use Facebook as a search engine for targeting adverts. That's really cool. So we can find, you know, males between the ages of twenty-four and twenty-nine, who live in English-speaking parts of the world and... okay, here's your advert!" And then they just went a step further. Another step further. Another step further. As soon as you create an ecosystem where data is a free-for-all and no longer the property of the user, these things will just continue to evolve. One day it's targeting adverts at adults shopping on Amazon, who have credit cards and are responsible; then, like you're saying, it's in-game adverts for children; and it oversteps even further than that when people realize that these platforms can be used for a sort of mass political manipulation. You just have to look at 2016 and 2018 - the Brexit and US votes, and Trinidad and Tobago before that. And...
Ed: And on a day-to-day basis. I was reading a thing the other day. I'm going to delve into the world of conspiracy theories now, so please forgive me! But there's some...
Paul: I'm curious. I'm curious! Tell me!
Ed: Exactly! It was a really funny thing I read, which said that there was a vast number of suspected Russian bots on Twitter who, rather than talking about "Build a wall" or "Take back control" or whatever it was, were talking about vaccines. This was pre-Covid. They were talking about, you know, vaccines give you this, that and the other, and "don't vaccinate your kids" - which most people would consider conspiracy nonsense. Not to cast aspersions on people who have these beliefs, but that is the general opinion. Apparently a huge number of these bots were talking about this, which riles people up. And you think, why on earth are supposedly Russian bots on Twitter talking about this? It was about a week after I read that article that there were evidence hearings in Congress, in Washington, about vaccines - something which has been scientifically accepted around the world for sixty, eighty years, however long it is. And instead of spending time talking about, you know, allegations around the election or hacks of the DNC or whatever else, members of Congress were having to spend their time placating public demand for a conversation about vaccines. And the only reason that could happen is that these platforms exist. It really is quite amazing, isn't it?
Ed: No one ever suspected that Facebook or Twitter would turn into these things that can shift the mood of entire nations.
Ed: But... yeah. It's stunning. It really is.
Paul: Sorry! I think, Ed, we're coming up towards like the last ten minutes or so of the podcast and...
Ed: Yeah, sorry! I waffled there, apologies.
Paul: No! It's great! It's cuppa-tea time, and it's time to waffle. That's what you'd do if you were down the pub or round the kitchen, having a cup of tea. And, like I say, I'm sure there are some golden nuggets that people can take away here. So let's just think about the future now, and the direction we're going in. We're big advocates of data protection, and we're glad to see things like the GDPR coming in and end-to-end encryption being more widely endorsed, not just by individuals but by organizations as well. What direction is ProtonMail going in? I know you guys are following the same path. Is there anything you're big advocates of? Anything you want to see changed in the future? What's the perspective from ProtonMail?
Ed: So, I guess, the direction we're going... we're building a selection of products which hopefully can help a person take back control of their entire digital life. It's a long-running process; it won't be finished for a while. But that's the direction we're going in. As for what the future holds, it's almost an "I don't know." It feels like the momentum is building behind encryption, behind privacy. If you were to go back five years, or maybe even less than that, you'd hear people talking about how encryption is bad, it protects crime, it does this, that and the other. But people are realizing that if it wasn't for end-to-end encryption, the global economy would collapse, because everyone's personal information would be everywhere. Beyond that, there seems to be a growing awareness of how data is used and how you can take control of it. With services like ours - the two of us - it seems like public awareness is building, and people are starting to make the switch. And I'm going to put my money out there slightly: I'd hazard a bet that the world in five years' time will be a more private place, because I think that is the way public opinion is going. But at the same time, this all depends on public opinion and on people voting with their feet. As long as there is a market for the established players, as long as people continue to use them the way they have in the past, they'll be able to continue operating the way they are. And, by extension, governments will be able to continue acting the way they are as well. But if people start reassessing their life online... We see a lot of people who use us for the important stuff in their life.
You know, they'll use us for legal documents and banking information and that sort of stuff, and have a sort of burner email address elsewhere. Which is a step in the direction we're going. And if the world continues going that way, and people's behavior continues going that way, then, yeah, I could see the world being a more private place. But it all depends on you, as it were. It depends on what people do. The fact is that we live in a market world, and if consumers shift, then the markets will shift, too. Time will tell. I think it's interesting that Zuckerberg and people like that are talking a lot more about privacy. That's quite a telling thing. He said, was it six, seven years ago, "Privacy is dead," or "The private life is dead," or something like that. And now, last year, he's saying, you know, "Facebook has done badly, and we are creating a new product world for you where we respect your privacy," et cetera. So if the big players are shifting their tone, that's quite a telling thing. They're not idiots. They can see when markets are shifting. But, yeah, time will tell. What do you think is going to happen?
Paul: Well, I'd like to follow the idea that there's a growing trend towards adopting privacy tools and people taking things into their own hands. But again, as you know, with these big, big companies and the way this manipulation started... I'll give you one example: now, of course, because of the ruling on cookies and things like that, on every web page you visit you either accept the cookies for tracking or you don't. And some companies will present a plethora of options, which the consumer or user gets confused by, and then they just accept it anyway. I think we've got used to convenience and speed over really wanting to read through the details of what's being tracked. So I still think there's that element out there, a lack of transparency, and I think it's going to be a long time before companies that are built on data are able to change that business model. But, yeah, I see things starting to turn. I see more companies coming to Tresorit, for example - especially organizations that are dealing with personally identifiable information - and taking more consideration of how they deal with it and how they manage it. It's a tough question. But, you know, I'd like to think that there is going to be some moral responsibility around restricting the amount of data that companies or third parties get access to in the future.
Ed: Yeah, and actually, the (inconclusive) tactics you talked about are a very important point. We can't underestimate the staying power of established businesses. It's the same for any industry - this isn't just a tech thing - but once somebody has established themselves, they will fight tooth and nail to stay where they are. That's human nature. So we can't underestimate how- well, it's not going to be an easy shift. And there are lots of tactics out there for keeping people in the ecosystem, as it were. But, I don't know, maybe I'm just a naive optimist. I'm a...
Paul: I mean, this is the whole reason for the podcast we're doing: to bring awareness, of course, and to get the word out that there are other options. Because a lot of people who come to us didn't even know that we existed. They didn't even know that we've got, for example, zero-knowledge and end-to-end encryption. And I think what happened is that large organizations put efficiency and convenience at the very forefront of their marketing efforts and left the security aspect way behind. Now that part of things is starting to catch up. And I think, with the GDPR for one thing, it's starting to sink into people now - some regulation about the way data should be respected and treated, especially for individuals. What I also see is that more nations will start adopting their own privacy rules as we go forward. I'm starting to see this trend already in South America. We know that Colombia and Brazil are trying to put something similar together, based on the GDPR. We'll probably see a few other countries follow suit and bring in their own data protection laws. That could obviously present itself as another minefield for companies to manage and deal with. But at least it seems to be coming to the forefront now. Which is good.
Ed: Yeah. The regulation point, and the way the laws are changing, is a really interesting area. I think there's going to be an element of trial and error over the next few years. I mean, the GDPR, for example, is a great step forward, but it has its faults. Over time, these things get improved - it's the same with software: people spot the bugs and fix them. I really don't think the question of how data should be used or controlled, or how privacy should be regulated, is going to be fixed overnight. There'll be an element of trial and error, and there'll be a mixture of companies coming forward with different innovative products and regulators properly managing the industries and understanding the impact that this free exchange of data can have. We'll just have to wait and see. And hopefully the public comes with us.
Paul: Yeah. Well, I hope they will be on board with us. So, Edward, thanks a lot for coming on today! I really appreciate you joining us. At least we know that both our organizations are there for the most important thing, which is the customer and the customer's privacy. That's what we're both standing for. It was great to have you on the show and to be aligned on the same values. Absolutely great! Thanks a lot!
Ed: My pleasure. Thank you. Thank- thanks for having me!
Paul: No problem. You have a good day now, Edward! Thanks a lot! Take care!
Ed: Yeah. You, too!
Paul: Bye, bye!