Brave: redesigned browsing and advertising experience – an interview with Luke Mulks

It used to take traditional surveillance services years to understand a person’s behavior and motivations. Now, it only takes a few web interactions for tech giants to map out a detailed profile of our digital identities – and the worst thing about this is that we readily give away the information that goes into this (through Google searches and cookie policy acceptances) for free!

In our next privacy talk, we’re digging deep into the algorithms that keep the Internet ticking over to unravel the methods behind invasive data collection, and exploring the privacy browser alternatives that sit outside the influence of Google and co.

With this in mind, we invited Brave’s Director of Business Development Luke Mulks to appear on under CTRL to help reimagine our digital futures.

Brave: a revolutionary web browser and ad platform

Having started his career in advertising, Luke is one of the many marketing professionals who witnessed the rise of tech monopolies and became alarmed at the industry’s approach to data and targeting.

Luke describes Brave as a “privacy browser with the first kind of integrated global private ad and reward platform”. By blocking the trackers that allow platforms to create detailed audience profiles, Brave gives users a “real clean start” and a browsing experience up to three times faster than Google Chrome. With over 20 million users worldwide (and counting), Brave’s rapid growth is also indicative of the market’s readiness for a private web browsing solution!

Bringing “Trojan-horse good privacy technology to the mainstream”

We had a blast discussing web browsing’s complicated relationship with privacy – see below for an overview of the topics we touched upon during our talk:

  • Luke's opinion of disruptive advertising strategies and the shift from simply collecting data to offer relevant content to more sinister methods of detailed user profiling
  • What a Western-style “social credit system” that exploits users’ data to judge their behavior and trustworthiness could look like
  • What Brave’s pro-privacy advertising and reward model looks like and why it is more effective for users and marketers alike
  • Why Luke believes that – apart from regulators expanding the reach of data protection globally – bringing the best privacy browser and powerful privacy technologies to the mainstream is the way to oppose the current large-scale data exploitation

If you’ve enjoyed listening to this episode, follow the under CTRL podcast on Spotify for regular interviews with privacy thought leaders – our episode with encryption legend Phil Zimmermann is a particular highlight – and don’t forget to share your feedback and stay connected with all things Tresorit through our official Twitter and LinkedIn channels.

Transcript

Paul: Hi everyone! Welcome to the 12th episode of "under CTRL". My name is Paul Bartlett, and on today's show is Luke Mulks, who is the Business Development Director at Brave. Brave is a privacy-focused web browser provider, with more than 15 million users. Their browser experience is three to six times faster than other mainstream browsers. We will discuss why invasive data collection is a significant issue, and how Brave solves this with a privacy-focused mindset, in addition to incentivizing content creators. Welcome, Luke! How are you doing there?

Luke: Doing well, doing well. Thanks for having me on!

Paul: No problem. Thanks for joining us today. So Luke is- Luke Mulks. And he's the Director of Business Development at Brave. And we'll- typically start in the usual fashion, where we're going to ask Luke to give us a little bit of background about himself. Off to- over to you, Luke.

Luke: Yeah, sure. So I basically have a background in- my- started off in publishing and start-ups, and then I moved into digital advertising. For many years, I was the Director of Ad Products for a company called OAL, and we were right in the middle of the belly of the beast. So we worked with Google and all the other ad stacks, but we also worked with the biggest media companies on the planet to handle ad product integration and ad operation. So we kind of saw how all the sausage was made, as far as, you know, data collection and the operations and process for all that stuff. And I just- when I started working there back in 2010, 2011, it was a different game with advertising. And I watched that shift from being pretty simple and minimally invasive to being extremely invasive, and Google really taking the reins and driving it through the roof. So I saw what Brave was doing, back in 2016, right around February, March. Right when they did their first- right after their first initial, you know, kind of early, early data release. And it seemed like it was the first project of the kind to kind of not only think about really strong privacy, but how can we make the internet work with- you know, how can we make sure people get paid, how can we make sure that there is a steady, healthy economy within the internet? So I did a kind of crazy thing, and I went from kind of running ads on the Super Bowl to joining a risky start-up with a few folks. I think I was like the eleventh or twelfth employee there. But it was- you know, when you have the guy that created JavaScript as a co-founder, and people like Yan Zhu on our security side, it's kind of hard to say no to that opportunity.

Paul: Yeah. Yeah. And I'm sure that wasn't the only motivation. As you mentioned, the invasive or the intrusive manner of advertisement, so there must have been some other motivation for you there as well, right? To jump?

Luke: Sure! Yeah, yeah. I mean, it was- it came to the point where, you know, everybody's kind of got their own personal checklist of what they're comfortable with doing in their life. And toward the end, I was really uncomfortable with what I was seeing. The- just the amount of data collection and how it was all kind of being pooled together with all these companies.

Paul: Right. Yeah. So and then out of that came your introduction to Brave, as you mentioned. And so tell us a little bit more about Brave.

Luke: Yeah, so Brave's a privacy browser with the first kind of integrated global private ad and reward platform. We take the Chromium Open Source browser code, we harden it, we remove all of the- anything that would phone home to Google. And when you do that and protect people's privacy- so anything that would collect our users' data- you know, from a third party, without them being aware of it, is something that we block and protect our users from. And we do that by default. And it's probably the most strict and protective privacy policy of any browser on the internet right now that has this many users. And so, when you start with that foundation, and you give people a real clean start with their privacy, then you can kind of build in features and lines of business that are privacy-preserving on top of that. And that's exactly what we've done. And we've grown- we're at about 19 million monthly active users right now. And we're at about almost five million daily active users. And a year ago, we were at about seven million monthly active users. So the growth is quite strong. And I think it's really indicative of a need that people have for privacy protection that has been overlooked for far too long.

Paul: Yeah. And what do you think the drive is behind that (inconclusive), because, I mean, we've seen a few instances over the years, Edward Snowden and then, you know, there's- obviously the most recent ones are the Cambridge Analytica, and now we've got The Social Dilemma, which is out on Netflix as well... are we in an era now of realization that data is being abused, basically? People are coming out of the woodwork and starting to speak up.

Luke: I think there's like been a real natural progression, where, you know, you look at the timeline that you just mentioned, where you've got Snowden, where it's- first it's like- people like Snowden used to be called "crackpots", right? Just for even suggesting that governments were doing that. Then you have this- ironically, right as Snowden becomes- you know, hits the scene, you have this big data business boom, where all these companies kind of turn into their own data warehouses. Right around the same time. So then you have this sense of like, "Okay, the government can get into anything", but now it's like you've got all these mini NSAs everywhere, collecting all of this data. And then, you know, you see what happens with that data collection, when you start seeing all these breaches happening. And now- I think now we're in this phase where, okay, we've seen what the scale of data collection is. Now you're starting to see, "Okay, how are the algorithms affecting everything?" You know, these algorithms are using this data, and how is that affecting, you know, the zeitgeist? And that's kind of where we're at right now with The Social Dilemma. But there's a big awakening. You see Apple talking about privacy. You see others. And GDPR was just a huge kind of catalyst for getting this up. And I think that's kind of the way you have to look at this, is like: How does this impact business? Because everybody wants to- everybody needs to earn money and earn a living, and you can really have a great impact when something with teeth is put out there, in as far as regulation goes with privacy, and people are willing to enforce it.

Paul: Yeah. Yeah. And of course, we're not just talking here typically about the individual users. I mean, there's companies out there that are also using browsers as well to search for products for- and how that's been- you know, the advertising mechanism behind that and- to the lowest bidder, the highest bidder. So let's go on with that a little bit. Let's look at this invasive data collection, as you mentioned. Tell us a little bit more about that. How it's evolved from when you first started it, back in all those years, to where it is now. You mentioned Big Data, of course. How is that all coming together now? What does that look like?

Luke: Yeah, it's a scary picture, to be quite honest. So we went from a time like- even back in 2010, where mobile was first kind of starting to take off. But it was really- mobile was trying to adapt what desktop was doing. But back then, you know, people were kind of in a fixed location. You might have a tracking cookie here or there, but it was more about what the site was and what you were doing on the site. It was very contextual. What really changed was when programmatic advertising took hold. Google bought this company called AdMeld in 2011, after they had bought DoubleClick in the late two thou- or early, mid 2000s. And when they bought AdMeld, they could leverage everything that they had from search, everything that they had from DoubleClick, and then this really, really sharp programmatic auction platform that was already in place with a big network. And so it went from a situation where a site was collecting some data, and it was trying to serve me something that was relevant to the content, to "we are going to track people, and audiences are the target. It's no longer about what you're seeing, it's about who you are and what you're doing online." And once that happened, you started seeing this rise of like new vendors that are doing different types of tracking. Where before it was like a tracking cookie, now it's like a tracking cookie, location services, and beacons, and offline tracking where, you know, Google Analytics could actually, you know, report back from brick and mortar (inconclusive) and other types of offline tracking. And it got to the point where- I remember one day, it was back in like 2012 or 2013, I figured that the second I moved out of my house and got into my car, I was basically being tracked, all the way from the East Bay to San Francisco, and then pretty much throughout the whole day. And the thing that has happened is like...
you go to a web page and it got so bad that basically every page you go to is broadcasting your data to a bunch of different companies that you've never heard of that are all bidding on YOU. And they're doing it for every ad slot that you're on. And these companies all have profiles on you. So if you think about, you know, you could have up to twelve companies on one website, right? And you're traveling page to page. What page you're on is getting broadcasted to these companies, what interests you have. And it's all kind of being compiled about you. So what they have is a bunch of different partial data profiles on you. And then, you know, over time, this builds up, and this data gets aggregated and bought and sold in markets that you've never even seen. Right? Like that's the important thing here, it's this whole huge marketing landscape that's being created and this multi-billion-dollar industry, where even the fraud is around 16, 18 billion dollars a year. This is all made of your data. And you're getting nothing for it, except for some promise about, you know, relevancy with advertising, which I don't think I've ever heard people really ask for. And the degree to which the data is collected and how much of your information is being broadcast, I mean, this is where it gets kind of scary with Google and companies like this: It's that they kind of made themselves into this guardian of your data, while they're also basically, you know, exploiting that data for maximum gain on their side. So it's a- this picture is pretty bleak. I mean, even now it's getting more and more- people are more and more aware of what's happening with privacy and with their data collection, but still - it's still pretty egregious. And in the US, where we have such an impact on, you know, the global digital economy, there's still a lot of uncertainty around what the privacy situation is going to be like here, because there's so much interest from, you know, lobbyists etc.

Paul: Yeah. Yeah. I just wanted to run this one past you, Luke. It's when you mentioned about tracking and just an experience of mine recently: I went into a sports shop here, recently. And bought a resistance band, for working out at home. And within, I would say, probably ten or fifteen minutes, I looked at my phone, looked at Instagram, and it was there. And I never searched for it. I never looked for it. So it kind of freaked me out to the point that: Is it just by chance that happened? Or what I just bought inside the shop, when I went through the till, and I paid for it, that data got collected and accumulated together and then suddenly, I'm getting targeted ads. Whether I actually mentioned the words "resistance band", which- we know that, of course, these devices can listen to keywords and things like that for advertising, but I can tell you: It's just more and more starting to spook me out a little bit. So... yeah.

Luke: And it's really sprawled, too. It went from, you know, okay, advertising companies are collecting data, and Google used to separate this out. They used to say, "Okay, my Gmail data, my other data - Google Docs, other Google data - would-" there'd be a barrier between that data and data that was collected for advertising. Well, in 2016, when ProPublica broke their story, they kind of quietly changed their privacy policy to remove that barrier. And they redesigned their APIs, to basically work together. So that data that's collected for advertising or for mail could be used for advertising, or vice-versa. So that's when it really- it really took off. And that's also right around when you saw just this meteoric, just, you know, boom and bust of advertising. Like there's been a lot of disruption within advertising, because Google is taking so much market share. And that's still- that's the ironic thing about things like GDPR: Google can go and fight in the courts forever. They have enough money to deal with all of these- any lawsuits. But the other companies in the advertising space were just getting crushed. So what you'll end up seeing is like a stronger Google and a stronger Facebook with this. But I don't- that said, it's not a bad thing. I think that, you know, these- it puts it more front and center, and it lets you kind of really have these companies start having to answer for these- you know, for these issues that have been going on for a decade or more.

Paul: Yeah. Yeah. And I think, on that point, when you talk about the likes of Facebook and Google, and they've collected this information about you for so long, and a lot of people have just got used to it, and they say: "Well, so what? What does it matter?" So maybe, Luke, you can take us on a journey of: Is it harmless that they just collect this information about us, that we accept all the cookies, and they feed us content, which they think is relevant to us? Is there any harm in that? Do we see any harm in that?

Luke: Yes! I think that's been the really kind of- the thing about advertising is: They- people in the space, in ad tech, will make an argument that says, "Well, this is anonymized data," and, "We don't know you. We're just trying to make sure that we're not giving you adult diapers or something like that with advertising." And the reality is that it is invasive. Your- like every ad request that goes through Android goes through the Google Play Store. And then there's the amount of data that's collected on your device: You're using it for navigation, you're using it to connect to your home wi-fi, to connect to entertainment... it's your hub. And right now, it's a free-fire zone for data collection. And it- not only is it about tracking who you are, but then that data can be used against you. And I'm not sure how clear this is in Europe, but in the United States, you know, there's more and more stories about credit bureaus and rating agencies basically buying data from advertisers, or that was recently collected from advertisers, and using that to impact how they score you when you're trying to get a credit rating and a credit score.

Paul: Right.

Luke: And so, you know- and they can claim all day that this data is not, you know, invasive or that they're doing, you know, the right thing, but the reality is: You know, when you get sick or you do something- you're not feeling well, you search for it on Google. And they're not doing anything, you know, to separate that data out. Right? And like these companies, you know, the practices- I mean, working with these companies, the problem that really happened was, and what's kept other solutions from differentiating or innovating, has been that there has been such an incentive for as much data collection as possible for as long as possible that there has been no business case, you know, ever to really clean up the act and do something differently. And that's basically what we're trying to do at Brave. And that's really kind of what motivated me to take that risk to leave, you know, a pretty cushy gig to go over to this new thing. Because, you know, nobody was doing this in this way. Yeah.

Paul: Yeah. Yeah. And I think what I'm fearful of as well is - (coughs) excuse me - that we had a lot of media attention focused on China, and about basically grading people in society about their behavior and the way that they are. And I think what you're indicating, in a way, is that, well, that is also being possible- that's also possible in Western society as well, that this can happen.

Luke: Yeah.

Paul: It's just not so much in your face. It's basically: If you go to an insurance company, they might have a wealth of data on you already that you don't know about. About your lifestyle, about the way that they think that you could potentially be a health risk, without- and making, you know, your- the premiums for your insurance, for your health insurance or whatever that may be, exceptionally high. And yet you don't even know why that case is. And yet they got all that data ready to hand.

Luke: Yeah, and there were also cases around, you know, price discrimination and things like that, that were- are being alleged. I think, you know, that's the big deal here. It's that you go to these sites even and they have advertising, right? So they can attribute an ID there, they can back it up. But there- there was a new type of company that started emerging, you know, several years ago, called "Data Management Platform". And what they did was basically kind of work as a big kind of collection house and collation service for all of these different data sources. But what they could also do is go to data brokers and buy offline data, so purchase data. Things that they can tie together more. And the thing that really got scary was that, you know, by 2016, you've had this happening for years. And the amount of history that can be tied to one person- and people will say, "We're using anonymized IDs," or whatever, but your driver's license has an ID on it. An ID number. Like and that doesn't necessarily say your full name, but that's attributed to your name. And what happens that people don't understand is: Even if a company does say, "We process this, we throw the records away," other companies- they're sharing that data with other companies. Those companies are creating a new ID, photocopying- basically photocopying that information into their own database, and then, you know, persisting that information longer. So even when they say that they're doing the right thing, they don't really know. And that was- the interesting thing about GDPR was, when that came out, no one really took it seriously in the US, and then, all of a sudden, all these US companies- there were newspapers that were basically gating off content. 
If you were in the EU, they would just prevent you from seeing their site, because their legal teams came in and said, "Hey, we're still trying to figure out, okay, where's our line in the sand as a publisher collecting data, but oh my gosh, what is going on with these advertising, you know, with this advertising piece over here?" And that, even within these companies, you've got maybe one or two folks on the development side that understand how advertising works. But even the developers building their publishing site don't necessarily understand, you know, how far-reaching this data collection goes. So it's been really eye-opening, I think. And we're still kind of in that phase. I think if you look at like Alexa and, you know, Google Home and all these things, they're listening all the time. Your phone can listen. And the problem with these things is that, you know, Facebook can have one policy. Facebook Messenger can have another policy. Instagram can have another policy. And they're all kind of owned by the same entity, but they can shell-game it around to where they can say one thing and do another.

Paul: Yeah. So... I think what we're leading to is: You have got like a literally- a digital personality, profile of you, online, about everything that's happening to you. And that's being traded. Right? That's being traded to potentially the highest bidder. So there's a market out there for us. For our digital way of life.

Luke: Yeah. And the way that it worked was that basically, you know: Whichever company that was bidding that had the best data on you- or whatever company had the best data on you for the opportunity would know what the best price was to bid against. So they would know, "Okay, this is a good price for me to bid at, this is not a good price for me to bid at." But what you started seeing, and this is the thing with advertising technologies, like: These aren't dumb people. Like these are very smart people, and they like to gamify these things. And in- for what it's worth, I mean, like the rules are pretty- there's not a lot of rules around what you can and can't do. And that was the other surprising thing and the other thing that really got me, you know, interested in leaving to do what I'm doing now, is that, you know: We have to build our way out of this. It's not going to, you know- the regulations help to a degree, you know, they bring awareness, but there's still an open question around enforcement. Because if you throw nickel-and-dime, tiny lawsuits at these big tech companies, they don't care. They will wait that out. And even if they lose, they can still win, because they can just not pay people and then take it into arbitration for years, right? And so they're such a- the scales are so tipped in the wrong direction that the only way you can do it is to kind of figure out a way to Trojan-horse good privacy technology to the mainstream. And that's kind of what we've been doing. We can- and that's what's been interesting about this journey, is that, you know, we don't collect data on people. So we don't even necessarily, especially early on, didn't have any idea who our audience was. And- other than, you know, what we've seen from, you know, open source code or, you know, social media etc. But, you know, over time, this gets to be a huge cohort.
And we learned that you really got to just play up the value like, "Okay, you can have a three times- three x performance increase from blocking all this stuff. You can have a faster experience. You can have a cleaner experience. You can have better battery life. You can have less data on your bandwidth for your bill. You can do all these things that- while protecting your privacy." And that's kind of been the angle that's worked really well for us. And, you know, how do you kind of- how can you side-step a lot of the eye-glazing, you know, really complicated mess? And it's just by communicating, "Well, what are the really good side effects of this?" Right? Like... and then like also kind of breaking the ice and saying like, "Look, ad tech will say this is all very- you need all these complicated things, but if you really break it down, what you need to do is match a business or a service or a thing to a person. And if you can do that effectively on one end, with all of this data collection, or on another end, with none of it, and none of that liability or risk, I think the one that has the less risk and the less liability and the efficient way is the one that'll win." And so that's the kind of- that's the case we're trying to prove at Brave.

Paul: Yeah.

Luke: It's that: You can do this without having to collect all this data. Where- even though the industry is saying, "No, you need all this data." Like that's kind of where we say, "No. I don't think so." It's 2019, 2018, 2020... we can innovate. We can do things locally on the device with the data that's there. And that has been kind of our focus.

Paul: Yeah. I just wanted to touch on a point which you mentioned with GDPR. Do you think, personally, from your perspective, that- you said it's a step in the right direction, which might still suggest that there's still a way around it, there's still a loophole there for companies to be able to exploit somehow, some way around that. Whether they're storing data or collecting data from a third country, like a developing country, which doesn't have such strict data laws, which is not effective. What do you see around that?

Luke: I think, you know- I think that there's- GDPR is really- it's really great. I don't think that it's necessarily dead or that ineffective. It sets some good base rules. The real question is around how it's enforced. Right? Because we- Johnny Ryan, my former colleague now- at Brave, he's now working in Ireland, specifically on these issues. We took a complaint to the Data Commissioner in Britain and basically said, "Look, you know, every time a user goes to a web page, their data is being broadcasted. There's no way that this is compliant with GDPR, no matter how much the industry says it is." And the regulators actually agreed with our complaint. And they, you know, validated that in writing and published a post about it. But then the problem was that even though they evaluated that this was incompatible with the law, when COVID hit, we saw messaging from them that said, "Well, look, you know, we're really not going to go overboard enforcing things that are going to negatively impact an industry at this complicated time or whatever the-" Some can- kick down the can- kick-the-can-down-the-road kind of excuse. Right? And so that's where- if there can be some enforcement from the regulators- and the regulators are waking up about this. And that's the interesting thing- even in the US, where, you know, Johnny testified in front of the Congressional Judiciary Committee and the Senate about this, and you saw these regulators in the US, at least, you know, Congress was sharpening their knives around what they can do with these companies, because they're impacted by this, too. And that's the thing that's interesting about GDPR and about privacy regulation with politicians and regulators is that: Every political season, at least in the US - I don't know how it is in Europe - we are bombarded with political advertising. And so it's an area where they're actually- they actually have to kind of, you know, pay attention. And it impacts them directly.
And so there's a bit of a win there. But I think, you know, it's still very much a dog fight between lobbyists for the industry, who have been, you know, padding their wallets for a very long time off of data. And then also racing to get innovative technologies to market that can replace these things. I think that's the real thing that matters right now. And that's what's interesting about like Tresorit and all these other companies, too, where it's like if we can all kind of work together on really pushing innovation, and really building a market for this, then there is an alternative that people can go to. That's been the hard thing. Is that there hasn't been. Right? And so there's been no incentive to do that. But now there is, and now we've got some of the biggest brands working with us, and, you know, experimenting with this, because they just want to differentiate and be part of this privacy wave of what's happening.

Paul: Yeah. And I think- yeah, to your point there as well, that it is about privacy, and people taking their privacy back. When they've never had that opportunity to do that, because the technologies had not existed there. As much as we do security by design, I think now there is a case in technology to do privacy by design as well. And I think that's what ourselves- and you guys are on that mission as well. It's to give that privacy to the customer, because that's their fundamental right to have that. And, of course, you can sign up and you go visit as many web pages as you want, accept the cookies, but people are just not aware that you could- there is an alternative way to googling it, right? That you can brave it.

Luke: Right.

Paul: So you can bring it out there. So tell us a little bit more about what's the alternative then? And... here's Brave. Okay, so. Here's Brave. And what's the alternative, and how are you guys making it work for you, surviving?

Luke: Yeah, yeah. So Brave's- like I said earlier, Brave's a privacy browser. And- but we're more than a browser. We're a software company. And we see ourselves as a platform. We're almost like a super app, where, you know, if we can start with that foundation of really strong privacy protection by default, and not waver with it - and that's the other thing: It's that, you know, we've gone from, you know, a small handful of people to like over a hundred on staff. We've gone from, you know, a couple of hundred users to nineteen million. And we've only gotten more stringent and diligent about what we protect around privacy. And we haven't caved. And as long as we stick to that ethos, it's important. But you start by doing that. You make it more efficient to browse. You make it cleaner to browse. You make it a better experience. Right? And that's what you get with Brave. You get- you get a browsing situation where your content isn't fighting against the advertising. And if you try Brave for like a week, and then go back to Chrome or whatever browser you were using before, it is a noticeable difference. And you don't realize it at first, but after a while, you know, you're like, "Gosh!" You know, like, "Something's missing here. Something is different." And then you go back to Chrome, and you're like, "Oh, yeah! My content is- has nowhere to breathe here." And so... you know. And that's the main thing. I think, you know. And then the other thing we did was: We had a- we'd been really into privacy for the mainstream and also like the blockchain technology. And it's been interesting, because you've got two of these very complicated topics, both kind of, you know, emerging at a similar time. And so we had a token sale. We made our own blockchain-based utility token called "The Basic Attention Token".

Paul: Okay.

Luke: We sold that in May of 2017. And the whole point of that is to become a unit of account for attention within the platform. So in advertising, the way it kind of works is: you've got advertisers, and they have a bunch of companies that track users and publishers. And then you've got publishers that track the advertising and the users. And all these companies are basically measuring the same thing, 'cause they don't trust each other. Our aim with the token is to basically say, "Look, we can make this available on a public blockchain" - the measurement information that's being reported - and then make it accessible, but not linked to any person. We remove that link and make it truly anonymous. And then provide a means for accounting, so that you don't have to have five or six vendors measuring the same thing. Everybody can go and look at a single source of truth, and have high integrity there. And then have high integrity with the browser. Because doing all this with JavaScript - and it's ironic, 'cause our co-founder created JavaScript - is too low integrity. Too many people have been able to take advantage of it; the whole industry is kind of built on top of ad fraud, right? Like, you can go online and buy traffic for your website if you want to. Even now. So if you really want to take this seriously, you have to use high integrity, and work directly from within the browser architecture instead of trusting all of these untrustworthy sources.

Paul: So there's the new internet there, right? That's what we're talking about: the new web browsing experience. And bringing those benefits you just mentioned - speed being one? (inaudible)

Luke: Yeah, well. Speed, battery savings, data savings... The other thing we're doing with the token and with our advertising model is that we've set up a new way of exchanging value, right? So advertisers buy privacy-preserving advertising on our platform, and we created a reporting protocol that is able to report that an ad was viewed for a campaign, an ad was clicked, a purchase was made, yada, yada... None of that links back to an individual user. And so that's a really innovative thing that we've done. And we've changed the way that ads are matched. You have a browser, and the browser knows what you're doing - that's your local data, just like your browsing history, where you have a full record on your device. So what we did is: we shipped the matching technology into the browser. It doesn't have to go to the cloud anymore. It all happens locally, on your device.

Paul: Right.

Luke: So you put a really smart way of matching ads on a smart device, and you can do it more effectively, because you have a full profile to work from. But the other thing we do is: we reward you. You view an ad with Brave - you opt in to advertising that's privacy-preserving - and you earn seventy per cent of the ad revenue in our tokens, so users are finally getting value for their attention. And breaking it all down to the most simple terms: we win here if maybe one out of ten users has an aha-moment, where they realize, "Wow! My attention does have value!" And an even smaller cohort of that says, "Well, wait a minute. Why haven't I been getting anything for that value elsewhere?" Right? And so that's the other thing that we're doing here. We're trying to create a new way to allow people, publishers, and advertisers to do business online that doesn't require all this horrible data collection.
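The on-device matching Luke describes can be illustrated with a minimal sketch. Note the assumptions: the category names, the ad catalog, and the frequency-count scoring rule below are all made up for illustration and are not Brave's actual algorithm - the point is only that the interest profile is built and consumed locally, so it never has to leave the device.

```python
# Hypothetical sketch of on-device ad matching: the browsing history never
# leaves the device; only the locally chosen ad is displayed.
from collections import Counter

# Local-only data: pages the user visited, tagged with interest categories.
# (Category names and scoring here are illustrative, not Brave's.)
local_history = ["sports", "tech", "tech", "travel", "tech"]

# The ad catalog is downloaded to the device; matching happens locally.
ad_catalog = {
    "laptop_sale": {"category": "tech"},
    "flight_deal": {"category": "travel"},
    "team_jersey": {"category": "sports"},
}

def match_ad_locally(history, catalog):
    """Pick the ad whose category best matches the local interest profile."""
    profile = Counter(history)  # built on-device, never uploaded
    return max(catalog, key=lambda ad: profile[catalog[ad]["category"]])

print(match_ad_locally(local_history, ad_catalog))  # → laptop_sale
```

The design point in the interview is that the cloud-based version of this would require uploading `local_history`; shipping the matcher to the device inverts that flow.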

Paul: And is that changing the cycle or the mentality of the actual advertising companies as well? I mean, you're growing, and you're onboarding advertisers. Are they believing in the same thing that you guys believe in? Have they seen enough of the old model, and do they want to move into this new world that you're creating?

Luke: Well, I'll say... it's been an evolving story. When I started here back in 2016, we would go into meetings and talk about privacy in advertising, and people would kind of look at us sideways and not really know what to do. But then blockchain technology became popular with marketers, and now with privacy becoming front and center, there's been increasing interest - even from the big holding companies; there are five or six of them that handle ninety per cent of advertising, right? They're realizing that right now they can start working with us, test it out, get a feel for it. With the ones that are really eager, we work closely to make sure that we're shipping features they need. Because it's a competition between them spending their ad dollars with Brave, or with Google or Facebook. So we have to be able to not only protect privacy with all of this, but also be competitive. And the advantage we have is that if you have a machine-learning model in the browser that's working with your local data from the browser level, it has a much clearer picture of what your interests are and what you're trying to do. Does this person intend to buy? Is this person just trying to research something? And so what we put a lot of care into is making sure that when we serve an ad, it's separate from the content - it comes as like a notification. And we also try to make sure that if it seems like the user is reading or in the middle of something, they're not disturbed. But our thesis is that we can show a user between one and five ads an hour, instead of up to five ads per page, which is what happens outside of Brave.
We can show them up to five ads per hour, and have it be more effective for marketers, because we're using this technology, and because we're working with a better set of data. So that's the part where, once they start to come in and play with us a little bit, they see, "Okay, there's a funnel here. I can actually see people buying. Things are leading to purchases. Okay, I'm feeling comfortable about that." Because the thing that's interesting with advertising is that despite all of this data collection, when you look at how marketers look at the data, they don't look at it on the individual level. They look at it in aggregate.

Paul: Yeah.

Luke: And so that's kind of where we're coming from. You don't have to have personal information to do that. It just has to perform effectively and match things. And by having ads be opt-in in Brave, the whole audience that sees those ads wants advertising. They're okay with an experience that has advertising. And that's the other big difference here: you go from an opt-out experience, where everybody is shotgunning you with ads all the time, just hoping something sticks, and trying to measure whether something even counted as a view, to an experience where a hundred per cent of the people that see your ads want an advertising experience, and they see value in that. Our engagement rates are way higher than industry averages. We have like 6, 8, 10% click-through rates, things like that - it seems really crazy. But when you have a focused audience that wants this, and you respect their privacy and value them in a new way that's closer to what the original internet was kind of hoping to do, it yields results. And we've been working with big agencies and holding companies, and little brands, too. We've been working with content creators and having them advertise with us, and we can show them that we can drive an audience to them and get more people to view their content and discover their work. So there's a lot of different areas that we've been going after with this, but so far, it's been pretty good. I mean, we've been out for a year and a half with ads. We've worked with some of the biggest agencies and holding companies on the planet. And the opt-in rates are around twenty to thirty per cent of our audience opting in to advertising.
And we have goals to bring that up and work with rewards and try to give people more of what they want. So it's all about breaking that ground initially, then getting it on every platform, then getting it in as many countries as you can, and then building it up from that. And that's the phase we're in now: building up from that.

Paul: Yeah. A couple of points you mentioned there: as a user going online, I want to take control of what I want to see and what I don't want to see. And nobody gives you that capability - until now, of course, with technology such as yours. But there's one of those things: for example, you get something for free - you get Instagram, you get other applications for free - but you're not really getting them for free, because, as you've mentioned, your interactions with them are being sold off. And not only that. You can't really use the application effectively if you're not willing to accept the terms of use. And normally, deeply embedded in the terms of use is that they want access to your camera, to your microphone - this is how these things work behind the scenes, of course. There's certainly a generation that grew up with the internet whose attitude was, "We just accept everything." But would you say now that younger people are more security-conscious these days? More privacy-conscious? Is there a new generation of young people growing up more aware of this?

Luke: I would say once they understand what's happening, they're much more upset about it. They're much more vocally active about it. The funny thing is: we would go into meetings, and people who were skeptical would say things like, "Well, younger people don't care about their privacy. They're on social media. They're broadcasting all of these things to all of their friends and family, etc." But really, if you asked those same people whether they're cool with someone looking at their private messages and their DMs, they would all say no. They expect a level of privacy there. So privacy is an inherent human desire, right? And I think the challenge for us, especially in the States, has been: how do you communicate what privacy means, what data means? And a lot of the time it just comes down to basics, like, "Do you have window shades on your windows at home? And at night, do you leave them open?" If you look at it from that angle, every time you go to a website, it's like leaving your windows open at night with the lights on in your house. So what we've been trying to say is, "Look, we've got this window shade here that can let you do what you want to do anonymously on the web, and not have all of this other stuff getting in the mix." And young people resonate with that. We did a study with the University of Oregon, with a marketing class, and we went in there. And one of the things I asked them afterwards was basically, "Out of the value props that Brave has - rewards, speed, performance, data savings, and privacy - which of these resonates the most with you?" And beforehand, I anticipated everyone would say rewards, because people like getting things for free, or getting things rewarded to them.
But over half the class raised their hand for privacy. And that blew me away. 'Cause these are 18- to 21-year-old students who all feel passionately about this. Their task was to take a 500,000 dollar budget and tell us how they would market Brave for a year to college students. And they came up with some really clever, novel ways of doing it. But it all broke down into very simple terms. Like, "Do you want people to know what you're doing all the time? Or do you expect some level of privacy?" And they all think that this stuff is way more private than it really is. So they either learn about it, or, unfortunately, there's a breach where they're impacted and they have to learn it, right? And that's been the other educating tool: identity theft is very real. And when you go through it, it's really horrible. You've got to deal with the financial impact. And that's the difference now compared to a few years ago: you've got purchase data and advertising data coming together. And that makes it that much more real. Because for people out there, it's livelihood. It impacts your livelihood. And if your credit score, or that home loan, or something you need is being impacted because of this data collection, that's not really fair.

Paul: Yeah.

Luke: Like and that's not what people think is happening. So that's kind of where we're coming from. It's like: That shouldn't be the- that shouldn't be the norm.

Paul: Yeah. I'm just going to take us forward. Because I think we've got about five minutes left. Just to get your feelings about- I don't think you've already mentioned this, but certainly, going towards the future... what do you see from the ad space and from, you know, the technology that you're providing with the web browser, well, let's go beyond the web browser to a platform, right?

Luke: Right.

Paul: Where people can engage with the trust and the knowledge that Brave has got their best interests at heart, as a consumer, and the advertisers on there are respectful. I mean, are we hopefully going to go into a world like that, into a direction like this? Is this the future now?

Luke: Yeah, that's certainly what we're trying to prove, and what we're trying to do. And if you look at our growth rate - going from zero to one million to seven to nineteen million, right? - it shows there's demand for this. And for us, the first few years were about proving that we can make the privacy case work. Now it's about, "Okay, how do we take that case and make it competitive with an industry that has not only been collecting data egregiously, but has also been doing it to make things more convenient?" So our focus now is, "How can we go beyond the four walls of a web page? How can you hook other services that you use into Brave - services you maybe used an app for that may have been doing things with your data that you hadn't realized? How can we tie it all together in a single platform that is protecting you?" Because the browser is the user agent, right? And everybody else's user agent should be fired, because their agent has been freeloading and giving away all of their data to everybody. And where we are is about putting up a shield and saying, "No! Not only can we start with the internet, with web browsing, but we can also hook in your home, your home entertainment. You've got Ring, you've got all these things where you have concerns that the data might be stored in the cloud and something might happen... well, how can we adapt that to work locally, and give you what you're looking for?" So building convenience on top of privacy is very important, because if we don't do that, then we can't get the mass scale we need to really impact change and make things better.

Paul: Yeah. And I think, along with that, going into the future, you're also advocates of regulation. We've had a little, brief conversation about this antitrust case, and this purpose limitation. Maybe you want to have a few words about that, before we wrap up, as well?

Luke: Yeah. I think this is what's really interesting. In Europe, privacy is very clearly defined. Personal data is clearly defined in Europe. In the US, that's not the case. But in the US, we have very strong antitrust laws that have precedent, right? And the interesting thing about big tech companies like Google and Facebook is: Google used to separate your advertising data from your other data. But when they changed their policy in 2016 to mingle these things together and remove that silo, it goes into the space of purpose limitation, where, if you're collecting data about me for advertising, that should only be used for advertising services. It should not be used to get an advantage over a competitor in another service that you're promoting. And the thing is, Google has been commingling this data, and that's how they've become so big - by coming in with a new option... Gmail, you know. They can go against Zoom with Meet... all of that's based in a central location. But if you look at the network traffic for these web pages, they're using like Play Store requests and things like that on a desktop browser. Why would they be doing that? That should be on an Android device, right? So in the US, there are antitrust cases being built. And the first documents for those have come out, and they're pretty interesting. It looks like there might be some teeth to this. But we'll see... I'm always kind of skeptical around these things, 'cause they take a while. But it's a pretty glaring case. When you have 90% of the search market, 70% of the mobile operating system market, and 70 cents out of every digital ad dollar going through the same company, right? It's almost too big to fail. And in those cases, there's definitely an impact on competition. And it's holding up innovation.
Right? It's like what the oil companies or the railway companies used to do. They can buy out the competition and shelve them if they want. And that happens a lot in technology. So it'll be interesting to see what happens in the US. I think we'll probably see more talk about antitrust in the US, or impact from that. But the reality is that if you're a US business doing business in a global economy, you have to care about GDPR, because people in Europe are impacted by it. And that was the other interesting thing about these regulations: if your data centers are in one country, and your users are in another country, and you are in a third country, you have to care about what the data policies are for all of them. And there are over a hundred countries that have data policies now. So it's a global movement. And I think with antitrust, we'll see impact in the US. You could actually see something happen where they end up getting broken up. Because the thing with Google - just really quickly - is that they're so massive, right? You can have Chrome with 70% of browser market share, plus YouTube. And if they want to experiment with something that isn't part of the web standards, they can make an API that they use, put it in the market, easily get over 1% of users using this API, and kind of force it into the standards bodies, because it's in a service that everybody wants. That's a real big signal of being too big to fail. And you're seeing it happen, where they're changing the way that ad blocking can work. And they're changing the way that people even use URLs.
You used to be able to block things by URL. But now they're doing this thing called web bundles, which obfuscates all of this. It makes everything almost look first-party. And it makes it really hard for privacy tools to block things that are collecting data, when the URLs don't even make sense anymore.
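To see why this undermines blockers, consider a simplified sketch of URL-based blocking. The blocklist entries and URLs below are made up, and real content blockers use far richer rule formats; the point is only that hostname-based rules have nothing to match when a tracking resource is repackaged under the first-party site's own URL space.

```python
# Simplified sketch of URL-based ad blocking. Trackers are traditionally
# recognized by their third-party hostnames; if a bundle serves the same
# resource from the first-party site's URL space, this check can't fire.
from urllib.parse import urlparse

# Illustrative blocklist of known tracker hostnames (made-up entries).
BLOCKLIST = {"tracker.example", "ads.example"}

def should_block(url: str) -> bool:
    """Block a request if its hostname is on the tracker blocklist."""
    return urlparse(url).hostname in BLOCKLIST

print(should_block("https://tracker.example/pixel.gif"))      # True
print(should_block("https://news.example/bundle/pixel.gif"))  # False:
# the same tracking resource, served first-party, slips through.
```

This is the dynamic Luke is pointing at: once URLs no longer identify who is actually receiving the data, tools built on URL matching lose their signal.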

Paul: Yeah.

Luke: So... I think there are all these different impacts from these companies being so huge, and I think that's where you'll see the movement in the US - probably around antitrust.

Paul: Yeah. And just on that point as well, when you mentioned several countries and all the complexity with regulation: it's not just GDPR anymore, or the US. We're starting to see other countries, like Brazil and some of the Asian countries, producing their own data protection laws as well. So we could be looking at a very regulated and complex market for data in the future. I'm glad that you mentioned that. And going forward, it could be quite tricky for companies to navigate, so...

Luke: Well, that's one of those things where it almost becomes a competitive issue for the US. And that's a position we've taken: when there are situations with this much uncertainty, and you've got other countries creating more certainty around it with regulation, it actually causes us to slip, because people here aren't really sure what they can and can't do. And we drive a lot of the economy. So having some policy that we can at least get a foothold in would be helpful, to remove some of that uncertainty.

Paul: Yeah. Well, thanks a lot, Luke, for your insights and everything about Brave. I'll probably be making the switch some time soon and getting away from all that. I know there are a few new applications out there like yours, but I'm definitely going to give it a try, and I certainly encourage the listeners to give it a try as well. I'm certainly looking forward to seeing the performance increase. Thanks a lot for being with us! I wish you all the best with it. Hopefully we can get you on the show again some time in the future, and you can give us an update on how things are developing. All the best!

Luke: Absolutely! Thank you so much! And yeah, just go to, and you can give us a test drive, and if there's any issues, just hit me up at, and we're happy to fix them.

Paul: No problem! Thank you very much for your time, Luke! Take care!

Luke: Take care!

Paul: Bye, bye! And that is all for today's episode of "under CTRL". You can find links to all our social platforms and to our guest in the episode description. If you like the show, make sure you subscribe and leave a review. Join me again in two weeks' time for the next episode.