DeleteMe: Disappearing from data brokers’ radar – an interview with Rob Shavell

Millions of spying eyes are following your digital trails whilst you’re surfing the net, watching videos and scrolling your phone, so any paranoia you might have about who these eyes belong to is well justified.

The ecosystem that drives this – from Google’s prolific data collection techniques to data brokers’ trading businesses and advertisers’ convoluted cookie policies – is hard to navigate, which makes it difficult for individuals to understand how their data is being collected, sold and exploited.

There are a few things that users can do to reduce their visibility, and removing private information from the internet is one of them. Such a cleanup exercise can feel tedious and time-consuming at first (the process involves trawling through databases and submitting opt-out requests so your personal information is deleted), but there are more straightforward ways to reduce your and your family’s digital footprint – a topic we explore in our next under CTRL episode with DeleteMe CEO Rob Shavell.

When Facebook went public in 2012, Rob and his co-founding team decided to take action and build a privacy service that helps individuals delete their personal information from data broker sites. And so, DeleteMe was born!

Here’s an overview of what to expect from our talk:

  • How Google’s business model, refined by algorithms and reinforced by a suite of additional services, works to track and trace users and build hyper-accurate consumer profiles
  • Rob's predictions for the biggest threat to online privacy
  • What services like Spokeo, Radaris, Intelius and Acxiom have in common and why the US market provides the ideal conditions for data broker companies to thrive
  • How online data brokers constantly adapt to what Rob describes as a "game of cat and mouse"
  • And, last but certainly not least, how DeleteMe chases down the latest services to prevent individuals from falling victim to new data hunter sites.

Like the sound of what you’re hearing? Check out the last episode of under CTRL featuring online security expert Troy Hunt – and stay tuned for more under CTRL episodes on Spotify. You can also stay connected with all things Tresorit through Twitter and LinkedIn.


Paul: Hey, everyone! Welcome to "under CTRL". My name is Paul, and today's guest is Rob Shavell, CEO of DeleteMe. Many of us are faced with the difficulty of navigating privacy issues in today's interconnected and digital world. The ways in which third parties collect people's information are constantly changing. When someone googles you or your family, more than 40 data brokers are selling your personal information within the U.S. alone. Is it possible for people to remove this information? DeleteMe is on a mission to empower consumers to do just that. Hey, Rob! Welcome to the show.

Rob: Hi! Thanks for having me.

Paul: No problem. Good to have you on the show today. A very interesting topic to discuss indeed, around the data brokerage. So first of all, let's- typically, we get started with some background about yourself and about DeleteMe. So maybe I'll just hand it over to you, and you can give us some insights about yourself and how you got into creating DeleteMe.

Rob: Sure. I'm an entrepreneur. And about ten years ago, right around the time Facebook, a company most of us know and are pretty familiar with, was going public on the stock market in the United States, myself and a couple of other entrepreneurs decided this was a perfect time to do the opposite of what Facebook was doing. Which was basically taking all our data again, for free, and then selling it to advertisers and data brokers and things like that. And so we wanted to create a company that cleaned up your data and removed it from different data brokers and places where it existed on the internet. And so that's what DeleteMe is, and that's what DeleteMe does. It's a service where you tell us a little bit about yourself, and then you- and then our privacy experts go out, find what information is out there about you on the internet, and opt you out and remove all that information from all the sources that we possibly can.

Paul: So back then, you could see already that- what the future was looking like with the collection of data, yeah?

Rob: Well, as an entrepreneur, you try to live in the future.

Paul: Yeah.

Rob: But you try not to live too far in the future that you can't build a real business. And you know, one of the problems in the privacy business has been the business model. Because we can't sell data. We can't do advertising. We have to give enough value to our customers that they pay us directly for it. Because there's no other way that our customers would trust us. So that has been an issue, because we're competing- all privacy businesses that try to do the right thing have to charge their customer. Because Paul, as you know, if you're not paying for the product, you are the product.

Paul: Yeah, it's become a bit of a cliché now, but it's very, very true. And I think we're seeing that now more than ever. Okay, so it's basically wherever you go, in every way you're touched by some kind of personal ad, because they're building this profile up on you. So let me just ask you, before we move on, Rob: Did you ever think that it would get to this- to the stage it's at right now? Where you see the likes of Google and Facebook getting dragged through the Senate, and all the furor about personal and private- or private information that we handed over. Did you envisage that as an entrepreneur? Did you think that we'd get this far? Or has it gone even further than what you expected?

Rob: You know, it's a great question. I wasn't- I wasn't a hundred per cent sure, but I'll tell you: One of my co-founders believed that it was going to be - and it will be - much, much worse than it is. And he- when we started the company, he used to say to us, when we were having drinks together and brainstorming what DeleteMe would look like on a napkin, he used to say to us: "Data is like oil. It is toxic. It is a- it is a thing that all these companies are going to have to manage and pay to take care of. And it is a huge, huge problem." And I think we are just beginning to enter the decade where companies are going to have to understand that, and manage their data just like companies are managing the chemicals and the waste products that industrial manufacturing has to deal with now. So data will beco- will go from being just an asset that they can do anything with and make as much money from as they can to, in many respects, the opposite, where they have to take care of it and it will become costly to do so.

Paul: Yeah, and it's very much, when you think about oil and then the by-products, it's a bit like data as well. It's the by-products coming off of that- yeah, of your personal information. So we mentioned- I just mentioned there about Facebook and Google. And I think Google is obviously one of the biggest- the biggest coll- about collecting personal data, your searches and everything that you do. And so, do you know a little bit about how Google's doing it right now? About how they're storing your personal data?

Rob: Yeah. You know, we do know that Google is doing an incredible amount of artificial intelligence and machine-learning based on the data that they collect. And we know a little bit about the data that they collect, because we can see in the Chrome browser, for example, we have a tool, which is different than DeleteMe, called Blur, which you install into the Chrome browser. Which, by the way, in the United States has about a 93% share of the browser market right now. So a huge monopoly in terms of people surfing the internet, using Chrome, at least on their desktop computers. Obviously, on mobile, it's more of a fight between Apple and Google. But the point being is we can see all of the data that that Chrome browser is collecting. And it includes every site that you visit, how long you're on that site, every search that you do into Google, which is the default search engine, obviously, unless you switched to something like DuckDuckGo, which by the way we recommend.

Paul: Yeah.

Rob: And so they are- and they collect a ton of additional information, including potentially every keystroke that you put into every form field that you- that you interact with. And every login and password that you store inside the Chrome password manager. Your credit cards, the transactions that you do... And here's the real problem: As if that wasn't enough data for Google, they're providing a whole lot of other services that many of us use. Not just our Android phones. So the data on our location, wherever we walk around with our phones, but also increasingly our business documents. A lot of people use Google email now, Gmail, for corporations. Not just personal Gmail accounts. So every email you get, every account email message that comes into your inbox, all of your friends that send you emails, their email addresses. And they're connecting and- you know, for what it's worth: Google is an incredibly smart company that hires and pays top dollar for the best talent. And I can tell you, from having lived in Silicon Valley for many years, when you walk around the beautiful Google campus in Mountain View, California, the sun is shining, people are riding around on red- and green- and yellow-colored bikes, and these people are the smart- and they're having free lunch where- you know, whatever they want. It is a beautiful campus. And it attracts the best, smartest people in the world. And part of the problem with what does Google know about you is that these people are taking all that data, and they're analyzing it in probably the smartest way that you can- in the- the smartest people in the world today can analyze data.

Paul: Yeah. And you mentioned something there, which actually I'm watching a documentary on at the moment around AI, mainly based in China and what China- you know, with the tech wars going on, about taking the smartest people in China, and China being in this tech war with the U.S. What is it that you see that we got to fear from AI? When you say that Google use AI, or maybe some other- like Facebook uses AI. Is there anything we should be fearing? Is it helping us? Or is it potentially a bigger threat to us as well?

Rob: Yeah. It's a good question. And I think it's really a big debate right now in Silicon Valley. When you go have dinners with the people that run the technology companies there, they're sitting around debating whether AI will be a huge problem or not. I don't think- personally, I have a slightly different view on it than I think most people in Silicon Valley. They- I think the prevailing view is they believe that AI will take over and do most of the analysis that people are doing right now, to make decisions, won't be done by people anymore. So that would affect a lot of things. A lot of jobs, a lot of decision-making. That sort of thing. And I think it becomes a problem for regular people like us and probably the majority of your listeners, when it affects our lives. For example, we're trying to get health insurance, because we get a new job. Well, all of a sudden, there's a whole bunch of data and AI making a decision about whether to give us health insurance or how much we should pay for it, that we have no idea about. We have no- we have lost any control over that decision, because it's being made by a bunch of data that we don't know about, but that is- that was once somehow belon- it somehow belonged to us. So the problem with AI is that it will be making many decisions. Even if it doesn't turn into a giant human-like robot, like in the- you know, in the movies or in science fiction. Even if we don't get there, it's going to make a lot of decisions about us that are important to our daily lives, and that we may not have any control over or any visibility into.

Paul: Yeah. And I think, when I- when you mention that, one of the things that sort of takes me back and makes me wonder is, of course, getting credit. I just read something the other day about credit scoring. And, you know, basically monitoring your spending patterns and whether you can get a good credit score, even to the point where I think that the article was relating Netflix to getting a good credit score as well. And, like you say, what would typically go to a bank as an underwriter, and someone would have a certain amount of information on you, now, potentially, all that information is there on you, that historical information, to make a decision on whether you're worthy of credit, whether you're worthy of life insurance. And, as you say, that can be pretty frightening.

Rob: And I think- I think China thinks that's a great thing.

Paul: Yeah.

Rob: And I think that they believe it's going to be the bedrock, the foundation, of, you know, a more perfect society. And I think that the Western world is going to have to create a lot of laws and regulation to put boundaries around this, to preserve our privacy and to preserve our freedom. Because that's the only way- you know, there are tools, you know. DeleteMe is a great service, I hope everybody checks it out. There are- you know, DuckDuckGo is a good place to keep your searches private. There are tools that all of us should be trying, and if we like them and they're easy enough to use, integrate into our daily lives to protect some of our data and our privacy. However, ultimately we're going to have to draw, as a government and a society, we're going to have to draw some boundaries around this. Because, as you know, technology just gets better, faster and cheaper over time.

Paul: Yeah. And to- on to that point you've just mentioned about drafting or drawing some boundaries around that, I see that Google is now kind of being, you know, dragged into the courts, dragged in front of the Senate as well. Is it about time? Are they too big to stop, for example? Do you think- or it's- you know, U.S. government potentially, or European governments, can start doing something about these too-big-to-fail organizations?

Rob: I mean, I think the courts will decide. But I do think it's time to have that conversation at the level of government, society and the judicial system. If you think back 20 years ago in the United States, the Department of Justice called Microsoft into court on antitrust concerns, because every PC was being shipped with Windows. And they did a great job of cornering the market. So every time you bought a personal computer back then, it came with Windows and nothing else. There were no other options. That seems- if you- if we look back on it now, that seems so innocent - so innocent - compared to what Facebook and Google have on us today.

Paul: Yeah. Yeah. I just see that this is- there's lawsuits going up with the U.S. Justice Department. I'm just wondering if the out- the outcome of last week's events would change that, potentially. That course of action. But... yeah. It's...

Rob: I don't think so. I don't think that Biden winning the U.S. presidency, assuming our beloved president Trump actually leaves the White House, I don't think that it's going to change the general course of action, which is, in the United States, to re-examine what the biggest data-driven technology companies are doing and what boundaries should be placed around them. And I believe that the U.S. will have a federal privacy law, similar to the GDPR and similar to legislation that's being passed in Brazil and other countries, before the end of the next four years.

Paul: Right. That flows nicely into my next question or point of topic I wanted to raise, because... we talk- we're here talking today about the U.S., and the EU is putting the GDPR regulations- and maybe you can distinguish between the two about the way that data is exchanged in the U.S., and probably it's more confined now within the European Union, because of GDPR. But the element of data trading and data brokers. Is it still quite prolific in the U.S.? And maybe you can give us some explanation about a data broker. What is it they're actually doing? And is there a market for it?

Rob: Yeah, so I think the simple way to think about it is: A data broker is any company that is buying and selling our personal information without our explicit consent. And it turns out that particularly in the U.S. and much less so, because of the GDPR, in the EU, there's a huge market of companies we've never heard of selling data about us, our families, our friends, our birth dates, our habits, our latest shopping... And we're not here to say that- our philosophy as a company and as a privacy provider isn't that all for-profit companies selling data are evil. It is simply that they need to be responsive to citizens. And if what- if one of us decides, "Hey, you shouldn't have the right to sell my personal information and my profile, when I have no profit from it, and I have no control over it. And you're publishing it on the internet so that anyone can search for it on Google." We believe the consumer, or the citizen of every country, should have a right to say, "Hey! You know what? Either give me some money for it, or I would prefer that you didn't include me in your database." It's just basic logic, in our opinion. And again, it doesn't mean that these business models are evil or bad, they just must give the consumers they're building their business on these rights. And that's what a lot of the GDPR is about, and some of the emerging legislation in the U.S. that borrows many of the principles from the GDPR, such as the CCPA, which is California's privacy law, and the new version of that, which was just passed last week, called the CPRA, which builds on these consumer rights. So again, back to your question: What's the- what is a data broker? How does it operate in the U.S.? And how is it different from the EU? What we've seen in the EU is the GDPR laws have been very effective in controlling the number of companies that decide to try to do business buying and selling data without your consent. Because it's expensive, and they might get sued. 
So in the EU, we see probably only 5% of the number of data brokers that we see in the United States. And in the United States, these companies, which include names like Spokeo and Radaris and Intelius, Acxiom, to share some names - you know, it all blends into the credit bureaus, of which I would mention Experian, because they are a UK-based company that also has a credit monitoring firm in the U.S. What we see is just a completely unregulated industry that is collecting and correlating data profiles about us, and yes, selling them to the highest bidder, no matter who that person is. And it could be a political organization, it could be a life insurance company, it could be a- your- you know, an employer background-check company, could be a dating site. All of these companies are buying and selling data about us without our permission, without our consent. And they're getting better and better at getting more data about us, too.

Paul: Yeah. And just to- from, you know, me as a- as Joe Public out there, what does that look like? What does that mean for me? I mean, they've got- I've got maybe several data brokerages that have data on me, and they're all bidding and suggesting that they've got more data about me than maybe a competitor, and that's- they can put a premium price on that? How does that whole system work, for the listeners out there?

Rob: Yeah, well... the prices that we see- so they do two- the data brokers have two business models. One is to sell to other businesses, and they'll sell bulk access via APIs, which are rights to have their software just instantly check whether or not our data is available, based on an email address or a phone number or some- an IP address, some unique identifier about us. So they'll sell that to other businesses at very different price points. Sometimes as low as, you know, one cent per check or per look-up about us. And that's all going on in the background. We have no- they never publish who their customers are, and they never will. And what's worse in the U.S. is what we see is kind of a game going on, where if one data broker behaves really badly and gets in trouble, they'll just shut down and sell their entire database to a different data broker that springs up with a new name and a new company, and continues on. So the problem, you know, there's a little bit of a game of cat and mouse. And at DeleteMe, for example, we are constantly chasing down these data brokers, adding them to our service, and then they disappear, and a new one pops up, that we then have to go add to our service. And this is one reason why the customers that we have appreciate the work that we do, because it's a never-ending battle that you have to stay on top of. And it shouldn't be that way.

Paul: Yeah. And is it just basically for the individual? Or, I mean, is- are businesses also- business data also being traded out there as well? About the way that businesses behave, the way- what else is being in that...?

Rob: Yeah. It's a whole lot of data, both business data and employee data, and business customers are our fastest growing segment of customers at DeleteMe, actually. And it's just a lot of individuals' data as well. And the data, as I was mentioning before, the data profiles that are being collected about all of us, are increasingly detailed, and increasingly recent. So, for example, to share a little bit of data that we gather about the data brokers with you: Four years ago, in 2016, we were able to find on average about 850 individual pieces of data at data brokers about a typical customer at DeleteMe. Now that number is over 2200. So they've almost tripled the amount of average data points they have about each person, based on our research, in three to four years. That's quite a growth rate!

Paul: Yeah. Absolutely! And I think what is perpetuating that, of course, social media platforms, but basically anything we're doing online is more or less traced and tracked.

Rob: And in the real world! And in the real world. The problem is the entire world is going- you know, is going technic- is getting more technically sophisticated. So when we go and get a driver ID, or we, you know, go to the post... things that used to be disconnected from data and not well-managed are now connected. Even at the government level. And in the U.S., for example, a lot of those- even government entities are sharing data with the data brokers, which seems crazy, but it's true.

Paul: Yeah. And I suppose before the Cambridge Analytica scandal came out back in 2016, people weren't really aware of how far, you know, society or individuals could be manipulated. And I think in the Western world we're very much aware, or more aware, these days, but in the past - well, not even the past, even today, in more developing countries - people have attached to that technology, like your Facebook, your social media platforms, and they basically consume it as if it was everyday news, right? And they can influence that. And then I think, going back to that Cambridge Analytica scandal - I think it was Trinidad and Tobago where they ran a testbed on how they could potentially manipulate people, because they'd built up a profile of the individuals.

Rob: That's right. And I remember one of the Cambridge Analytica executives in a Netflix documentary, and I'd encourage everybody to watch it, I remember him saying, "We put an advertisement out there, based on all this personal data, and it went like a boomerang. And we put it on social media, and the boomerang would come back, and we'd see it show up in the results." And that's some of the power and danger of targeted advertising and data personalization.

Paul: Yeah. So I just wanted to sort of slightly move us on a little bit, because we've been talking enough about the problems, and we know that DeleteMe is, you know, one of the solutions out there. What other advice would you give about making people more aware and more conscious about their everyday activities online? I mean it's difficult these days, because I'm also guilty of it: We just log in, we go in, we see the cookies. Shall I bother with the cookies? Shall I not bother with the cookies?

Rob: It's a great question. And we- you know, as I've said: I know I'm busy, like everybody's busy. We don't wake up in the morning and think about protecting our privacy. We think about the work that we have to do, the kids, if we have families... you know. We have to lead our lives, and our lives are increasingly digital lives. So I think there are a few things that people can do that are probably easy for us to recommend. And we see a lot of our users and customers adopting them, you know. One is to use, when you're br- when you're using your browser, either don't use Chrome. Or if you use Chrome, put an ad-blocker on it. It's simple to install, it's free. And it really does stop a lot of the third-party tracking that goes to a bunch of data brokers, about where you're going, what you do online. So that's a useful tool. We have one. There's lots of others in the marketplace. And, you know, it's a good thing to do. A second recommendation is: When you think about the services you use every day, whether it's your browser or your web email or the business documents that you rely on, try not to do everything with the same company. So in other words: If you use Gmail and therefore Google is fairly knowledgeable about your email, don't also use Google for all of the other things. Yeah, make sure you mix and match the companies that are most- have the most privileged access to your digital life. So your phone provider should be different from your email provider should be different from your web browser. And that doesn't- you know, that's generally a choice that people can make. And most of the services are pretty equivalent in terms of their features and functionality.

Paul: Yeah. And you'd probably say the same for businesses as well that are out there, that, you know, need some degree of protection or sensitivity that they're dealing with as well, that they take the same kind of measures, right?

Rob: That's right.

Paul: Yeah.

Rob: That's exact.

Paul: So I've just a few more minutes left. And I wanted to get your feelings towards the future of- I mean, it's a fantastic service that you're doi- that you've got here. But data brokerage is probably not going to disappear overnight. And I'm sure that there's a case that they can easily move offshore, they can move without the boundaries of law, because it's one of those industries that can be basically replicated anywhere in the world. So what do you say is the future of this particular industry, and, you know, where you're constantly having to evolve? Are they getting smarter or are you guys getting smarter? Is it going to be a constant, you know, push-and-pull situation? Will (inconclusive)- will (inconclusive), for example, regulation help? Yeah.

Rob: It's a- yeah. Great question. And I think you answered it in the question, which is: In our opinion, there is no easy solution. It will be a push and pull. There was a horrible case - a quick story: A judge, a federal judge here in the United States, three or four months ago - some crazy person that was involved in a legal case that she had adjudicated came to her house around the holidays. Her son, who was home, answered the door. The man was dressed as a Federal Express driver with a package, and he shot her son to death. And she was- you know, obviously horrible. And she was reflecting on the ability of this person to easily find her pers- her home address, just by googling her name. And how- what a problem that was. And how, you know, that helped this tragedy happen. And her conclusion- and we're- you know, we're helping judges like her pro bono with our DeleteMe service. But her conclusion was, unfortunately, similar to our conclusion, which is: It is not an easy, simple problem to solve. There will be, for all the reasons that you outlined, there will be a push and pull here. But regulation and the law does need to protect our rights, our rights as citizens, our rights as customers. And we need more services like DeleteMe to help make enforcing our rights easy. Because it's one thing to get legal rights, and some, you know, government passes a law, and we can all feel good about it. It's another thing to take those rights and put them into action in our daily lives. And that's what we're trying to do as entrepreneurs, is trying to make that easier for people, cost-effective, so that they can get on with their lives and be more protected and more safe.

Paul: Yeah. That's- I think that's a fantastic way to finish, on that note. I mean, obviously, Tresorit as well, being advocates of privacy and taking control of your own data. We certainly, you know, follow that mindset as well. And I think you mentioned DuckDuckGo, we've already had an interview with them. And, you know, those technologies are out there. I think it's just bringing awareness to what- they're out there. That's something we're trying to do with this podcast as well. It's organizations like yourself that are really, you know, looking out for people's personal information and the way that it's being treated out there online. Rob, it's been fantastic having you on this podcast. I hope that we can do another one in the future, catch up and see where you're at. But I really appreciate the time that you've given us today.

Rob: Pleasure to be here. Thank you very much!

Paul: And that is all for today's episode of "under CTRL". You can find links to all our social platforms and to our guest in the episode description. If you like the show, make sure you subscribe and leave a review. Join me again in two weeks' time for the next episode.