Can a search engine protect privacy? - An interview with Kamyl Bazbaz
Our search engine often knows more about us than our best friends do – thanks to the data trails we leave as we seek answers to common and sensitive questions across the web. An estimated 40,000+ searches per second feed the giant brain we call Google. Every click adds one more nuance to our online profile, exposing our interests, beliefs, and connections. Is there a real choice, an alternative search engine that doesn't dismantle our privacy's protective shield?
The answer is a clear and reassuring YES. In our next talk with Kamyl Bazbaz, the VP of Communications at DuckDuckGo, we discuss some striking examples of the wide-ranging privacy erosion in the online space and elaborate on privacy technologies and products.
DuckDuckGo, an internet privacy company, shines with an authentic mission in full accordance with its business model. With DuckDuckGo's search engine, which doesn't collect or share any personal information, the company aims to “set a new standard of trust online” by giving users back the freedom of privately browsing the net.
Given his extensive political career, during which he closely witnessed how social media can influence political behavior, it comes as no surprise that Kamyl became somewhat of a guardian of the public good and was drawn to DuckDuckGo's privacy commitment.
The way we establish our privacy in the online space should be as simple as our need for it is natural – without us being manipulated or controlled by commercial or governmental entities. This is why DuckDuckGo's goal is “to be this easy button for privacy”, as Kamyl sums it up. This simple message is mirrored perfectly in the way they have designed and are constantly developing the most private search engine: privacy baked into an all-in-one product that is accessible with a single download.
Have you ever been surprised by very personal information appearing in Google’s search advertising? Have you experienced the Facebook effect of being fed very similar news and surrounded by suspiciously like-minded peers?
Privacy violations in the digital space are so far-reaching that people see only the tip of the iceberg. Behavioral advertising works so cleverly that users often don’t even notice they are trapped in an intellectual isolation known as the “filter bubble”. Kamyl warns of its damaging consequences: it not only produces a biased world view but can also become a real hotbed for misinformation and radicalization.
When it comes to the question of whether online privacy can be restored, Kamyl sees many promising signs of change. The monopolies of Google and co. are increasingly shaken by multiple forces. On one side, the regulatory environment, with powerful data protection laws like the GDPR and CCPA, is pushing companies toward more protective data-handling practices. On the other, as tech giants’ anti-competitive tactics come to light – as recently unveiled by the US House antitrust subcommittee – the public is becoming more and more aware of the need for privacy.
However, regulations can only do the groundwork. Kamyl stresses the importance of joining forces to break up the tech giants’ dominant narrative and to show that there are privacy options for taking full advantage of the internet. The launch of the Global Privacy Control by leading tech companies and publishers, which would make it easier for consumers to assert privacy rights such as “Do Not Sell”, is just one example of this.
The bottom line is that simplicity wins the day, as Kamyl states: “We think anything that makes it easier for people to pick privacy is something worth pursuing.” If privacy options are within reach, set up by default, and free of the need to maneuver through the jungle of “terms and conditions”, privacy can quickly become routine.
Listen to the episode on Spotify or Apple Podcasts to learn more, and let us know what you think. Scroll down to read the transcript of the recording.
Paul: Welcome to the 11th episode of "under CTRL". I'm your host Paul, and on today's show we have Kamyl Bazbaz, who is the Vice President of Communications at DuckDuckGo. We will discuss how you can change to a pro-privacy search engine, in an era of constant digital surveillance. Hi Kamyl! Good to have you on the show today. How are you doing?
Kamyl: I'm doing great. Thanks for having me, Paul! Thrilled to be here.
Paul: Great! So we've got Kamyl Bazbaz from DuckDuckGo. He's the VP of Communications, and it's going to be a fascinating talk today into the world of search engines, privacy search engines, and the technology that people can use to keep their privacy and their data safe. So, Kamyl... how about you just get started with giving us some background to yourself, how you came to the company, and your journey in the company so far, and what's DuckDuckGo really all about?
Kamyl: Sure. Well, you know, starting with DuckDuckGo, we are an internet privacy company. And basically, it's our mission to raise the standard of trust online. We believe that getting privacy online should be simple and accessible to everyone. Period. Too many people believe that you just can't be private online. You know, there are lots of studies out there that would indicate that people want privacy, but sort of feel powerless to do anything about it. And so, you know, we really exist for everyone who's sort of had enough of that feeling of helplessness, had enough of feeling like they're being watched online, they're being manipulated through advertising and other means, caught in filter bubbles, that sort of thing. And by using our products, from Private Search - which is, you know, just like Google or Bing, it's a search engine that doesn't collect any user data, every time you search on it, it's like the first time - to our mobile apps and extension, which are basically browsers that block trackers and let you browse the internet privately.
Paul: And are you doing any other products as well? Because I think when I took a look at the webpage, you've got something with maps and so you're taking- working with some of the mapping companies as well? Is that correct?
Kamyl: Yeah, well. So, you know... with regards to maps and directions, we are rolling out a directions feature that is just a part of DuckDuckGo Search. And so it's exciting, because it is, you know, we're constantly looking to expand the features that DuckDuckGo has. You know, we have Instant Answers, Sports Scores, Weather, all the things you would expect from any other search engine, but private. Which is not necessarily something that other companies are willing to do, respect their users and customers in that way. But with directions, now folks can look up directions to places privately, without needing to share their location or any other private information. So it's a super exciting feature to be rolling out for us, and, you know, we have worked with Apple and Apple Maps in order to integrate this service into DuckDuckGo Search.
Paul: And what's the journey been like? I think you've been in business, well, over ten years now, so... from what I understand, there was just one- it started off with one person? And now you've grown considerably, and obviously you continue to grow. So what's the journey been like for DuckDuckGo, the milestones, and what's the journey been like for you? When did you join the organization?
Kamyl: So I personally joined around six months ago, but I had been working with the company externally for over two years before that. And for me personally, before, my career was really in politics. I had worked for the Clintons for about ten years in different capacities. Been on Senator Clinton's first campaign for president in 2008, when I worked at the State Department, Clinton Foundation and other places. And so, you know, for someone who cares a lot about public good and making a positive contribution, and one that is sort of engrained in the business, not a side thing that is sort of like a "nice to have" or just for brand, you know, when you're looking at tech companies to work at, there aren't a ton that really fit that model. DuckDuckGo is one of the rare companies that has mission really built into the business. And so, you know, when I was sort of thinking about what I wanted to do next, it was kind of an easy decision, considering I'd worked with the company before and was so impressed with the mission, the commitment to the mission. And so I've been really thrilled to be here full-time, for the last six months. With regard to the company's growth itself, certainly I would say our growth in some ways mirrors the public's broad awareness of privacy violations.
Kamyl: So Cambridge Analytica, for example, was a moment where people realized, "Oh, there's a lot happening underneath the hood of the internet that I'm not aware of. That is, frankly, kind of scary and creepy. I don't want companies I've never heard of to use my personal data. I don't want to be targeted or manipulated based on the data that Facebook or Google has captured on me." And so those moments have really helped people understand the problem. And then, you know, our goal is to sort of be this easy button for privacy. And so if we can sort of meet that demand with a really simple solution, we think we can keep growing.
Paul: Yeah, and your mission statement, as I see it, is that too many people believe that you simply can't expect privacy on the internet. And you disagree and have it- made it your mission to set a new standard of trust online. So how would listeners, for example, out there see that- where is that trust? Where are you displaying that trust? Of course, we see that this is a search engine, but what can you tell us more? What's going on behind the scenes with DuckDuckGo?
Kamyl: Yeah, I mean. We... you know... We are working on new privacy technologies and products and features that really put everything into one simple download. So that, you know, I think people feel like, "Okay, well, I need a tracker blocker, I need a- I need like a Privacy Badger from EFF, I need a uBlock Origin, I need sort of- I have to download a VPN, I have to do research on this... All this sort of stuff to try and be private. Change my settings." And it can just feel very overwhelming. And then, you know, I think since- as I was saying before, a lot of the abuses happen underneath the hood, it's hard to really feel more private, 'cause it's not tangible. What we found is that when people use DuckDuckGo, they, you know, in sort of user testing and this sort of thing, people talk about a feeling of being more free. Right? And, you know, when people use Google, there's a- this idea of a chilling effect. Right? Like, "If I put this in, if I put this search in, is someone going to see this one day, and is it going to be held against me? Is my boss watching me? Is this going to be used in a court of law?" Like... All that stuff can really sort of change how you act, when you're being watched. And, you know, it's not the most tangible thing in the world, but when you sort of get that feeling of the freedom you get from privacy, I think people are really thrilled about it, and then they sort of get it. I mean one of the things that I like to talk about is like, it's interesting to me that people will put a Post-it Note over their camera on their laptop, but will still use Google and Facebook. Right? Like, which one do you think is actually more of a privacy threat?
Kamyl: Yeah? Like maybe the company that's watching you all across the internet. Now there are- super weird stuff that's happening, people taking control of cameras, yes, worth doing, for sure. But there's a tactileness to doing that that I think makes people feel like they've actually done something to protect their privacy. Where all the other stuff we talked about doesn't always have that feeling, so we're trying to kind of bake that into the products that we create.
Paul: Yeah. And to that point, I had an experience last night. We were talking about the Swiss Franc currency, and then suddenly, as I opened my Instagram page, up popped an advert about trading in Swiss Franc currency. So it's not just about...
Kamyl: There you go!
Paul: ...about your camera, you got your microphone listening in as well, because we just go through and accept the Terms and Conditions, right? Because we don't want to take the time and the effort to read through what data we're giving away to companies.
Kamyl: Yeah, and it's not simple to read those things, right? You know, there've been studies on this that have talked about the hundreds and hundreds of hours it would take to read all the Terms and Conditions. If you say no to them, in most cases, you can't use the product. And if you can, they will say in no uncertain terms that the product will not be updated, it won't have, you know, the best security features, because they can't sort of fix holes or fix bugs, and that, you know, "Either you agree to the terms that we're dictating, or you have to go with a suboptimal version of the product." You know, we don't think that's fair. And we think consumers deserve better. And so, you know, to the point you were making about the Swiss Franc thing, I think what's so creepy about it is that they don't even have to be tapped into your microphone to do that. Right? Let's say someone you're sitting with in your living room just put a quick search in, because, you know, with the location data, they know you're sitting next to each other and can then, you know, feed an ad into your timeline.
Paul: Yeah. And to that point when we think about targeted ads - and it's really more about that in one sense, that more information, knowing exactly where you are... it's fine-tuning, even probably understanding your lifestyle, where you are at certain parts of the day, and targeting those ads for you. Even on the basis of location. So I think we'll come to- my next question is the- you know, these behavior-based ads versus the keyword-based advertisements, where do you see that going? I mean, it seems to me that it's getting more and more targeted. They're collecting more information about you and profiling you more. Do you agree with that? That it's getting more advanced?
Kamyl: Yeah, well, you know. Here is the scary thing about all of it: I think there's a narrative out there that says, "Oh my god, this data is so useful, it helps me as a business predict all future consumer behaviors. Wow! Like this is a magic ball, this, you know, this is incredible!" In reality, what's happening is it's not predicting behavior as much as it's manipulating behavior. Right? So if through all that data, the insights that you glean are, "This is what scares me. This is what I need. This is what I'm sort of, you know, worried about at this time. Based on my purchases, this is how much money I have. You know, maybe I'm also- let's say I'm in financial distress and looking for relief." Right? All these ads are going to come to you based on that, not necessarily simply based on what you're looking for. And so those ads are going to be more an attempt to manipulate you than to predict you. Right? To- than to sort of predict your behavior. And it's not fair that it's been sort of touted in this way, as just so simply great and magical, when it's kind of preying on people. Of course, there are different levels to it. And I think the political targeting is some of the stuff that we've seen that has actually really, you know, resulted in, you know, misinformation and ultimately spreaded- you know, was spreading violence. But... at the end of the day, it's all this sort of- in a sense to reduce uncertainty. Of which there will always be, right? So like to collect an endless amount of behavioral data to try and sort of make every decision you make have a hundred per cent certainty is a really unrealistic bar. And the damage that's done in the- on the road to doing that, is not worth it.
Paul: Yeah. And do you think that there really is a chance for us now to take back control of our privacy? I mean, there's companies like ourselves and you, and there are others out there, very much privacy advocates. But is Google so much part of our life, for example, or some of the other tools out there, that it's a big challenge to raise that awareness? I mean, you mentioned the Cambridge Analytica scandal, which has raised some awareness. And I've recently seen another- I think another Netflix series, "Social Dilemma". You know, and to understand what's really going on and to your point, you touched on the manipulation aspect, and I think that's what I'm recognizing, how far things have come now with...
Kamyl: Yeah. Yeah, yeah. I mean, you know, just recently the United States Congress, the House Subcommittee on antitrust put out a major report about the anti-competitive behaviors and the monopolies of Google and a bunch of other countries- sorry, a bunch of other companies. And it was very important the House put this out, really, 'cause they were joining other countries in making it unequivocally clear that Google used anti-competitive tactics to maintain their monopoly. The reason why I think it's connected to the question you're asking is: Because in the way the question is often asked, it is incumbent on the consumer to do something about it. When, as multiple state governments- I'm sorry, and federal governments have said, "It's the companies that have done this." And so we should be putting the blame on them, right? Like the fact that people associate the internet period with Google is not an accident.
Kamyl: And it's not just because they were so incredibly good - and they've given the world a lot of great products - but the sense of, "I don't know anything else. Does anything else exist?" That's something that had to be learnt by consumers, and is the result of this anti-competitive behavior. So... I think part of it is, you know, the regulatory bodies that be all around the world sort of stepping up. But I also think that culture is moving faster than the regulatory environment. Meaning, you know, the sort of honeymoon is well over. And so people are asking questions. I think for the first time, you know, even over the last 6 months, there've been more sort of complaints about Google Search than there've ever been before. Right, like, you know, "Why are there so many ads?", "Why is Google Search sort of favoring other Google products?" And so, I think... I think the court of public opinion is certainly changing and we're seeing regulatory action come behind it. And so, you know, I think all those things together means people don't have to feel stuck with simply using Google, and can find other ways to use the internet and still protect their privacy.
Paul: Yeah. And you talk about or mention the regulatory framework. Of course, you know, we got GDPR over here. I see other companies spinning up privacy laws as well, down in Brazil and in other countries. What's happening over there? We've seen the California Privacy Acts kind of launched. Is there anything else that's coming to the fore?
Kamyl: Yeah, I mean... the, you know, CCPA is one of the better laws that we have on the books in the States. We don't have a universal privacy law, which- you know, which would be like GDPR. But because it's in California, and so many of our biggest tech companies are based there, the regulation is quite powerful. You know, one of the most important parts to us was a, you know, do-not-sell piece that was added into CCPA. Meaning that, you know, consumers are allowed to sort of opt out of the sale of their personal data, if they want to. The problem with that was: you have to do it on every website, which is, frankly, a pain to do. And the law sort of left open the space to create a universal technical specification that would come from a browser that would indicate, "I'm opting out on every website that I go to." And so just recently DuckDuckGo, along with the New York Times and Mozilla and the Washington Post, Financial Times, Automattic, EFF, Brave, a few other folks, all sort of announced what's called the GPC, Global Privacy Control. Which is this tech spec that we're going to start respecting immediately, that we hope becomes the sort of standard in California and then across the country, allowing people to, you know, opt out of the sale of their data really, really easily.
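At the wire level, the GPC signal Kamyl describes is a single HTTP request header, `Sec-GPC: 1`, that a participating browser attaches to every request. A minimal sketch of how a site might honor it on the server side (the function names and return values here are invented for illustration, not DuckDuckGo's or any publisher's actual code):

```python
# Toy sketch of honoring the Global Privacy Control signal server-side.
# Per the GPC proposal, a participating browser sends "Sec-GPC: 1" with
# each request; a site that respects the signal treats it as a
# CCPA-style "Do Not Sell" opt-out for that user.

def gpc_opted_out(headers: dict) -> bool:
    """Return True if the request carries a valid GPC opt-out signal."""
    # HTTP header names are case-insensitive, so normalize before lookup.
    normalized = {k.lower(): v.strip() for k, v in headers.items()}
    return normalized.get("sec-gpc") == "1"

def handle_request(headers: dict) -> str:
    """Pick a data-handling policy for this request (illustrative)."""
    if gpc_opted_out(headers):
        # Skip any data-sale / third-party-sharing pipeline entirely.
        return "do-not-sell"
    return "default"

print(handle_request({"Sec-GPC": "1"}))  # do-not-sell
print(handle_request({}))                # default
```

The point of the spec is exactly the "universal" part Kamyl mentions: the browser asserts the opt-out once, and every compliant site reads the same header, instead of the user hunting for a "Do Not Sell" link on each website.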
Paul: Yeah. That's interesting that you've got this global privacy standard that you're pushing for. And I think that's something that's really right at the forefront now of what companies like us should be doing.
Kamyl: Yeah, and... and I can add one more thing to that, which is, you know, and connected to Europe, you know, as a remedy for the sort of Android case in Europe. You know, they instituted a preference menu, or as Google calls it, "a choice screen", which basically mandated that for new Android phones purchased in Europe, when you're sort of setting it up for the first time, turning it on, that you get a screen as you're setting it up that asks you what you want your default search engine to be.
Kamyl: And overall, we find the idea of a preference menu to be a really good one. But the way that it's been implemented in Europe we think inherently disadvantages companies like us that respect people's privacy, and frankly feel like it's a rigged process, where the only people- the only person or organization that wins at the end of this is Google. Consumers will end up having less choice, based on the way that they designed this preference menu. And, you know, we've been talking to the European Commission and been pretty vocal about our proposal, how to change it, and how to make it more fair.
Paul: Yeah. So Kamyl, a lot of people wrongly assume that you're a non-profit organization. In practice, how do you sustain your business?
Kamyl: Yeah, you know, we make money in the same way a lot of other companies do, which is through advertising. We just don't use the creepy kind.
Kamyl: We use contextual advertising instead of behavioral advertising. So it's really sort of keyword-based. If you search for a jeep, you would get an ad for a jeep. And it's that simple. In some ways it's a little bit of a throwback to earlier times on the internet. And, you know, I think we can all have like some nostalgia about earlier internet times and, you know... I don't want to be like "old guy" about it, but, you know, it certainly- it's a simpler, just as effective and far more privacy-respecting means to serve advertising and to make money. And so, you know, that's where our revenue comes from. This year, we're going to do over 100 million in revenue, which is super exciting for us. And so, you know, we can be healthily profitable and respect people's privacy at the same time.
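The jeep example can be made concrete with a toy sketch of contextual matching. Everything here (the inventory, the substring rule) is invented for illustration; the only point it demonstrates is that the ad is chosen from the words of the current query alone, with no user profile, history, or identifier involved:

```python
# Illustrative contextual (keyword-based) ad selection: the decision
# depends only on the query text, never on who is asking.
from typing import Optional

AD_INVENTORY = {
    "jeep": "Ad: Jeep dealers near you",
    "mortgage": "Ad: Compare mortgage rates",
    "running shoes": "Ad: Trail running shoes on sale",
}

def contextual_ad(query: str) -> Optional[str]:
    """Pick an ad from the query's keywords; no user data consulted."""
    q = query.lower()
    for keyword, ad in AD_INVENTORY.items():
        if keyword in q:
            return ad
    # No keyword match: show no ad rather than fall back to profiling.
    return None

print(contextual_ad("best jeep for off-road"))
```

Contrast this with behavioral advertising, where the same query from two different people could yield different ads because each is matched against an accumulated personal profile.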
Paul: That's great. And I suppose...
Kamyl: And, you know, I should just say: We were talking before this and, you know, I was sort of joking, but it's true. You know, people will reach out to us and be like, "Where can we donate?"
Kamyl: And it's like, "Thank you for supporting our mission. Just please use us." We're not a non-profit, but, you know, to have a mission like we do, it's- you know, people sort of say, "Oh well, these guys have to be a non-profit, if they're going to do it." And that's, to me, an indication that, you know, big tech has sort of won the narrative on how all this works, right? If people feel like you can't have privacy and make money, that there must be something else going on, it means that like their version of the story has worked. And so we're trying our best to tell people that that's not the case. It's a false choice. You can have both and, you know, still get all the benefits of the internet.
Paul: Yeah. To that point, I was having a discussion last night with a friend of mine, and he was saying, "Oh, you know, I don't mind giving over my search. So if I put something in Google, I don't mind accepting the cookies and things like that." And then of course, we got into the conversation about this whole manipulation thing as well. I mean, is it- you say that there is a shift now, or there's more of an awareness about privacy? And of course, you're seeing that. I mean, what would you like to see as a next step? As you mentioned, you've got this global privacy standard that- something- what else is out there that we can raise awareness to people?
Kamyl: Well, you know, I- just to comment about the conversation you had with your friend, you know, I think for some people, they're not going to care. And it doesn't bother them. And that's okay. The point is that they sort of understand what's happening beneath the surface. And if they are okay with that and they want to continue, that's completely fine. What is the most frustrating to us is that people don't think that there are options, and that they think the options inherently have to be worse, because it is not Google. And, you know, that's because of the, you know, anti-competitive practices that Google has sort of taken to make it feel that way.
Kamyl: Right? So with regards to what we would do about it: We think one of the first things to do is to get rid of search defaults. Right? Like just ask people what search engine they want to use. It feels sort of crazy, but like, you know, we've done research on this and others have, too, that if you just do that, you know, 20% of people would pick a search engine other than Google. And so it goes to show you that there is a pent-up demand for these things. And with a simple product and a simple ability to actually use that product, people will pick it. And so, you know, we think anything that makes it easier for people to pick privacy is something worth pursuing.
Paul: Yeah, and to that point, I mean, I've got your product on my phone, and all you see...
Paul: I noticed there's a button at the top. You just- it's like a flame. I'm not entirely sure of the icon.
Kamyl: Yes, the Fire Button!
Paul: If you just- you just hit it and it wipes everything. Now, if you go into Google, and you want to clear your cookies or your searches, you've got to go through the privacy settings, so it's cumbersome. And to me, that means it's like- in a way, it's kind of deliberately done, to make it cumbersome. People just don't bother doing it, 'cause it's a pain. But what I really liked about DuckDuckGo is you talk about one button, a simple click, and it clears all of that data, that search data.
Kamyl: A hundred per cent! There was a recent story... about basically internal, at Google, some of their engineers and other folks were reported saying that they couldn't themselves figure out how to opt out of location tracking on their own products. So if Googlers can't figure it out, how is the rest of the public supposed to do it? And so you would have to think that that sort of thing is intentional, because... on the flipside, their ability to make things easy is also really strong, so... if, you know, something is more complicated than it feels like it needs to be, then that's usually what they're trying to do.
Paul: Yeah. And to go into that point, it's around the cookies as well. Now, with every site that I visit, up come the cookies: you accept them, and some are just "opt in", others are "take a selection, make a choice". And even I, as a- I consider myself to be pretty tech-savvy, am like, "Hang on a minute! What am I looking at here? What am I reading here? Is it worth the pain and the effort to do it?" So... those ways, to make it intentionally awkward or difficult, it's really frustrating for me as a tech user as well. So I can imagine that people just lose patience with it or don't really understand what they're accepting. So they just go to the default and say, "Yeah, I'll just accept all the cookies and that's it." And all their search...
Paul: Everything is being handed over. So... and to that point...
Kamyl: Yeah, a hundred per cent!
Paul: It's- even when you clear the cookies, and you wipe that information, for example, within Google or your search history, is it really being wiped away, in your opinion? Is it still being kept there for Google's analytical purposes? Or to be sold on to third parties?
Kamyl: Well, with our app, if you're hitting the Fire Button, everything that is sort of- would be sort of locally stored is gone. Wiped. And what makes our tracker blocking different from others is that we, you know, don't just block cookies, but we sort of block trackers from even loading on the page in the first place, which prevents the leakage of your IP address and other stuff, so... We believe, you know, through our tracker radar technology, you know, we're giving people, you know, that sort of high- one of the highest levels of tracker blocking protections you could expect on the internet. You know, there is, with regards to the question though, I think it's important people understand the difference between privacy and security, right?
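The distinction Kamyl draws here is worth making concrete: cookie blocking acts after a tracker's script has already loaded and contacted its server, while load-time blocking cancels the request before it is ever sent, so the tracker never sees your IP address at all. A minimal sketch of the decision, assuming a flat domain blocklist (the entries below are stand-ins; real blockers use curated datasets such as DuckDuckGo's Tracker Radar):

```python
# Sketch of load-time tracker blocking: decide before the network
# request is made, not after the tracker's script has run.
from urllib.parse import urlparse

TRACKER_DOMAINS = {"tracker.example", "ads.example"}  # illustrative entries

def should_block(request_url: str) -> bool:
    """Return True if this outgoing request targets a known tracker."""
    host = urlparse(request_url).hostname or ""
    # Match the listed domain itself and any of its subdomains.
    return any(host == d or host.endswith("." + d) for d in TRACKER_DOMAINS)

print(should_block("https://cdn.tracker.example/pixel.gif"))  # True
print(should_block("https://news.example/article"))           # False
```

When `should_block` returns True, a browser or extension simply never issues the request, which is why no cookie is set and no IP address leaks in the first place.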
Kamyl: So if you gave your credit card number to a bank, and that bank was hacked, and your credit card number was on the black market, that's not something DuckDuckGo is, you know, solving. We're not help- you know, it's not like we're creating security technology for servers and, you know, others. I think for us, our point is: Just collect less data! Right? Like if everyone was collecting less data, or stopped collecting sort of unnecessary data altogether, a lot of these issues would start to melt away. You wouldn't have the same sort of profiling and targeting, and, you know, in sort of keeping all that sensitive data, you wouldn't have the same kind of security risks that, you know, you've seen happening with other companies.
Paul: Yeah. And talking about those different tracking capabilities: I noticed something in one of the articles that's on your page, where you're giving advice to different users. And one of them is: "Measuring the 'Filter Bubble': How Google is influencing what you click." Could you just give me some explanation what the "Filter Bubble" is? Maybe for the listeners out there as well, this- that terminology and what does it actually mean, the "Filter Bubble"?
Kamyl: Sure. So the Filter Bubble is a- an effect where, you know, either you are let's say searching on Google or you're on Facebook, and, you know, their business models need you to stay online and to click around as long as possible. So the Filter Bubble means you are getting results, seeing content that is meant to keep you online. And that content is filtered and tailored to stuff you have looked at before. And so, you know, this idea that everyone is getting the same search results everywhere on Google isn't true. Right? Those search results can be based on what you've searched for in the past, and specifically when that stuff starts to take a political slant, you know, that's where you're adding to polarization and, you know, to some extent misinformation across the internet. So, you know, I think leading up to the election in the States, there's been a lot of coverage about Facebook, for example, and how, you know, people can open Facebook and their timeline looks like they're in a different world, because of the news that's been surfaced. And that news is based on the stuff that you've liked, other stuff you've interacted with and then a little bit about behavioral information about you. And so, you know, you're not seeing a sort of- it's one thing to say sort of fair and balanced, right? But it's another thing to say like just what is out there, not only the stuff that will sort of want to click and make me feel more anxious and, you know, keep me online, which is, you know, very much what was discussed in that- in "The Social Dilemma". But, you know, at the end of the day, it's really not about one side or the other, when it comes to the Filter Bubble. It is just that people shouldn't be caught in one at all. And that this sort of behavior can lead to radicalization.
Paul: Yeah. The reason I brought it up is because my son's obviously born with technology, and he's constantly on YouTube or watching videos, but I've started to see all the time the things that he searches for and the different content that's being delivered. And to me, I would say that I've got some concern about the way that he's being kept online through YouTube and various other applications that he's using as well. So to that point, do you see, across the research that you do, a particular age range that is using your product? Or is it coming from, basically, the young? The old? From right across the spectrum?
Kamyl: It is coming from really all sides. You know, the funny thing for us is that, since we don't track our users at all, no usernames or anything like that, we really have no idea.
Paul: Aha! Okay.
Kamyl: And so the information that we do have, we just survey people who say they're users, and try to get information that way. And, you know, it's been nice to see that it's really kind of universal. And, you know, privacy is something that everyone cares about. Sometimes different things bring them to privacy, but overall, it's super across the board. You know, you have some folks who're really, really tech-savvy who care about us. And then you have some folks who are like, "You know what? I just don't feel good about, you know, the way I've been using the internet. I'm feeling watched, I'm feeling spied on and preyed upon. I just got to do something else. And so I'm going to try this out." And so, you know, nothing about your technical understanding or even, you know, political affiliation or age would determine whether or not you'd be a user. It's really for everybody.
Paul: Yeah. Yeah. I just feel that when I see my son using the technology. And sometimes I think that he's more privacy-aware than me, because he makes a point of going in and clearing things. And what I'm just curious to understand is: We make an assumption that young people are not that aware. They just accept it, because they were born into technology, they grew up with it, they go into Google. But I think that... what I'm seeing is that there seems to be a consciousness amongst young people now, about privacy particularly.
Kamyl: Yeah. No, I think that folks who were sort of born into this age are far more aware. They're certainly, it would seem, more savvy about how everything lasts online. Right? This sort of awareness that if you comment on something, that can come back to haunt you. But, you know, I would be curious if your son, for example, understands Incognito Mode on Chrome. Right? So Incognito Mode isn't actually private. What's so funny about it is that it's only private to you. Right? So private to you meaning there is no history, right, you can't sort of hit back, and whatever website you go to, it's not on your browser, but Google still knows where you went. You know? There are still trackers, there are still cookies. It's only private from someone else who sat at that computer, which isn't so much the point. You know. Especially now, when people have so many more of these other things. So, you know, we've seen a lot of efforts that we're calling Privacy Washing. Which is an attempt by a brand to talk about all the privacy benefits that they are supposedly delivering, but then not actually following through on it, or sort of over-hyping those benefits, based on what is actually delivered. So Incognito Mode is a great example of that. You know. Also Google recently announced that, for new Google customers, they're going to start automatically deleting stored behavioral data after, I think, something like 18 months. That's a nice announcement that got some headlines. But after 18 months that data isn't as useful anymore.
Kamyl: So, you know, they're only doing as much as they feel like won't hurt their business. Which is not real progress.
Paul: Yeah. And I think we're coming up to the end now. So I'm going to wrap it up with one final question. And I think: What does the future hold for DuckDuckGo? Have you got anything planned in the pipeline about increasing the capability of the search engine, or adding additional features or products to DuckDuckGo, where you can help people keep their privacy?
Kamyl: I mean, we certainly have new products in the pipeline, which people should look out for. Follow us on Twitter at DuckDuckGo, check out our website, check out our blog spreadprivacy.com. There's a lot of great information there: how people can be more private, how to use our products, how to make us the default on your various devices... I'd encourage everyone to give it a try, if they haven't already. And I think with regards to the future for us, you know, because we are brave enough to put people and their privacy ahead of profits, and make that our actual business model, we are in a position to succeed and grow for a long time. Because more or less no one else is doing that. People are so hooked on this surveillance capitalist model that they're afraid to try anything new. We've never felt that way, and so we feel like we have a ton of opportunity. And, you know, there's so much room for growth still. And so, you know, the more we can talk about these issues, you know, work with partners like you guys, and talk about all this stuff, I think the more we can sort of spread privacy and actually elevate the standard of trust online.
Paul: Yeah. Absolutely. And I think, you know, this is the whole purpose of why we're doing the podcast: to speak to folks like yourself, who are privacy advocates and have a mission out there to protect people's data. So it was fascinating having you on the show today, Kamyl. I really enjoyed the discussion. And it was good to get some insight into what you're doing. And I wish you all the best for the future. So thanks a lot for coming on!
Kamyl: Thank you so much! Really appreciate it.
Paul: And that is all for today's episode of "under CTRL". You can find links to all our social platforms and to our guest in the episode description. If you like the show, make sure you subscribe and leave a review. Join me again in two weeks' time for the next episode.