The digital tracking and monitoring methods used by governments, law enforcement, and organizations – both democratic and autocratic – put privacy at risk. That's why implementing basic digital hygiene precautions is important for everyone, but mandatory for high-risk individuals, journalists, and marginalized groups and communities.
Whilst the digital space has created new opportunities for developed nations, this shift to a dominant online world creates new risks and threats and highlights social inequality and resource gaps. To discuss different forms of social injustice and privacy intrusion that are increasingly apparent in cyberspace (and how to combat them), we invited celebrated human rights defender, hacker, and security trainer Matt Mitchell to speak with us on under CTRL.
After direct experience in employee-monitoring jobs and witnessing his communities' exposure to government surveillance, Matt now advocates for positive change through education around digital self-defense tactics.
Considered one of the most influential minds in the cybersecurity community, Matt has a wealth of experience from his work in cybersecurity as a hacker, data journalist and advisor to NGOs. In his current roles as a tech fellow of the Ford Foundation and the co-founder of CryptoHarlem, he invests his energy into making the online world more private, secure and fair, whilst ensuring minority communities gain access to basic cybersecurity knowledge.
Expect to hear about the following points in the full-length episode:
- The nature of surveillance in the digital age, from target groups to modern tracking methods
- The level of cybersecurity awareness journalists have, and the most powerful means to get them to care about digital threats
- Matt's experience-based educational approach and a summary of the most important takeaways from his training courses
- CryptoHarlem's founding story and mission to strengthen digital privacy and security in minority communities using encryption
- The Ford Foundation's efforts to drive social justice and support NGOs in tapping into their potential whilst securing their digital assets
Want to hear more insights from the top cybersecurity experts of our time? Check out the previous episode of under CTRL featuring ex-NSA engineer and co-founder of Glacier Technologies Alex White – and stay tuned for more episodes on Spotify.
Balazs: Hi! Welcome to "under CTRL". My name is Balazs Judik, and today's guest is Matt Mitchell, who is a Tech Fellow at Ford Foundation and founder of CryptoHarlem. Hi Matt! Nice to have you here. How are you today?
Matt: I'm great! I'm happy to be here.
Balazs: Lovely! Lovely. Alright. Listen, there are a lot of projects that you are involved in. Could you give me a little background of yourself, before we kick off this whole dialogue today?
Matt: Yeah. I mean, I'm a human rights defender and a computer hacker. So that's who I am. I founded a thing called CryptoHarlem. I'm a Tech Fellow at a philanthropy called the Ford Foundation. And I also work to train, you know, folks who are working to make the world better.
Balazs: Well, that's quite an impressive set of different areas that you're working on. Could you give me a little background of- how did you get to this point, where you're working on three different areas, helping a lot of people? How did it start with you? Or what was the motivation behind it, when you were, say, in your early twenties?
Matt: Oh, yeah, sure. Well, I'm the child of immigrants from the Caribbean. My dad is from the island of Grenada, my mom is from the island of Trinidad. You know, growing up, we moved from the UK, where I was born, to the US, which is why I sound like this. And when I was a kid, the island of Grenada, my dad's island, was involved in a three-day conflict with the United States, as a war. It didn't last that long, it's a small island. And I learned a lot from that experience, about what it's like when truths are not the truth, and what you see on the news can't be trusted, 'cause it's not the same as what you hear from family and friends on the phone. And so I really became sympathetic to folks who are kind of being perceived differently or watched and things like that. And then- when really I got a job, it's like one of my first jobs. And it was this IT type job, you know, like just setting up machines, and I think we were rolling out Windows NT or something, I don't remember. And about like two months into the job, they were like, "Hey, let's gather us all into a room. And then like let's talk about what your real job is." And I was like, "Okay, I'm intrigued. What do you mean, real job?" And then they said, "You're here to help us monitor employees. And we're going to be watching what they're doing and how they're doing it electronically on their machines. Especially those with foreign passports." And I was like, "Whoa! Whoa, is this legal?" And they said, "Yeah, it's in their employee agreement. Maybe they didn't read it. Maybe it was fine print, but it's there." So I was like, "I can't believe this. Like this is a thing." And no one I knew outside of work knew this was a thing. So like-
Balazs: What year was this though?
Matt: This is in the 90s. Early 90s, you know.
Matt: And so like this is a common thing. Like every corporation does this. It's to protect the corporation. Sometimes it's in the marketing department, sometimes it's in PR, sometimes in- it's IT. But that's where the watchers live. And maybe there's a corporate investigation unit or due diligence unit, but they got some hackers up in there. And so, you know, I thought, "I got to get out of this. This just doesn't fit well with my personal morals." You know. But the key words in my resume, I didn't realize, they were just perfect for another- you know, a surveillor. So I got another job watching people, you know. So I was like, "Oh, this is really frustrating."
Balazs: What were the things that you- you know, the company was asking you to look for? Were there any specific measures that you were told to do your job for?
Matt: It looks like endpoint protection, but it really focused on the humans. So what that looks like is like you'll put something on the machine that will monitor things like the size of the downloads, the size of the uploads, what network activity, every website that's been visited. It also does keyword searching. So it's- it could be a keylogger if that is audited, if that flag is switched, right? Because that's a huge amount of data, so they don't want to collect too much. And yeah, that- and then it was a file that we could just pull up and see on a- basically like a SIEM, you know. So we could see basically like a monitoring log, monitoring software. And then we could silently report it. It was like, "Hey, look, if the needle is in the orange, the computer is telling you there's probably something in here. You as a human, just get another pair of eyes, and then send it to division," or whatever, you know.
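The threshold-and-keyword triage Matt describes ("if the needle is in the orange, get another pair of eyes") can be sketched in a few lines. Everything here, from the field names to the keyword list and the byte limit, is invented for illustration and is not the actual product he worked with:

```python
# Toy sketch of endpoint-monitoring triage: flag a session for human
# review when traffic volume crosses a limit or a watched keyword
# appears. All field names, keywords, and thresholds are hypothetical.
WATCHED_KEYWORDS = {"confidential", "prototype"}
UPLOAD_LIMIT_BYTES = 50_000_000  # the "needle in the orange" threshold

def flag_sessions(log_entries):
    """Return the entries a second pair of human eyes should check."""
    flagged = []
    for entry in log_entries:
        too_big = entry["upload_bytes"] > UPLOAD_LIMIT_BYTES
        keyword_hit = any(
            word in term
            for term in entry["search_terms"]
            for word in WATCHED_KEYWORDS
        )
        if too_big or keyword_hit:
            flagged.append({**entry, "reason": "volume" if too_big else "keyword"})
    return flagged

log = [
    {"user": "a", "upload_bytes": 1_000, "search_terms": ["lunch menu"]},
    {"user": "b", "upload_bytes": 80_000_000, "search_terms": []},
    {"user": "c", "upload_bytes": 2_000, "search_terms": ["prototype specs"]},
]
for entry in flag_sessions(log):
    print(entry["user"], entry["reason"])
```

The shape is the point: the machine only ranks activity, and a human decides what actually gets sent on to a corporate investigation unit.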
Matt: And one thing I learned was that the folks who were flagged, some of them, you know, like when- in the time I was there, they just- they weren't working there anymore. I'd go to their lab and it'd be like, "What happened to this scientist?" And they'd be like, "Yeah, he's not here anymore."
Matt: Yeah. So I mean, I think the impetus for- with things like sexual harassment cases and things like that, things where the company were just taking like- they had no idea. But then slowly given the power to see into all that, they were looking at performance, they were looking at things like: well, this person appears to be the hardest-working person, are they really?
Balazs: But was it rather-
Matt: Is it industrial espionage?
Balazs: Was it rather a matter of precaution? How they tried to like put it? This whole surveillance aspect, or...?
Matt: Yes! That's exactly- that's exactly why. It's to prevent against: one, maybe this person's falsely accused of a human-related crime. Two, maybe there's something installed on their machine that's doing things and they're not really doing it. Three, maybe this is an insider threat, so how can we know? And so- but then given the tools that you have some basic evidence on never to be used, "break in case of emergency"... emergencies are relative, and slowly smaller things become emergencies. And then all of a sudden, you know, there's an uptick in what my team was doing, you know.
Balazs: Okay, alright. So you went for the second job, where you ended up pretty much in the same shoes, where you'd just started off. What then?
Matt: Well, luckily the contractor that I was working for had a contract with the internet company America Online. They would mail people CDs. This is how you'd get a- this is how you'd get the card. This is how you'd visit the walled garden of the internet. And so I started working there. You know, I'm a software engineer, and I'm a computer hacker, so... they needed me to just write code, which was a lot easier and felt a lot better. And in my time there, I met folks. It was- who were doing like news- blogging was really new. There was just this new company they'd acquired called Weblogs Inc. And they were like, "Oh, blogging is the future!" And it was a huge boost to their traffic. And it got people so excited that Arianna Huffington came in. I'm like, "Oh, this is great!" There was an idea for a blog called the Huff Post, it's news. Boom! Next day I know I'm working for like a news org.
Matt: And yeah, that led to me really liking it. I liked working at a news org. I liked- there was a blog called Black Voices that was about African Americans. I liked that. And so I started working in media after that. My next jobs were all in news. So since that time, I've worked at Time, I worked at CNN, I worked for a bunch of different Turner outfits. I worked for the New York Times. So, you know, I really enjoyed being in a newsroom. One, I get to like use my code to do stuff that I felt was like highlighting truths and like telling people stories, and making this world better that way. And the other part was like, you know, someone would be like, "Hey, I'm working with a secret source, and like they're gonna write me this email. You said something about that? It's not good, right?" And I was like, "No, don't do that! Let's sit down and talk." And then we'd have like a brown bag lunch, and me and one reporter would quickly become me and two and three. And there'd be this small little crowd. And I was like, "Wow, there really is a need and an interest in what I've learned and acquired over time: how hackers steal stuff, how to be private and how to be careful with our data." So that was really cool.
Balazs: Did you feel like there was a gap in the journalist knowledge about how to protect their privacy and the topics that they're covering? Or...?
Matt: Yeah, a hundred per cent! I would say your average journalist- you know, you go to J-School, you get your degree, you get your, you know, whatever. You start applying for jobs. You're a freelancer at first. There's a lot of pressure to just file stories from a lot of different editors and make- build those relationships. You might be sending the ship- you know, pitching the same story to five different news orgs, hoping somebody'd buy it. It's a hustle. You finally get a job. You're working in the news... you start- you enter into a CMS system, like- it's kind of like a super WordPress type thing. You start writing articles. Next thing you know, you're in the paper that people read every day. At no point in that journey are people like, "Let's talk about encryption," you know? You barely- your average journalist is a strange animal where the motivation- time is the enemy. It's not really a team sport. You're- you work and you enjoy your colleagues, but you're just trying to get your name, your byline, you know, "the story's by me" on the next bigger story, on the next one, and... you know, there's not a lot of room for things that slow you down, like being private and being safe.
Balazs: Now, why would they? I mean, is there a specific segment of journalists that you would refer to who really need to protect their privacy or their identity or the stuff that they work on or...?
Matt: Well, there's an idea from the outside that if you're doing like National Security or, you know, you're covering corruption like in the government or something, then you need to secure your stuff. But that's not really true. Actually, what happens is people who don't want that truth told and are sitting in seats of power, they can manipulate stuff, they can hire hacker mercenaries to come after folks. And hackers are lazy, you know what I'm saying? Like that's my people, but we're lazy. And we want to do the easiest thing first, you know. You do a contract, and you get a completion bonus based on getting it done early. Otherwise hackers would be like, "Yeah, it's taking an hour. An hour. It's taking another hour." You know, so like- it- no, it's like look, there's no honor amongst thieves. They know how it is. It's like: finish early, get more. So you're not just going to like drag things out. Soft targets help you finish early. So instead of focusing on that very famous reporter, you're focusing on someone who works in culture, someone who works in metro, someone who is doing stories about children's toys, 'cause they're not thinking you're coming for them. And once you get into the Content Management System, you're in the Content Management System. Once you're in the machine that everyone plugs into, you can hop to the real target. And so actually every journalist needs to know a little bit about this stuff.
Balazs: Well, and how do you start to go about this? Because I feel like it's quite a large chunk, if you want to change the way a global community of journalists is thinking about security and privacy, especially because, as you say, if there is a- the weakest link of the chain effect in this whole situation, then everyone needs to be protected equally. Not just the ones that are most affected by a potential threat.
Matt: Yeah, and this is where we make a difference from individual security to organizational security. You know what I'm saying? So it's like, "Look, you're not in the Tundra with a wolf and a stick, on your own. You're in this organization, you work at a company with hundreds of other people. If your security is too high, fine I'll back away. That fence is much easier to climb over."
Matt: So now, it becomes like a weakest link story, where the security of everyone is equal to the laziest person who knows the least. And that's a fix that needs to be done on the organizational side, not the individual side. So you approach it totally differently. It's not like, "Yo, you shouldn't install Tresorit or Signal," or something. It's like, "What is our security policy? Which is our understanding of where everyone is at. What is everyone complying to with like 80%?" And in the beginning, it's like, "Oh, we're all locking down our laptops and we know all that stuff." And someone at the back is like, "I'm not doing that." Right? So then you find the common denominator is very low. But that's a good thing. That's true, that's what you should find. And then you just make policies that move that up slowly and raise all ships. You know? So that's how you protect an actual news org, whether it's five investigators meeting at a coffee shop, or it's a huge paper of global demand.
Balazs: Yeah, and that sounds like something where you would like to change the culture of a company. But when it comes to individuals, and say there is a person with a high threat profile. Do you work with some of those people as well? I remember you mentioned it earlier. And what could be a potential threat for them? How do they live their life? How do they find you and stuff like this?
Matt: Well, before I was working at Ford Foundation, I worked for a private security firm. And they were- we were luckily one of a few that had like legit clients only, like NGOs, lawyers, journalists. It was really cool. And what we would do is: they would be like, "Look, this is what you do in the random, you know, chance that there is a shooting." And they would set off like firearms with blanks. They would have like controlled munitions and explosions and things, tear gas and all that stuff. And then they would explain like, "Your body wants to do a couple of different things. Run, maybe. Freeze. You know, wait for help. None of those are the right place to start. But that's who you are, you can't fix it. Let's start there and map you towards safety." And then they'd be like, "Okay, we do simulations here." And after I went to a couple of their courses, I was like, "I see." So I did the same thing. So I'd be like, "These are two laptops. This one, the camera is owned and I can see it from the second one." And lots of people tell you when you put a Post-It on your camera or something... but you've never seen it. And when you see that, you're like, "Whoa, that remote access tool, I can see it next to this one." And you have both laptops in each hand, and you're like, "Wow!" That's a moment you don't forget. When I'm like, "Look, this is how you spoof a phone number. We're gonna do this process, and I'm- you're gonna call yourself from another number in your contacts." And then your phone rings, and it says: Oh, it's my partner. But it's really just you. You're like, "Whoa, I'll never trust a regular calling app again." So, you know, you got to make it visceral and real, because people- you got to meet them where they're at. And this is just an extra thing to do on a long list that's important. So if you don't get that moment that's like, "Oh, wow! This is real!", you'll lose them. And that's what I think is the best approach.
But you have to do it ethically, you got to get it with their permission. Also, you got to have the skills to know how to do that.
Balazs: Yeah, right. And what led you to acquire these skills? Because there's a big gap, starting corporate, figuring out that surveillance is not your thing. You develop some kind of distaste towards that, and then you move to the news outlet. And you met a lot of journalists, who were part of this whole big threat that's out there. What happened in between? How did you acquire the skill set that you can help these journalists with?
Matt: That's a great question, man. I acquired the skill set to help journalists, becau- in a way that- you know, I was a kid and we had just moved here and... everything was like a little bit different. The school system was a little bit different. People talked a little bit different. And they rolled out this machine, and it was the one computer for the whole school. Now every student gets a computer in a lot of schools. We had one computer for the whole school. And it was a Commodore PET computer. And I was like, "What is this thing?" You know, and they showed us how to like use basic programming, this- I think it was Logo, you draw and stuff, with a turtle thing... And I was like, "What?!" And my dad, he worked on the train yards. And he'd always bring stuff home and take it apart, and I'd take apart my toys while he was doing that, pretending that I was him. Then he put the stuff back together, and my toys would be like forever in disrepair. But I learned that like inside everything is something. And sometimes even cooler than the dinosaur is like how that gear makes it walk, and the springs, and all the other stuff. And I was like, "What's inside of this computer? How does it work? There's not springs in there." And I was captivated. And- so my folks- I was like, "Yo.." They went, "What do you want for the Holidays?" And I said, "I want a computer!" Which was unattainable for them at that point. We'd eventually- we got one. But- I ended up building one. But they were like, "How about a magazine subscription?" 'Cause they thought it was like fishing. "You want to be a fisherman? Here is a fish magazine." But I was actually super frustrated. People with their computers, you know, so. But they had- it was called Compute. And in it, they had this machine language, where you would just like enter in this code, and I would like look at it and enter it into graph paper.
And I'd draw my little cardboard keyboard, and I'd tap on it and pretend I was on my keyboard, you know. Wait for the next one to show up. When the mailman showed up, I'd be like, "Oh, another Compute mag." And a kid in my neighborhood, this dude Ian, he had a computer, an IBM Junior, whatever. And his- he was like, "Listen, my dad's got a computer in his like little business room in a little library." And I was like, "What?!" And I got on there, and he's like, "How are you doing all this?" And I was like, "Doesn't everyone know this?" And I was running with weights on for so long that I just became like super computer-skilled. And that led to my interest in tweaking the programs, changing them, changing other people's programs, next thing you know you're hacking the programs. I met some hackers in school. They were like, "Yo, this is how you can get stuff." And I was like, "Oh, this is cool. How do you guys get that game with that software?" And they were like, "Oh, you can just take it. You know, this is like- you just snap your fingers and you pull it out of the air." And I was like, "Fascinating, but it doesn't feel like it's the right thing." And I realized I had a career-ending injury as a hacker, which is empathy. Like I care about other people. If you fall and trip, like I don't laugh. I want to pick you up. You can't be like that with a hacker. You know, you have to have a very limited world view and position on what's right or wrong, and it really- and to yourself. So they were like, "Yeah, you got skills, but you're a bummer to hang out with, Matt."
Matt: So I- luckily, I was on Long Island, and there was this magazine called 2600. It's a hacker magazine. And you could find it on some magazine- news- you know, where the newspapers were. You could not- it seems like someone was just leaving them there. Like it's- you know, the people like- I think we saw this, you know, (inconclusive). And I would really covet the cover. And I finally went to a meeting, and the people didn't look the way I thought they would. You know, they'd be like, "I'm the superhackerman3000." And then you'd go there, and it's like- this like super, you know, hermit-looking dude. And they all went around and introduced themselves. But when they got to me, they skipped me, 'cause they didn't think I was there for the meeting. They thought I was just some like black dude hanging out at McDonald's over there, you know? And then I thought, "Yeah, like, this is like a comic book. Why would you want to let everyone know your secret identity? I like being a fly on the wall." And that's how I just kind of got into this thing. And I just built my skills from there, and I was like, you know, "One day, I'm going to need this to help other people to just keep them safe from the people at school. To keep them safe from like, you know, people who were, you know, really like being wizard from the future, in a world where everyone leaves the doors unlocked." So I was like, "One day, folks are gonna need help." And that's how I got into it.
Balazs: Yeah. And what do you see as the biggest challenge when you are working with say journalists? Because you said something like you got to meet them at- where they are. But is that the biggest challenge really? Or is it just them being not aware of cyber threats, privacy and so forth? Or wh- how do you see this?
Matt: I think the biggest challenge- you know, like I literally would meet them where they are. So it's like at their desk, in their village, in their country, whatever. Is- they have a- they're not risk av- they're not like a risk-averse community. Like they jump and run towards the fire. They run to- towards the noise. That's what makes you a good reporter. And that's really nice. It's like working with someone who is like really easily coachable. Most people just don't understand how news is made. They don't understand the pressures of being a reporter. And then they're overloading reporters with stuff they would never- it doesn't work. So I think like the biggest mistake is, no matter what the community is you're trying to safeguard or secure, you need to spend some time getting their trust and understanding like they're experts at something. They're geniuses at something. You need to learn that thing, and all your lessons need to be steeped in that language, and that workflow, and those pressures, or no one's going to keep them up. They keep them up for a couple of days, couple of weeks, but it won't really lead to a trajectory change where they end up. You know? They'll just undo it, just as fast.
Balazs: Could you give me like three examples of the most frequent pieces of advice that you need to give to these journalists? Like it can be very practical as well, to...
Matt: Yeah, yeah, yeah.
Balazs: ...keep your passwords like ten digits, or use this tool for messaging, or use Telegram or whatever? What would be the top three pieces of advice?
Matt: Yeah, yeah. (inconclusive) Well, I usually- they have a lot of knowledge that they've acquired from non-professionals. Some of it is great, some- it doesn't work at all or is horrible. So you got to empty that cup. And I'm just like, "Okay, look, what are you doing? You're taking notes, right? Are you taking notes with a pen and paper?" They're like. "Yeah." "Okay, cool, you're secure." They're like, "I am?" And I'm like, "We're hackers, not cat burglars. Yeah, get a fireproof, waterproof bag – those things are pretty cheap – and put your stuff in there. Get- you got one with a lock, or you can put a little thing on it, you're pretty good. Like now someone needs to like show up at your house." Which does (inconclusive). That does happen to a lot of reporters in different parts of the world. "And then you got to get two bags. Put one with some, you know, very boring reporting under the bed or under a floorboard. People will search 'til they find it. They'll stop there. You know, the real one, is like deeper down, you know, somewhere. That's cool. But once it's electronic, it's fair game. That's my world." So that's something they're starting to understand. The separation between what is really hackable and what becomes a physical problem. So I was like, "Anything you can offload to the physical world might make sense for some type of reporters." The next thing is that you will need to use- like if you're taking notes electronically, I tell them to use an encrypted note-taking app. You know, in the last lessons I'm like, "You have an app for everything." Encrypted note-taking app. But the first lessons, it's- like Standard Notes is for example one of my favorite encrypted note-taking apps. It's on iPhone, Android, PC, Linux, Mac, it don't matter. It's free, if you want it to be. You pay and it gets better, you know. But for the beginning I'm like, "This is the only tool. It's called Signal. You do everything in there." 
And I teach them like, "You need to master this tool. And once you master one thing, you can go to the next thing. And don't have a multitude, have mastery instead."
Matt: That's the hard thing for them to learn. They don't realize. They're like, "Oh, use this one and this one." It's almost like they become collectors of all these- and I'm like, "No. You need to understand the core concepts." Like I was like, "Note to Self is a feature in Signal, you put your own phone number in, and you can write your own encrypted message to yourself." People don't know that. They've never seen it before. And I'm like, "That's your note-taking app. You know, that's where- move a file in there. Now you move the file to an encrypted part of your machine." You know, things like that. And I'm like- I also showed- told them that having it on twenty different things like, you know, different laptops, different phones, is not a good idea, because that's twenty different things you need to safeguard.
Matt: Just keep it on your daily driver. Keep it on that daily used regular phone, and really just know everything about it. How do I verify that the person's number is really them? What does this QR code mean? All that stuff. Before you move on to the next tool. And that might be a year, like, no lie. So I'll be like, "Let's spend twelve months on this." And they'll be like, "Twelve months?" And I'm like, "Yes. That's mastery level." You got to be using the same thing over and over as everything. It's not- if it doesn't work well, it's not a Swiss Army knife. Or it is, maybe, because- no- not disparaging Swiss Army knives, but everyone knows: You have ten tools that barely work, instead of one tool that only works on one thing. So, you know, Signal's great at, you know, being a simple, easy, no menus, no complicated things, for beginners to try to step into. But then I quickly move them on to the next thing and the next thing.
Balazs: Yeah. Right. So you're building it up altogether, I imagine?
Matt: Yeah. Yeah. Eventually you end up with the right tool for the right situation. Like if you open up the cabinet, if you're not feeling well, you know, a medicine cabinet, if they only have Aspirin for your headache or, you know, Paracetamol or whatever, we call it Tylenol here. You know, like if you only have that, you will take it for everything... A bullet wound, you will take it. A headache, you'll take it. You know. Nose pressure... but eventually, you're like, "This one's what I take for this. This one's what I take for that." You know. This- "I cut my finger, I put a Band Aid on." You know what I mean? You're not rubbing Tylenol in there. But in the beginning it's one piece of medicine, and eventually you become a doctor with like a tool bag.
Balazs: What would be the next step? Like say a mid- or more advanced way of working? After let's say you master Signal. What would be the next one or two steps?
Matt: The next step I teach them is like how to encrypt their device, so if it's lost or stolen, or, you know, a hacker picks it up, it's useless. Right? And in that lesson is also teaching them some basic forensics. Like, "That password you type in every morning means nothing." They're like shocked. They're like, "What?" And I'm like, "That's permission to use the operating system. I don't want to use your operating system. I want to know what's in your C drive. I'm a hacker. I don't need to log into the front door. I could just plug a cable into your computer and drink all the data out." And they're always like, "What?" And I'm like, "Yeah, in fact: Macs, they have a thing called Target Disk Mode, you can just turn- hit a key, and it restarts as a hard drive." And they're like, "What?" So yeah, I'm like, "Yeah, this is all real." And then they're like, "Well, why do they do it like that? And how do I fix it?" And they're asking the right questions, 'cause they're journalists. They're always asking questions. And you explain like, "This is how we fix it. This is why it's- we're going to full-disk-encrypt your machine. Now when you lose it, it's useless." And then they're like, "Great!" Then I'm like- they start learning like, "Oh, so my password actually means something now. My password is the name of my kid. That's probably not a good idea." So with that, password managers and having something that's not even known to you, it's known to the program that you use, becomes the next lesson. And eventually, you kind of end up with: What are they touching the most? Do they use Microsoft Word the most? They're using a laptop the most. They're using a- some software to file a story to the- you know, let's lock down those things. So I look- I really start to, "Yeah, I'm just going to hang out with you. My first day, I'm just going to time what you're using the most. Maybe ask you a few questions." And I stay really quiet.
And those are the things you lock down next, and the order you lock them down, in what they're touching the most. Before you start giving them like strange things they never thought about. But eventually, they start thinking about like, "Okay, I need to store my files some place. When I send my files, they're not storing it this way, are they?" And I'm like, "Yeah, they're not." And then they're like, "I need to talk to my editor. I need to talk to the desk." You know, like they start getting- they understand. And it's about building that curiosity in that- if you start off with: The world's out to get you, it's hopeless. Then that's not- where are you going to go from there? So I teach them, "There's this invisible stuff on your hands. It's called bacteria. They're little invisible monsters, they make you sick. If you use soap and water, you can wash your hands, and they go away. If you use the same door of your roommates, and they have these monsters on their hands, 'cause they use the bathroom..." And you're like, "Ew!" "Yeah, they have to wash their hands, too." And they're like, "Oh, I got it! So I need to take small precautions. I get it. (inconclusive) Everyone around me needs to use those small precautions, too, to help me really be healthy." And then, yeah, that's exactly- I just really talk to my community how this is building up to the next thing and the next thing. But there are things that are like, "This is cancer, and we haven't figured it out yet." You know?
Matt: Like there are things that, if there's a nation state working on a zero day exploit in a Mac, like... yeah, there's no fix for that. But like I said, hackers are lazy. And even when you have that tool, it's not the first one you use. You go with the phishing email first. You go with all that basic stuff first, 'cause it's faster and easier, and it gives you all the keys. And so, yeah, I mean like they start learning those lessons, and it's a good day.
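Matt's password lesson above – a kid's name versus something only the password manager knows – can be made concrete with a quick back-of-the-envelope comparison. A toy sketch (the word-list size and password length are illustrative assumptions, not figures from the episode):

```python
import math
import secrets
import string

# Guess space of a "name of my kid" style password: roughly one of
# ~100,000 common names/words, maybe with a two-digit suffix tacked on.
dictionary_password_guesses = 100_000 * 100

# Guess space of a 16-character random password from a password manager.
alphabet = string.ascii_letters + string.digits + string.punctuation
random_password = "".join(secrets.choice(alphabet) for _ in range(16))
random_password_guesses = len(alphabet) ** 16

print(f"example manager-generated password: {random_password}")
print(f"dictionary-style guess space: ~2^{math.log2(dictionary_password_guesses):.0f}")
print(f"random 16-char guess space:   ~2^{math.log2(random_password_guesses):.0f}")
```

The gap of roughly eighty doublings is the whole argument for letting a manager remember the password for you – and, once the disk is full-disk-encrypted, that strength finally gates the data itself rather than just the login screen.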
Balazs: Right. No, that totally makes sense. And I think if you look at privacy as a hygiene question, then essentially, we could summarize it as: you help people to wash their hands, and make sure that everyone else around them is washing their hands as well, right?
Matt: Yeah, and maybe taking a shower, too.
Balazs: Yeah. From time to time. Cool, great. What about CryptoHarlem? Could you tell me a bit about that? It's very interesting.
Matt: Yeah, I mean, I was working at the New York Times, and we were covering quite a lot of things around the world. And there was this case of a young teenager. It was right at my transition – I remember the case started when I was at CNN. It was- nobody was talking about it, but people used social media for the first time to really say like, "Shame on you, news, for not covering this story of a teenager in Florida, who was shot and killed a block away from his home, when he went to buy some candy, some Skittles." Right. And national news was like, "Okay, what is this? People tweeting us? Facebook pages?" It's like a moment in time that you can't really replicate. So they brought this story to national attention. CNN sent people there. Now I'm at the New York Times, and it's the day the case is over, and the guy who shot this teenager was acquitted. And I thought, "Man, this guy, just a young black teenager, reminds me of a young version of myself." You know? He had a little hoodie over his head, you see this big smile on this goofy kid. Who could he have been? You know. Could he have been me? Could he have been my friend, my colleague at work? Anybody. They're just gone. And that really hit me hard. It hit everybody in the newsroom hard, but they were like, you know, "Dude, that sucks!" And I was like, "Yeah, it does suck." But for me, I could barely get through the day. And one of my other black colleagues was like, "Yeah, I'm really dealing with this in a weird way. I feel like a family member died." And I said, "Yeah, I don't know why. I didn't expect this." And I was like, "I need to do something." So I put that energy into teaching them folks in my community about surveillance. Because in this case, it was about community surveillance: community watch looks for crime, saw this guy, started following him, they got into an argument, a gun was pulled, and the kid was killed.
Now I was like, "Wow, like that reminds me a lot of like my life, of being- working with surveillance."
Balazs: When- (inconclusive) of corporate, like?
Matt: Yeah. So CryptoHarlem is really just teaching folks in the black community, like, "Look, this is how surveillance works. And there's digital surveillance, which is even more insidious, 'cause you never see it. You can't feel them following you. You don't, like, get a sense of those eyeballs."
Balazs: Right, but how would you go about the fact that, you know, there's also a lot of talk about airports introducing AI and face recognition. And a majority of these outlets are saying that they do it as a precaution. How would you evaluate that? Because as you say, it was community surveillance. Like how would you- what's your opinion? And just in general about this, like the good versus the bad.
Matt: Sure. Well, a couple of things I learned is a lot of these technologies are very complicated. And, you know, I understand machine learning and computer vision and the software that works on that stuff. And it would just bore you to death reading about that. But you have to know it, before you can realize that it doesn't work that well. And a lot of the marketing stuff is not really that good. You're basically training these machines to look for something out of what you've already fed them. And if all these faces look the same, and even if they're different, you know, like, the machine isn't that smart. It's actually kind of dumb. And it makes a lot of really horrible mistakes. And if the price of that mistake is you being a little delayed on a flight, that's okay, maybe. But if it can lead to you being imprisoned, or you're a terrorist, or you're on a no-fly list – well, then we shouldn't be using it. And it's really about teaching the folks and everyone I can, like, "this is how this stuff works," in a really basic, easy-to-grasp way. And then let's level it up. Like, I can explain it to a child the same way I explain it to someone with a PhD in, you know, Computer Science. And that stuff doesn't work, but the idea that we must keep people safe is a genuine and real one. You know, like, if we see a hundred people walk by, and someone tells you that next week, one of those people is, you know, Jason from the horror movies, and he's going to go to Camp Crystal Lake and kill a teenager, you're like, "Whoa, I should stop that!" Right? So you're like, "Yeah, okay. Well, how do we stop him? We don't know which one's Jason." Well, we could talk to them. We could interview them, we could investigate. We could follow them and see which one's heading to Camp Crystal Lake. But with the power we have, much like when I started working on all this stuff, we can just read their emails. We can just listen to their phone calls.
We can just, you know, know everything about all one hundred. All their hopes and dreams and fears, and what they're going to do. And is that the price – violating everyone's privacy and safety and, like, sense of agency – to protect against this one teenager's death? Right. And then, okay, let's say this. Let's say you feel, as a society, that it is worth the price. You have a hundred. That's a lot to go through. So let's just say: What about these other ones? Like, there's always a marginalized group. In every country – and I learned this from travelling the world – there's this group that people don't even remember why, but they're suspicious. They're othered, and their movements are just criminalized. And it's like, "Let's start looking at them first." And then you end up with not the same level of surveillance, where some people in the community are living in a free or somewhat free world, while others are living in a Terminator 2 dystopian world, because of the levels of surveillance all around them. And the criminalizing and othering that happens to them is over basic behaviors. And you have a society that doesn't trust each other, that turns against each other. And in that, it's like Batman, you know – to fight the Joker, the criminal, you're turning to a man you don't fully understand. You end up turning to people and tools that will end everything you care about. And that's what I learned. It's like a prison experiment. You and I are friends, but we decide, "Oh, let's play prison for an hour – you act like the guard and I act like the prisoner." At the end of the hour, you're doing stuff you never would have thought of, ever, because of the power that you're given. And because it's a silent, quiet power, it doesn't feel ugly. There's no blood on your hands. You know, the baton isn't weighing down your fist. And it's a dangerous power. That's one thing that I've learned.
And can we create a world where there's an acceptable level of risk? For this huge amount of freedom and agency. And that's why I do this work. 'Cause I say, "Yeah." You know, I think privacy is not secrecy. Privacy is curtains on your windows.
Balazs: Yeah, right.
Matt: You know what they say. Privacy is a door, so that you can let someone in and invite them into your home, and show them how clean your room is, you know. Like it allows you to have that.
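Matt's claim that these systems make "a lot of really horrible mistakes" has a statistical core known as the base-rate fallacy: when the thing you hunt for is rare, even an accurate system produces mostly false alarms. A hypothetical calculation (all the accuracy figures are invented for illustration):

```python
# Hypothetical face-recognition watchlist scenario (numbers invented):
# 1 person of interest among 100,000 travellers, a system that catches
# the real target 99% of the time and wrongly flags 1% of everyone else.
travellers = 100_000
targets = 1
true_positive_rate = 0.99   # chance the real target is flagged
false_positive_rate = 0.01  # chance an innocent person is flagged

expected_true_alarms = targets * true_positive_rate                    # ~1
expected_false_alarms = (travellers - targets) * false_positive_rate   # ~1000

# Probability that any given alarm actually points at the target:
p_target_given_alarm = expected_true_alarms / (
    expected_true_alarms + expected_false_alarms
)
print(f"false alarms per true alarm: ~{expected_false_alarms:.0f}")
print(f"P(target | flagged) = {p_target_given_alarm:.4f}")  # about 0.001
```

Under these made-up numbers, about 999 of every 1,000 alarms point at an innocent traveller – a tolerable cost if the mistake means a delayed flight, and an intolerable one if it means a no-fly list.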
Balazs: And what would you say: What's the outcome for the members of your community who attend CryptoHarlem and the trainings that you do for them? How do you see that? What happens with them after all that? How do you let them go off into the world? Like, can you name a few things – "Okay, they're more cautious of this, or they're more focused on that"? A few things like this?
Matt: Yeah, the outcome is like this: Generationally, they're scared. There are these ideas that we're being criminalized as marginalized communities, right? And now they're empowered, and they know what's real and what's not real. There's a lot of bogeyman-under-your-bed-type stuff, and those are myths. And with other things, I'm like, "Wow, it's amazing. It's very close to the truth. Let's explain how it works." But more importantly, let's explain who to talk to, who put it there, what you can do to push back against it. How it breaks, you know? Like that is something that leaves people hopeful, and feeling like they have control over this world that they've been forced to live in.
Balazs: Part of the effort you're doing here is also fighting against racism, and fighting for equality as well, if my understanding is correct?
Matt: I mean, yeah, that's why I work with the Ford Foundation. Like their mission itself is fighting inequality. Like that's it. It's a pretty interesting mission for a business, you know. And they do that work by supporting, you know, people, organizations... all around the world. You know, there's eleven offices around the world, most of them in what you would call the global south, you know, like India, and African offices. There are, like, South American offices, and... you know, these are places where that power dynamic and that push between who is deciding what the status quo is, and who's sitting in seats of power, and who's really just trying to get a boot off their neck to breathe, is very clear. And it's nice to be there, 'cause it's very much in line with my work at CryptoHarlem, you know. Like you're fighting racism, and you realize the connection to fighting sexism, and fighting, you know, homophobia, and fighting other things- sorry, I got a little New York City soundtrack.
Balazs: Don't worry about that!
Matt: But- and it's really great to speak to audiences who are ready for this message. They're never like, "I have nothing to hide." They're like, "Where have you been?" You know. Yeah, so, you know, the Ford Foundation supports these organizations by bringing them together. Sometimes introducing them to each other, convening them, which is great, because they wouldn't necessarily know each other, and they're stronger together. We also support them financially to the amount of 500 million a year, which is a huge contribution that can turn a small group of people who are trying to do the right thing into an NGO that's working in an office and really helping people the right way. It's great to do that. And my job there is strengthening the organizations' cybersecurity capacity. It's their understanding of this stuff, their appetite for it, and also knowing what small steps they can take to keep the organization from being wiped out by a cyberstorm.
Balazs: Is there a specific like segment that you're working with? Or a specific industry, or a specific geolocation that you're working with?
Matt: In my particular job, no. It's actually- I work with a small group by our numbers, maybe 370 different organizations from around the world that are part of this department called BUILD. And BUILD is about, instead of giving an annual grant and helping them with a project, giving them a grant for five years. Year after year, they know they're going to get the same amount. And looking at things they didn't have time to do before, like, you know, maybe hire someone. Maybe work on their leadership transition plan. Maybe look at governance and human resources. Things that we know, and we've identified, are important for an organization that's going to be around for generations, right. So, you know, a lot of times, these are directly impacted people, or someone who had a good idea and wanted to help, and it doesn't mean they understand financial resiliency, or all this other back-office-type stuff. And with BUILD, we help them with these decisions that they need to make, 'cause that will affect where they land in five years, and ten years. And with cybersecurity, it's the same thing. It's always something you kind of know you need to deal with, but only when there's a ransomware attack that's locked down every machine are you dealing with it. And that's probably when it's too late. You're on your back foot.
Balazs: So for you, is it more like the pre-attack phase? Or is it more like a post-mortem that you have to work on? Normally, with these companies.
Matt: I don't do post-mortems, which is great. Because that takes such a long time. To do it for 400 orgs, I'd be doing it for years and years. So what I do is: I teach them about the attacker mindset, and then organizational security, and what they need to implement. So I'll say, "Hey, do you have cybersecurity liability insurance?" And they'll be like, "What?" But that's actually really important. When you get a ransomware attack, and the hackers are asking for, like, you know, 10,000 dollars in Bitcoin for every five machines... well, you really want to be like, "I have a backup. I'm not gonna use that machine, it's fine."
Matt: And then they'll say, "Well, we also took some data. We learned people don't always pay. And you'll see some of it on this public website." Then they have your attention. And they're like, "We're gonna leak it all, all over the place, if you don't pay." And now you're being extorted. Now if you have insurance, they have someone who's a negotiator. And negotiators are interesting, because they obviously kind of understand both sides – that's how they're able to get the price down. But they'll do it. They'll get the price down for you. They also will like pay a certain percentage of the cost of the- what you'll end up paying to get your computers back.
Matt: And that could be something that allows you to actually keep running, or just stops you dead. But then the second thing I ask them is, you know, "Do you have a security policy?" You know, there's a website called USOAP.app – dot A-P-P. And with USOAP, you just fill out a basic questionnaire, and at the end of it... boom! You have a security policy. It's not a customized, super personalized one, but it's yours. And you can add things to it. And it's walking them through that journey. We've created this cybersecurity assessment tool, 'cause I was like, "Listen, I get this question all the time. And there is nothing that looks like the medical tent at a refugee camp, where you can just get a really quick, you know, diagnosis. This person is sick level 1, this one is sick level 2. And then you can have a basic course of support and medical care for them." So with these orgs, they can fill out this survey, which is on our website now. I developed this and brought in a team of, like, amazing cybersecurity rock stars, you know. We had Martijn Grooten and Trinh Nguyen, Runa Sandvik, who used to also work at the New York Times, Laura Tich, this dude Matt Hansen, you know. Like, we're working on making this stuff accessible, so orgs that don't have a lot of capacity can give us a little bit of time, and get a lot from it. And really adopt these things in a way that, as employees come and go, it's part of how your board works, part of how your documentation works, how onboarding and offboarding employees work. So it's there to stay.
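Matt's "I have a backup" answer to ransomware only holds if the backup is complete and verified before the attack. A minimal sketch of that small precaution – comparing a backup directory against the original by checksum (the function names and example paths are assumptions for illustration, not from the episode):

```python
import hashlib
from pathlib import Path

def checksums(root: Path) -> dict[str, str]:
    """Map each file's path (relative to root) to its SHA-256 digest."""
    return {
        str(p.relative_to(root)): hashlib.sha256(p.read_bytes()).hexdigest()
        for p in sorted(root.rglob("*"))
        if p.is_file()
    }

def verify_backup(source: Path, backup: Path) -> list[str]:
    """Return relative paths that are missing from or differ in the backup."""
    src, bak = checksums(source), checksums(backup)
    return [rel for rel, digest in src.items() if bak.get(rel) != digest]

# Hypothetical usage:
# problems = verify_backup(Path("~/documents").expanduser(),
#                          Path("/mnt/backup/documents"))
# if problems:
#     print("backup is NOT safe to rely on:", problems)
```

Running something like this on a schedule is the difference between a backup you hope works and one you know works when the extortion email arrives.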
Balazs: Yeah. So you're trying to create educational material that's pretty much automated and super simple and super adoptable for every kind of industry, every size of organization, all around the world. Is that correct?
Matt: Yeah, and in multiple languages. And it's a daunting task.
Balazs: That's fantastic!
Matt: Yeah, yeah. But yeah, thank you! I mean, but we're here to make it work. And obviously, you know, I don't have hubris. I understand: getting a doctor's exam in a Manhattan, you know, doctor's office in New York City, and getting the refugee-camp-type exam – you're not going to get the same quality of exam. But it's better than not starting at all.
Balazs: Right, but it's a start, isn't it?
Matt: Yeah. It's a step. We want to just get them on that journey.
Balazs: Right. Fantastic! Matt, listen, if there was some educational material, a book, so to speak, or an eBook or anything that you would recommend to anyone who is looking to start this journey, wash their hands, and think a bit more about their privacy and security, would there be one you would recommend?
Matt: Yeah, there's a couple. I would say this group Tactical Tech, they're based out of Berlin, and I used to work there as, like, the Director of Digital Safety and Privacy. They have this thing, it's a Data Detox Kit. So you go to datadetoxkit.org, I think. And it's just real plain speak, and it's in a couple of languages. And it's just like, "Hey, this is what washing your hands looks like." Basic hygiene, right? If you're an organization, or you're working in cybersecurity, and you want to understand what NGOs face, read the Dark Basin report from a group called Citizen Lab. It talks about four years of research, and cyber mercenaries, and all this other stuff that's not what normal businesses would face. And if you're interested in black folks and marginalized people, and how surveillance has always been part of that community – which is great, because it teaches you about surveillance for all marginalized communities – read Dark Matters by Simone Browne. It's about the surveillance of the black community, and easily adaptable to anywhere in the world. And you learn from, like, the 1800s to eight days ago, basically.
Balazs: Wow! Very impressive! I'm super keen to get those books and read them, and we're going to link them in the description. So, if anyone is looking for the exact links, they can find them there. Super! Thank you, Matt! It's been a pleasure! I think anyone who listened in got quite a lot of knowledge and quite a lot of things they can get started with. They know the place, they know the whats and whereabouts. So I think this was quite a useful, educational piece that you put together and explained today. So thank you for that!
Matt: Yeah. Thank you for the platform! And thanks to the people who are listening to "under CTRL" for giving us some time, and, you know, just hearing my voice. I really appreciate it. Just- well, let's keep each other safe. Let's encrypt. And let's keep- you know, keep things moving.
Balazs: Fantastic! And if anyone has a question, we will also link your profile, or the way they can reach you as well. So, fantastic.
Matt: Yes, a hundred per cent! I'll get back to them.
Matt: Just not fast.
Balazs: Alright, thanks, Matt! Thanks. It's been a pleasure.
Matt: It's been a pleasure, too.