Cambridge Analytica Files: We are livestock on Facebook’s shelves
The Cambridge Analytica Files prove that Facebook’s business model is privacy violation by design. Here are the four biggest lessons I take from the unfolding story.
1. Facebook’s power over data is a threat to democracy. Regulations should aim to stop this kind of monopoly.
The Cambridge Analytica Files are yet another reminder that the sheer amount of data Facebook can access gives it incredible power over our lives. The potential to influence people’s behavior, countries’ elections and democratic processes makes Facebook far more than a social media company. Concentrating this much power in one company’s hands is a threat to freedom of thought and to democracy. This monopoly should be stopped, or at least strictly regulated. No company should have as much power as Facebook does.
2. We need a technology paradigm shift: from maximizing the data available to companies to minimizing it.
Besides regulatory means, there are technologies that prevent companies from accessing and harvesting people’s data. Privacy-by-design technologies that hide user data even from the services themselves, such as zero-knowledge end-to-end encryption or differential privacy, are fundamental technological building blocks of a democracy. They limit the power of companies and give control over data back to its owners: citizens and consumers. As long as companies hold your data in readable, easily accessible form and build their business models on selling it, you are not a client but simple livestock. Their clients are the advertisers, be they companies, organizations or political parties.
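To make “minimizing data” concrete, here is a minimal sketch of one of these techniques, a local form of differential privacy: every user adds calibrated noise to their answer before it ever leaves their device, so the service can still compute aggregate statistics without being able to read any individual’s true answer. The function name and the epsilon value below are illustrative assumptions, not any particular product’s implementation.

```python
import numpy as np

def privatize(true_value: float, sensitivity: float = 1.0, epsilon: float = 0.5) -> float:
    """Laplace mechanism: add noise scaled to sensitivity/epsilon so the
    individual's true answer cannot be recovered from a single report."""
    noise = np.random.laplace(loc=0.0, scale=sensitivity / epsilon)
    return true_value + noise

# Example: 10,000 users each answer a sensitive yes/no question (1 = yes, 0 = no).
true_answers = np.random.binomial(1, 0.3, size=10_000)

# Each user sends only a noisy report; the service never sees the raw answers.
noisy_reports = [privatize(float(a)) for a in true_answers]

print("true share of 'yes':     ", true_answers.mean())
print("estimate from noisy data:", np.mean(noisy_reports))
```

The point of the sketch: the aggregate result stays useful, while every single record the company stores is already useless for profiling one person.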
3. It’s not a Facebook “data breach”. This is Facebook by design.
The personality quiz app that Cambridge Analytica got its data from used Facebook’s “Friends Permission” feature. This gave developers access to the consenting user’s friend network as well, and those friends had no idea their data was being collected. That is how Cambridge Analytica could harvest the data of 50 million people. Although Facebook’s terms of use nominally restricted how such data could be used, Facebook made the feature available and even promoted it. According to a former Facebook manager, it was used by many developers for many years without any consequences. Max Schrems already made a case against this practice in 2011, yet Facebook only terminated the option in 2014. Facebook’s Chief Security Officer, Alex Stamos, was right: this is not a data breach in which hackers broke into systems and accessed hidden information. This is how Facebook’s service was designed.
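To show what “by design” meant in practice, here is a rough, assumption-laden sketch of how the pre-2014 friend permissions could be used: one consenting user’s access token was enough to walk that user’s friend list through the Graph API and pull profile fields for friends who had never installed the app. This is not Cambridge Analytica’s actual code; the endpoint details, field names and the placeholder token are illustrative only.

```python
import requests

GRAPH = "https://graph.facebook.com"                   # public Graph API root
ACCESS_TOKEN = "<token of one consenting quiz taker>"  # placeholder, illustrative

def get(path: str, **params) -> dict:
    """Small helper for authenticated Graph API GET requests."""
    params["access_token"] = ACCESS_TOKEN
    resp = requests.get(f"{GRAPH}/{path}", params=params, timeout=10)
    resp.raise_for_status()
    return resp.json()

# Under the old permission model (friends_likes, friends_location, ...),
# a single user's consent exposed that user's entire friend list...
friends = get("me/friends").get("data", [])

# ...and profile fields of each friend, none of whom ever saw a consent screen.
# The requested fields are illustrative of what the old API made available.
for friend in friends:
    profile = get(friend["id"], fields="name,likes,location")
    print(profile.get("name"), "->", profile.get("location"))
```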
4. Regulations like the GDPR are needed to make data protection a must for all companies and restore people’s trust in digital services
If you fill out a personality quiz on Facebook, you have no natural expectation that your results will be sold and used as a cultural weapon to manipulate people into voting for certain parties. Yet that is exactly what the personality quiz app did when it sold the dataset to Cambridge Analytica. This is something the GDPR addresses and sanctions, giving citizens strong control over their data. If companies, be they smaller firms like Cambridge Analytica or giants like Facebook, misuse data in ways like this, people will lose trust in digital services. I’m sure there will be more stories and breaking news like this. If so, people will stop trusting digital services altogether, and the digital economy will collapse along with its many advantages. By imposing stricter data protection requirements than ever before, the GDPR is a huge opportunity to get companies to change their practices and so rebuild the disappearing trust of consumers.
Will I #deletefacebook now? My usage is already extremely limited. I never like or react to anything. I never connect my Facebook account to any third-party apps. I only occasionally post news that is relevant to my professional work. Should you delete it? It’s up to you…