Do not #DeleteFacebook because of Cambridge Analytica

Chris Saad
Mar 26, 2018

The ability to log into new apps with your Facebook or Google account and share some of your data is called data portability.

Data portability has become so fundamental to the way the open Internet works that it is taken for granted by most users and it is even enshrined in a new European law known as GDPR.

Today, however, the backlash against Facebook for its complicity in Cambridge Analytica’s use of Facebook data to influence the outcome of the 2016 US Presidential Election threatens to undermine and roll back some of the gains we’ve all made toward an open and interoperable Web.

Facebook is being forced to apologize, but I think there’s a lot about the backlash that misses the point entirely.

Why is this a big deal?

Data portability is an essential, hard-won right of Internet users.

This right is based on the premise that users, not Facebook, own their data. This is a powerful and important idea. Data portability means that Internet companies aren’t allowed to lock users into their walled gardens. Instead they should provide ways for users to quickly and easily share their data with new apps they might want to use. This allows independent up-and-coming developers to build new kinds of innovative products and businesses. Ultimately this makes our lives and the Internet a richer and more open place.

Crucially, this is all done with full control and permission from the user.

The flip side of this is that those new apps have to be trustworthy with the data they receive. If they are not, it’s those bad apps that are to blame — not Facebook or the big ecosystem of services that mostly treat your data with respect.

How does it work?

You’ve probably participated in this data exchange many times.

You’ve likely clicked a ‘Login with Facebook’ button on a new app. Then you’ve been presented with a screen that explains that you’re about to share your data. You then clicked ‘allow’ for the exchange to continue.

This login-and-allow process is how you ask a service like Facebook to share your data with a 3rd party app so that it can give you some kind of new user experience or value. The app might be a game, a new kind of social network, a news site, etc. The process is much the same for countless other services that hold a lot of your data and identity, like Twitter, Instagram and Google.
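Under the hood, this handshake is (in Facebook’s case) an implementation of the OAuth 2.0 authorization code flow. Here is a minimal sketch in Python of what happens on the app’s side. The endpoints follow Facebook’s documented Graph API OAuth flow, but the app credentials, redirect URI and requested fields are placeholders, and a real integration would use a current, versioned endpoint:

```python
import secrets
from urllib.parse import urlencode

import requests  # third-party HTTP client: pip install requests

# Placeholder credentials; a real app receives these when it registers with Facebook.
CLIENT_ID = "YOUR_APP_ID"
CLIENT_SECRET = "YOUR_APP_SECRET"
REDIRECT_URI = "https://example-app.com/auth/callback"


def build_login_url() -> str:
    """Step 1: send the user to the consent ('allow') screen.

    `scope` lists exactly which data the app is asking for; this is what
    the user sees and approves.
    """
    state = secrets.token_urlsafe(16)  # anti-CSRF value; a real app stores this
                                       # in the session and checks it on return
    return "https://www.facebook.com/dialog/oauth?" + urlencode({
        "client_id": CLIENT_ID,
        "redirect_uri": REDIRECT_URI,
        "state": state,
        "scope": "public_profile,email",  # only what the app needs
    })


def exchange_code_for_token(code: str) -> str:
    """Step 2: after the user clicks 'allow', Facebook redirects back to
    REDIRECT_URI with a one-time code, which the app trades for a token."""
    resp = requests.get(
        "https://graph.facebook.com/oauth/access_token",
        params={
            "client_id": CLIENT_ID,
            "client_secret": CLIENT_SECRET,
            "redirect_uri": REDIRECT_URI,
            "code": code,
        },
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["access_token"]  # recent API versions return JSON


def fetch_profile(access_token: str) -> dict:
    """Step 3: the token unlocks only the fields the user consented to."""
    resp = requests.get(
        "https://graph.facebook.com/me",
        params={"fields": "id,name,email", "access_token": access_token},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()
```

The important detail for this discussion is the scope parameter: the consent screen shows the user exactly what is being requested, and the resulting token unlocks only those fields.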

But didn’t Facebook sell all our data to Cambridge Analytica?

No. Sadly, much of the Cambridge Analytica controversy is the result of conflating many related but separate issues, inciting a lot of fear, uncertainty and outrage.

Below, I will attempt to tease out and clarify some of these separate issues as succinctly and clearly as I can.

First, some history

Above I wrote that data portability was a hard-won right. While a user’s right and ability to share their data is taken for granted today, in 2007 it was neither obvious nor inevitable.

While there were small groups of passionate, independent engineers working on great technology to make data portability possible — technologies like OpenID, OAuth and Portable Contacts — users were not aware of the value of their data and companies were unwilling to let their users port their data to other services.


To address this, I co-founded the DataPortability project with some of my friends. It was a non-profit initiative that worked to educate users about the value of their data while pushing social networks to give their users the ability to connect, control, share and remix it for their own purposes. It reached its peak when, in January of 2008, we announced that Google and Facebook were joining the initiative.

To put a fine point on it, my friends and I were outspoken critics of social networks like Facebook for being one-way data sinkholes. So my defense of them here might come as a surprise to some.

Thanks to its activity, the DataPortability Project helped to move the conversation forward in meaningful ways. It popularized the term data portability, raised awareness of its value, fired the imagination and forced uncomfortable conversations about user rights and data ownership between executives, technology leaders and journalists.

Ultimately it directly led to the creation and adoption of features like the ‘Login with Facebook’ workflow I described above.

Selling data vs. data portability

Right now, many people are complaining that Facebook sold their data to Cambridge Analytica. This is not true.

Rather, Cambridge Analytica purchased the data from another company that had built a Quiz app that went viral. The Quiz app had the data because it built something that many users decided was fun and interesting to try. Upon hearing about it, each of those users instructed Facebook to share their data (and some of their friends’ data) with the Quiz app. This was something each user asked for, and Facebook, in keeping with the spirit of data portability and user rights, did what it was told. Facebook did not, and does not, charge for this.

Further, the data was transferred from Facebook to the Quiz app under clear and standardized Facebook data usage policies, which state that the Quiz app (and any other 3rd party app that engages in the same process) must not store the data long-term or sell it to 3rd parties. These policies are intended, insofar as is possible, to protect users from unforeseen uses of their data.

So what, in fact, has happened is that Facebook users trusted a 3rd party app with their data, and that app violated their trust.

As with most things, there is a tension between freedom/rights and responsibilities. Data portability gives users the right to share their data with apps they trust. But it also gives them the responsibility to choose apps worthy of that trust.

In this case, the Quiz app violated that trust (and the terms of service that FB imposed on them). While this is terrible and should be cause for litigation, it should not — as has been suggested — be a reason to roll back Facebook’s support of data portability.

Data and Attention based business models

The issue of data portability has also been conflated with the question of how an app should ethically use your data once it has possession of it.

Many are upset about the notion that if an app is free, then it is likely selling you and your data to other companies — in particular, advertisers.

The cartoon above illustrates this point better than I’ve ever been able to in 10 years of trying.

This is only partly true. Crucially, selling your Data and selling your Attention to advertisers are two different things.

Selling your data involves companies handing large portions of their dataset wholesale to other companies without your explicit permission. To be clear, this is not data portability, because it is not done at the user’s request or with their direct consent. It is also not what Facebook did in this case.

However, it is exactly what the Quiz app did! It wholesale sold the data it obtained from Facebook (with user permission) to Cambridge Analytica (without user permission). This is against FB’s Terms of Service.

Selling your attention is different. Instead of handing your data to the advertiser, Facebook, Google, Twitter and the like provide tools that let companies upload and target ads at you based on your profile data; the advertiser never actually receives the data itself. This is common and accepted practice for free online apps (not to mention radio stations, TV networks, roadside billboards and magazines).
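To make the architectural difference concrete, here is a deliberately toy sketch. Everything in it is hypothetical (it is not any real ad platform’s API); the point is simply where the user data lives and what crosses the boundary to the advertiser:

```python
from dataclasses import dataclass

# A toy model of the distinction. All names are hypothetical; this is not
# any real ad platform's API.

@dataclass
class User:
    user_id: str
    age: int
    interests: set[str]

# The data the platform holds and never exports.
_USERS = [
    User("u1", 34, {"cycling", "politics"}),
    User("u2", 27, {"cooking"}),
    User("u3", 45, {"politics", "gardening"}),
]

def sell_attention(min_age: int, interest: str) -> int:
    """The match happens inside the platform: the advertiser submits
    targeting criteria and gets back only an aggregate impression count."""
    matched = [u for u in _USERS if u.age >= min_age and interest in u.interests]
    # ...the platform would render the advertiser's creative into each
    # matched user's feed here...
    return len(matched)

def sell_data() -> list[User]:
    """Handing the records themselves to a third party. This is the step
    that data-use policies forbid without the user's consent."""
    return list(_USERS)

# The advertiser learns only that 2 impressions were served,
# never which users they were.
print(sell_attention(min_age=30, interest="politics"))
```

The advertiser in this model only ever learns an aggregate number; exporting the records themselves, as the Quiz app did, is a categorically different act.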

Personally, I find selling data abhorrent.

Selling user attention via ads, however, is a fairly straightforward and time-tested business model. Many users understand the implicit trade-off they’re making. Even if they can’t articulate it in technical or economic terms, and even if they don’t know the full breadth and sophistication of the tools, they basically understand that they are getting free utility in return for seeing highly targeted advertising.

Neither of these things, however, is what Facebook did when it came to Cambridge Analytica.

Ethical uses of data, memes and targeting

Another aspect of this debate is the question of the ethical use of data that a company has legal and appropriate possession of.

Psychographic and demographic targeting, weaponized memes, and fake or exaggerated news are all ways in which your data, irrespective of how it has been obtained, can be used against you.

This is further complicated when foreign state actors get involved and democratic processes are materially affected.

Countless articles and analyses about this have already been written. My main interest here is to point out that this is a much broader discussion about what we want to allow in our societies and how we can protect against foreign actors influencing our elections.

This is about issues of free speech, marketing ethics, geopolitics and war. It is not — strictly speaking — about data portability or even Facebook.

It’s also about education, but that’s a subject for another post.

What can be done to get this right?

While, in this case, Facebook is not responsible for selling user data to Cambridge Analytica, there is always more they can do to better safeguard and educate users. Some of these things are alluded to by Mark Zuckerberg in his apology (included in the image towards the top of this post). They include…

  • Introduce new, lower tiers of data sharing for apps that only need user IDs but don’t need access to any of a user’s profile.
  • Better audit the data access rights each app is requesting and ensure they are necessary for the proper functioning of the app.
  • Regularly remind users of the apps they have connected to Facebook and encourage them to perform some housekeeping.
  • Look out for apps that are getting a great deal of traction and ensure they are acting in good faith.
  • Better explain to users how the data exchange works.
  • Separate the Login flow from the data sharing flows: encourage users to pick and choose the data they share only when it’s needed for certain app features. This is similar to how iOS apps have to ask for camera access only when they need it (thanks to David Weekly for this idea); see the sketch after this list.
  • Aggressively pursue companies that violate their data retention and resale policies to make the consequences prohibitively expensive.
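As promised above, here is a sketch of how separating login from data sharing could look from the app’s side: ask for identity only at sign-in, and request any extra scope only when the user reaches a feature that needs it. It reuses the placeholder credentials from the earlier login sketch; the scope names are illustrative assumptions rather than a definitive Facebook API:

```python
from urllib.parse import urlencode

# Placeholders carried over from the earlier login sketch.
CLIENT_ID = "YOUR_APP_ID"
REDIRECT_URI = "https://example-app.com/auth/callback"
AUTH_URL = "https://www.facebook.com/dialog/oauth"  # unversioned for brevity

def consent_url(scopes: list[str]) -> str:
    """Build a consent URL that asks for exactly the listed scopes."""
    return AUTH_URL + "?" + urlencode({
        "client_id": CLIENT_ID,
        "redirect_uri": REDIRECT_URI,
        "scope": ",".join(scopes),
    })

# At sign-in: identity only. No profile fields, no friend data.
login_url = consent_url(["public_profile"])

# Later, when (and only when) the user opens a photos feature,
# send them through a second, explicit consent screen for that one scope.
photos_url = consent_url(["user_photos"])
```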

What FB should not do, however, is remove (or make prohibitively difficult) the ability for legitimate, independent apps to access your rich profile data so that they can innovate and compete on an open Internet.

At the end of the day, though, there will always be some risk to users’ data. We must all take some responsibility for the apps we trust and choose to share our data with. Every right comes with responsibilities.

Finally, I’d like to suggest that we all need to choose our targets more carefully when it comes to outrage and mob justice. We live in difficult and uncertain times. We must carefully aim our anger and action so that the real enemies of progressive society are properly identified and dealt with, and so that those dealings are effective. Facebook has clearly stepped up to take ownership of its part in recent events. Have others?

Update: I’ve written a follow up piece to this about the broader trend of outrage driven by false narratives. The top 10 questions to ask ourselves before getting outraged by the news.

Special thanks to Nik Seirlis, Ben Metcalfe, Josh Elman, Ben Parr, Aliya Sana and Ashley Angell for providing feedback on this post.

