You Trusted Facebook: Own Up to It
Facebook collected massive amounts of data from us, with our permission. When we think about regulating it, we need to ask: how much do we need to be protected from ourselves?
This story originally appeared on PCMag
The violent reaction to what’s going on with Facebook makes me think about, of all things, PETA.
Yes, the People for the Ethical Treatment of Animals — the vegan, borderline eco-terrorists who have spent decades screaming at us about how cruel our hamburgers are. They’re right, you know. Factory farming is cruel and heartless, our food system is a mess, and switching to eating mostly leaves would be better for all of us. I am about to go eat some chicken soup, and I am going to have some willful blindness about the conditions in which my chicken soup was created, because I have to live in this world and I can’t let it bother me too much.
So I survive, eating my chicken soup, vaguely knowing that I agreed to a complex system of oppression by doing it. And the same goes for Facebook, where everyone is shocked — absolutely shocked! — that data collection was going on here.
The current Facebook scandal started because a consultancy linked to political campaigns, Cambridge Analytica, used an academic’s personality quiz to suck down the personal details of millions of people who didn’t agree to take the quiz, and used that information to target them for political messages. Everyone agrees the loophole Cambridge used was bad, and it was closed in 2015.
The story has snowballed, though, into people realizing just how much data Facebook itself holds about them, and getting really uncomfortable with it. If you’re on an Android phone and clicked “yes” when Facebook asked to read your contacts and send text messages, for instance, it started collecting the times, dates and destinations (but not the contents) of all the calls you made and texts you sent.
Beyond that, Facebook uses our every scroll, click and “like” to assemble a full picture of our every proclivity, which it then pimps out in slightly veiled form to advertisers. Cambridge’s real sin was stealing Facebook’s trick without Facebook’s permission, but Cambridge just did what Facebook does all day. Those of us in the tech world have known for years that Facebook does this. It’s just what Facebook does.
As the old saw goes, “If you’re not paying for the product, you are the product.”
You agreed to this
You are not a pure victim here. You agreed to this. Maybe you didn’t know what you were agreeing to, but you clicked “yes” when the happy robot asked to suck down your contact book and insinuate itself into your phone. And yes, for most of you, it asked. If you clicked “skip” instead, good for you! Your black bean soup is just as tasty as my chicken, and much healthier.
You didn’t do the research, or you ignored the signs, not because you trusted or didn’t trust Facebook, but because you pretty much didn’t care. Unlike security researchers who are hyper-conscious of their personal data, you considered yourself to be someone who basically doesn’t matter: you have nothing to hide, so there’s no need to hide it.
And now you’ve been led on a tour of the pig farm, shown the sows in their tiny little boxes squealing in pain, taken on a romantic walk by the giant lake of hog waste, and you’re reconsidering your bacon. I also still eat bacon.
There’s nothing morally wrong, in my view, with reconsidering life choices when forced to face the things in which we’re complicit. The trouble comes when we paint ourselves as pure victims, mere dupes, and don’t face up to the willful blindness and bad choices that let us be duped.
The role of regulation
A lot of us are doing a lot of unhealthy things, and we probably aren’t about to stop, even though we know they’re unhealthy. I could go home tonight and post a picture of a big bacon cheeseburger to Facebook and I’d be hurting myself at least three different ways.
Society has decided that there’s an acceptable level of hurt that we, and the companies that supply us, are willing to accept. I can assume, for instance, that my bacon hasn’t been poisoned at the factory. I didn’t have to read through a EULA to know that.
If I wanted to, I could read some labels, pick the healthiest bacon, or try to convince my family to switch to Ello, but society promises me there will be a minimum level of safety in my supermarket bacon. On the other hand, once I commit to the bacon, I can legally eat it until I kill myself.
It’s time to move from shock to action. Let’s acknowledge that we want to socially network, and it’ll lead us to make some choices that aren’t healthy. Let’s acknowledge that we all make those choices; they aren’t forced upon us. Now let’s discuss where we want the guardrails, and when we should be allowed to give ourselves heart attacks.