Will Losing Half a Billion Users' Data Get Facebook a Pat on the Back?
Privacy is very, very important. It's far too important to leave up to the whims of corporate executives, especially *Facebook* executives.
By Cory Doctorow / Pluralistic
Facebook has *such* a sweet racket. First, they used the Roach Motel model - data checks in, but it doesn't check out - to trap you and all your friends in a mutual hostage-taking situation, where you can't leave because they're there, and they can't leave because you're there.
All those address books they imported, the data they gathered from publishers' websites through the Like buttons (which gather data whether or not you click them), and the data they bought or snaffled up through free mobile Software Development Kits are now permanently siloed inside FB.
FB is a walled garden: when you leave, you leave behind your friends and communities - you can't switch to a Diaspora instance or even Twitter and keep exchanging messages with the people still on FB.
To their credit, millennials hated this shit, especially once their parents started joining FB and friending them. Those smart kids all bailed for Instagram.
So Facebook bought Instagram, explicitly to ensure that wherever you went, you'd still be in the Zuckersphere.
Facebook's surveillance data isn't that valuable, so it has to gather a *lot* of it. Most of its ad-tech advantage is just fraud: lying to advertisers about who saw its ads, lying to publishers about which kinds of content generate the most revenue.
The data advantage itself is very short-lived; for example, location data is highly prized by advertisers who want to show you an ad for shoes while you're outside a shoe-store. This value is annihilated as soon as you move somewhere else.
Data isn't the new oil, it's the new oily rag: a low-grade waste-product that is only valuable when it is piled up in such vast quantities that it poses an existential, civilization-ending danger.
Facebook's insistence on warehousing all the world's oily rags means things are on fire all the fucking time. They realized this a long time ago and worked out an unbeatable strategy for making sarsaparilla out of SARS: Facebook declared itself to be the world's firefighter.
"Hey guys! Have you noticed that the world is full of arsonists who want to set our oily rags on fire?! Have no fear! Collecting and warehousing oily rags made us so big and powerful that we - and only we - can stop them!"
The corollary of which is usually (but not always) unspoken: "If you take away our power - make us smaller, force us to release our hostages - then we won't be able to stop the arsonists any more."
Cambridge Analytica didn't *abuse* Facebook, they *used* Facebook - used the services that FB had set up and marketed to political dirty tricksters to disseminate disinformation. That was the system working as intended.
FB used the we-fight-arson wheeze to come out of the Cambridge Analytica scandal stronger and more powerful than ever: they shut down the Application Programming Interfaces that potential future Facebook competitors used to help people escape its walled garden, claiming it was an act of firefighting.
Under the same guise, they've threatened legal action against NYU's Ad Observer project, which teams up with FB users to scrape the ads they're served and verify whether FB is living up to its own promises to block paid political misinformation.
Get that? A group of activist-academics and a collective of FB users teamed up to hold FB to account on its own firefighting promises, and FB is threatening to sue them into a smouldering crater... in the name of protecting users.
Here's a thing: Facebook lost control of 533 million user profiles - "phone numbers, full names, location, email address, and biographical information." They went up for sale on hacker forums months ago, but now they're circulating for free.
The data was scraped by unscrupulous actors and is now being disseminated and used to commit crimes against half a billion FB users - the "dumb fucks" Zuckerberg derided because they "trust me."
I bet you a testicle* that Facebook's response to this will be to decry the efforts in the UK, EU and USA to force Facebook to interoperate with other online services, including co-ops and nonprofits. (*not one of mine)
They will use the fact that they lost control of user data and kicked off years - decades! - of fraud against a group of people outnumbering the combined populations of the US, Mexico and Canada as evidence that they should continue to be entrusted with that data.
They will say that any rule that forces them to open their data up to third parties will make their job transcendentally hard; only they can extinguish the oily rag wildfires.
In other words: creating a problem makes you uniquely qualified to solve it.
This is radioactively self-serving bullshit.
Not because privacy isn't important. Privacy is very, very important. It's far too important to leave up to the whims of corporate executives, especially *Facebook* executives. That's some next-level fox/hen-house stupidity.
The right way to establish the boundaries for data-handling isn't to give Facebook unlimited power to exclude competitors and hold its users hostage.
It's to encourage interoperability and block anticompetitive mergers, so there are lots of places FB users can go without giving up their social connections.
The way to prevent bad actors from interoperating with Facebook in ways that harm users is a federal privacy law with a private right of action, so that users can sue companies that violate their privacy.
All forms of interop are potential sources of liberation from Facebook's historically unprecedented hostage-taking, including scraping and other forms of Competitive Compatibility.
As my EFF colleague Bennett Cyphers and I detailed in our paper this February, we *can* have privacy without monopoly.