Mark Zuckerberg’s idealistic vision for Facebook has come back to haunt the company.
Facebook hasn’t been selling your data, folks. Instead, it’s been giving it away free — and for a long time, that was part of the plan.
The New York Times published another long and damning investigation into the social giant’s data practices late Tuesday. That story, which was based on hundreds of documents reviewed by the Times, focused on Facebook’s extensive data partnerships with some of the world’s largest tech companies, like Amazon, Apple, Netflix and Microsoft.
“Facebook allowed Microsoft’s Bing search engine to see the names of virtually all Facebook users’ friends without consent, the records show, and gave Netflix and Spotify the ability to read Facebook users’ private messages,” the New York Times reported. “The social network permitted Amazon to obtain users’ names and contact information through their friends, and it let Yahoo view streams of friends’ posts as recently as this summer.”
This story followed similar stories over the summer from the Times and the Wall Street Journal that found Facebook has long given partners special access to user data as a way to expand Facebook’s reach into other services online. New internal emails published earlier this month also confirmed these kinds of deals were in place.
All in all, it sounds like virtually everyone on the internet had access to some kind of Facebook data — which likely means some of your data — over the past eight years. And that’s probably true, in part because this is exactly how Facebook has intentionally operated from the beginning.
These deals, some more concerning than others, reflect the idealistic vision that CEO Mark Zuckerberg had for Facebook back in 2010 — a vision in which your social profile wasn’t limited to Facebook’s app, but followed you around the web to personalize all the other experiences you have, too.
Here’s how he described it in an interview with Recode earlier this year shortly after revelations that Facebook’s data policies allowed third parties, like the political research firm Cambridge Analytica, to collect vast amounts of user data without permission.
The vision, if you remember, is to help make apps social. So, the examples we had were, you know, your calendar should have your friend’s birthday. Your address book should have your friend’s picture. In order to do that, you basically need to make it so a person can log into an app and not just port their own data over, but also be able to bring some data from their friends as well. That was the vision, and a bunch of good stuff got created. There were a bunch of games that people liked. Music experiences, things like Spotify. Travel, you know, things like Airbnb, they were using it. But there was also a lot of scammy stuff.
Facebook tried to explain away many of these arrangements in a blog post late Tuesday night, but there are a number of problems with these deals, and “scammy stuff” is just one of them.
One problem is that Facebook was sloppy and careless with your data. The New York Times story highlights one type of Facebook partnership around a feature called “Instant Personalization,” which took public Facebook data and let partners use it to create Facebook-like experiences on other websites.
Facebook claims it shut Instant Personalization down in 2014, but it really didn’t. Facebook’s partners stopped integrating this data with their services, but Facebook never shut off the Instant Personalization APIs, the software that allows companies to actually pull the data from Facebook’s servers. That means partners could still technically collect this public data even after the program was “shut down.”
Facebook argues that the data was public anyway and that it has “no evidence that data was used or misused after the program was shut down.” But failing to cut access to that data was a significant and inexcusable oversight for a company that claims data privacy is its top priority.
Another major problem, perhaps the most troubling, is that most Facebook users had no idea these partnerships existed. And given how many other companies have had access to it, Facebook itself may not know the full scope of how your personal data has been used.
Facebook said these deals were legitimate because they involved trusted partners. Many of them “did not require the social network to secure users’ consent before sharing data because Facebook considered the partners extensions of itself — service providers that allowed users to interact with their Facebook friends,” the New York Times reported. “The partners were prohibited from using the personal information for other purposes.”
That’s a lot of trust to put into other companies, some of which are direct competitors. Spotify had access to users’ private messages, for example, which sounds more ominous than the likely reality. If you want to share what you’re listening to with your Facebook friends from Spotify, the company needs access to your private messages to enable that. Facebook didn’t simply open up your messaging inbox for Spotify’s reading pleasure.
But can Facebook say with total confidence that all partners who may have had this kind of access — or access to your friends list or access to your email and phone number — never abused that data? The answer appears to be no. Facebook told the New York Times it managed these partners closely, but audited them rarely.
If Cambridge Analytica taught us anything, it’s that Facebook’s cavalier attitude toward user data in the company’s early days was naive and dangerous. And it’s still paying for that attitude. Zuckerberg’s vision for an internet where Facebook was ubiquitous had serious privacy trade-offs that the company didn’t understand at the time — and still doesn’t seem to fully understand.
During our interview in March, Zuckerberg claimed that he gets it now.
“You know, frankly, I just got that wrong,” he said of his plan for all apps to have a Facebook social layer. “I was maybe too idealistic on the side of data portability, that it would create more good experiences. And it created some, but I think what the clear feedback was from our community was that people value privacy a lot more. And they would rather have their data locked down and be sure that nothing bad will ever happen to it than be able to easily take it and have social experiences in other places.”
It’s one thing to say you understand, and it’s another to show you understand. Zuckerberg’s vision for what Facebook could have become is clearly not what people want. And it’s time for Facebook to do something about it.