Zuckerberg to BBC in 2009: We won’t sell personal information — Video, poll
Facebook CEO Mark Zuckerberg has admitted that his company "made mistakes" that let other firms exploit millions of users' data. But a 2009 video has resurfaced in which the Silicon Valley billionaire tells the BBC that users, not the social media giant, own their personal data, and that such data would never be sold.
"This is their information. They own it"
"And you won’t sell it?"
"No! Of course not."
Facebook CEO, Mark Zuckerberg, talking to the BBC in 2009. pic.twitter.com/mVrhp0TpIS— BBC Business (@BBCBusiness) March 20, 2018
At the time, he declared that the “person who’s putting the content on Facebook always owns” the information. And he promised that it wouldn’t get sold or be shared with anyone other than whomever the user wanted.
At the time, Facebook only had 170 million users. The site now boasts over two billion. And the terms have changed — considerably.
Here’s Zuckerberg’s full statement, as posted by the BBC:
“I want to share an update on the Cambridge Analytica situation—including the steps we’ve already taken and our next steps to address this important issue.
“We have a responsibility to protect your data, and if we can’t then we don’t deserve to serve you. I’ve been working to understand exactly what happened and how to make sure this doesn’t happen again. The good news is that the most important actions to prevent this from happening again today we have already taken years ago. But we also made mistakes, there’s more to do, and we need to step up and do it.
“Here’s a timeline of the events:
“In 2007, we launched the Facebook Platform with the vision that more apps should be social. Your calendar should be able to show your friends’ birthdays, your maps should show where your friends live, and your address book should show their pictures. To do this, we enabled people to log into apps and share who their friends were and some information about them.
“In 2013, a Cambridge University researcher named Aleksandr Kogan created a personality quiz app. It was installed by around 300,000 people who shared their data as well as some of their friends’ data. Given the way our platform worked at the time this meant Kogan was able to access tens of millions of their friends’ data.
“In 2014, to prevent abusive apps, we announced that we were changing the entire platform to dramatically limit the data apps could access. Most importantly, apps like Kogan’s could no longer ask for data about a person’s friends unless their friends had also authorized the app. We also required developers to get approval from us before they could request any sensitive data from people. These actions would prevent any app like Kogan’s from being able to access so much data today.
“In 2015, we learned from journalists at The Guardian that Kogan had shared data from his app with Cambridge Analytica. It is against our policies for developers to share data without people’s consent, so we immediately banned Kogan’s app from our platform, and demanded that Kogan and Cambridge Analytica formally certify that they had deleted all improperly acquired data. They provided these certifications.
“Last week, we learned from The Guardian, The New York Times and Channel 4 that Cambridge Analytica may not have deleted the data as they had certified. We immediately banned them from using any of our services. Cambridge Analytica claims they have already deleted the data and has agreed to a forensic audit by a firm we hired to confirm this. We’re also working with regulators as they investigate what happened.
“This was a breach of trust between Kogan, Cambridge Analytica and Facebook. But it was also a breach of trust between Facebook and the people who share their data with us and expect us to protect it. We need to fix that.
“In this case, we already took the most important steps a few years ago in 2014 to prevent bad actors from accessing people’s information in this way. But there’s more we need to do and I’ll outline those steps here:
“First, we will investigate all apps that had access to large amounts of information before we changed our platform to dramatically reduce data access in 2014, and we will conduct a full audit of any app with suspicious activity. We will ban any developer from our platform that does not agree to a thorough audit. And if we find developers that misused personally identifiable information, we will ban them and tell everyone affected by those apps. That includes people whose data Kogan misused here as well.
“Second, we will restrict developers’ data access even further to prevent other kinds of abuse. For example, we will remove developers’ access to your data if you haven’t used their app in 3 months. We will reduce the data you give an app when you sign in—to only your name, profile photo, and email address. We’ll require developers to not only get approval but also sign a contract in order to ask anyone for access to their posts or other private data. And we’ll have more changes to share in the next few days.
“Third, we want to make sure you understand which apps you’ve allowed to access your data. In the next month, we will show everyone a tool at the top of your News Feed with the apps you’ve used and an easy way to revoke those apps’ permissions to your data. We already have a tool to do this in your privacy settings, and now we will put this tool at the top of your News Feed to make sure everyone sees it.
“Beyond the steps we had already taken in 2014, I believe these are the next steps we must take to continue to secure our platform.
“I started Facebook, and at the end of the day I’m responsible for what happens on our platform. I’m serious about doing what it takes to protect our community. While this specific issue involving Cambridge Analytica should no longer happen with new apps today, that doesn’t change what happened in the past. We will learn from this experience to secure our platform further and make our community safer for everyone going forward.
“I want to thank all of you who continue to believe in our mission and work to build this community together. I know it takes longer to fix all these issues than we’d like, but I promise you we’ll work through this and build a better service over the long term.”
The observant reader may notice a significant gap in the timeline Zuckerberg presented, namely the period between 2007 and 2013. It was during that time that Barack Obama won not one but two terms in the White House.
According to ABC:
President Barack Obama’s 2012 re-election campaign mined supporters’ personal data from Facebook to benefit its voter turnout program. But former campaign officials said Wednesday they accessed and used the information in vastly different ways than Cambridge Analytica, the firm connected to President Donald Trump’s 2016 campaign accused of improperly lifting data on 50 million Facebook users.
Former Obama advisers said they collected the data with their own app, complied with the social media platform’s terms of service and received permission from supporters. An estimated 1 million Obama supporters gave the campaign access to their Facebook data.
In most cases, Obama supporters who signed on to the campaign’s mailing list were asked to authorize the campaign’s Facebook app, allowing it to access certain aspects of their profile, including their posts, likes, photos, demographics and similar data from their Facebook friends. The Obama data was used in voter turnout efforts, with a focus on young voters in key battleground states, and former campaign officials said the data was kept secure and not sold to or acquired from third parties.
The BBC said that, to address the problems, Zuckerberg promised to:
- investigate all Facebook apps that had access to large amounts of information before the platform was changed “to dramatically reduce data access” in 2014
- conduct a “full forensic audit” of any app with suspicious activity
- ban any developer that did not agree to a thorough audit
- ban developers that had misused personally identifiable information, and “tell everyone affected by those apps”
In the future, he said the company would:
- restrict developers’ data access “even further” to prevent other kinds of abuse
- remove developers’ access to a user’s data if the user hadn’t activated the developer’s app for three months
- reduce the data that users give an app when they sign in to just name, profile photo, and email address
- require developers to obtain approval and also sign a contract in order to ask anyone for access to their posts or other private data
“While this specific issue involving Cambridge Analytica should no longer happen with new apps today, that doesn’t change what happened in the past,” Zuckerberg said. “We will learn from this experience to secure our platform further and make our community safer for everyone going forward.”
The BBC’s Dave Lee apparently wasn’t impressed with what he heard. In his analysis of the statement, he noted:
I read one thing loud and clear from Mr Zuckerberg’s initial statement: Facebook is not prepared to take the blame for what has happened.
Contrition has never been Mr Zuckerberg’s strong point, and the statement, days in the making, was no different.
No apology to users, investors or staff over how this incident was allowed to happen by the data policies in place at the time.
No explanation as to why, after learning its data was being abused like this in 2014, it opted to give the companies a telling off instead of banning them outright.
No reasoning as to why Facebook failed to inform users their data may have been affected. Technically, it still hasn’t.
Mr Zuckerberg’s words were not an explanation, but a legal and political defence. This company knows it is heading into battle on multiple fronts.
Meanwhile, the US Federal Trade Commission has reportedly opened an investigation, and the president of the European Parliament has said that body wants to conduct one of its own. The BBC also noted that a UK parliamentary committee has called on Zuckerberg to give evidence about the company's use of personal data.
And, Arutz Sheva said, the Israeli Privacy Protection Authority announced that it “informed Facebook today that it had opened an investigation into its activities, following the publications on the transfers of personal data from Facebook to Cambridge Analytica, and the possibility of other infringements of the privacy law regarding Israelis.”
Here’s a YouTube video of Zuckerberg’s 2009 statement:
And we want to hear from you: Do you believe Zuckerberg and do you trust him and his company to handle your data properly? Let us know in the poll below and we’ll announce the results on our Facebook page this weekend.
Related:
- Report: Hillary wants to know if Cambridge Analytica worked with Russians to derail campaign
- Hypocrites: Hollywood leftists freak out, call for boycotts of Facebook — Said nothing when Obama used site to win election
- Analysis: Facebook’s algorithm changes help CNN, liberal sites, hurts conservative sites
- Congress needs to rein in out-of-control social media giants while there’s still time
- Facebook under fire after asking if pedophiles should be allowed to request ‘sexual pictures’ from kids