For much of the past week, Facebook has been embroiled in a controversy involving Cambridge Analytica, a political consulting firm with ties to Donald J. Trump’s 2016 presidential campaign, and how the firm improperly obtained and exploited personal data from 50 million Facebook users.
On Wednesday, following widespread questions about his whereabouts, Mark Zuckerberg, the chief executive of Facebook, spoke with two New York Times reporters, Sheera Frenkel and Kevin Roose, about the controversy and the steps he was taking to make the social network less prone to abuse.
Below is a transcript of the conversation, edited for length and clarity.
Mark Zuckerberg: Privacy problems have always been incredibly important to people. One of our largest responsibilities is to protect data. If you think about what our services are, at their most basic level, you put some content into a service, whether it’s a photo or a video or a text message — whether it’s Facebook or WhatsApp or Instagram — and you’re trusting that that content is going to be shared with the people you want to share it with. Whenever there’s an issue where someone’s data gets passed to someone who the rules of the system shouldn’t have allowed it to, that’s rightfully a big issue and deserves to be a big uproar.
Frenkel: It took quite a few days for your response to come out. Is that because you were weighing these three action points that you noted in your post?
Zuckerberg: The first thing is, I really wanted to make sure we had a full and accurate understanding of everything that happened. I know that there was a lot of pressure to speak sooner, but my assessment was that it was more important that what we said was fully accurate.
The second thing is, the most important thing is that we fix this system so that problems like this don’t happen again. It’s not like there aren’t going to be other kinds of things we’ll also have to fix. But when there’s a certain problem, we have a responsibility to at least make sure we resolve that problem.
So the actions we’re going to take here involve, first, dramatically reducing the amount of data that developers have access to, so that apps and developers can’t do what Kogan did here. The most important actions there we actually took three or four years ago, in 2014. But when we examined the systems this week, there were certainly other things we felt we should lock down, too. So we’re going ahead and doing that.
Even if you solve the problem going forward, there’s still this issue: Are there other Cambridge Analyticas out there, or other Kogans? When the platform worked a certain way in the past, were there apps that could have gotten access to more information, and potentially sold it without us knowing, or done something that violated people’s trust? We also need to make sure we get that under control. That’s why we spent a lot of time figuring out, O.K., here’s what it’s going to take to do a full investigation of every app that got access to a large amount of information before we changed the platform policies to dramatically reduce the data access that developers had. For any app that we uncover that has any suspicious activity, we’re going to do a full forensic audit, and make sure we have the capacity to do that, to make sure that other developers aren’t doing what Kogan did here.
The third thing is, it’s really important that people know what apps they’ve authorized. A lot of people have been on Facebook now for five or 10 years, and sometimes you signed into an app a long time ago and you may have forgotten about that. So one of the steps we’re taking is making it so apps can no longer access data after you haven’t used them for three months.
But it’s also just really important to put in front of people a tool of, here are all the apps you’ve connected to and authorized, here’s an easy way to deauthorize them, to revoke their permission to get access to your activity.
Kevin Roose: Is Facebook planning to notify the 50 million users whose data was shared with Cambridge Analytica?
Zuckerberg: Yes. We’re going to tell anyone whose data may have been shared.
Now, there’s a question of whether we have the exact record in our systems today of who your friends were on that day when there was access three and a half or four years ago, so we’re going to be conservative on that and try to tell anyone whose data may have been affected, even if we don’t know for certain that they were. It’s likely that we’ll build a tool like we did with the Russian misinformation investigation, that anyone can go to it and see if their data was affected by this.
Roose: Do you have a preliminary estimate of how many apps you’ll be investigating?
Zuckerberg: It will be in the thousands.
Frenkel: Have those app developers been notified yet that you’ll be investigating?
Zuckerberg: Just when I posted. And we’ll be reaching out in the near term.
Frenkel: Are you going to be hiring people to help conduct those investigations?
Zuckerberg: Yes, I would imagine we’re going to have to grow the team to work on this.
Roose: You mentioned a contract that developers will have to sign in order to ask anyone for access to broader profile information. What will be the terms of that contract, and what will be the penalties for violating it?
Zuckerberg: So, the important thing there is that it’s a high-touch process. The specific point we were trying to make is that it’s not going to be some terms of service that a developer can sign up for just on their computer when developing something. I guess technically, that would be a contract as well.
The point of what we’re trying to do here is to create a situation where we have a real person-to-person relationship with any developer who is asking for the most sensitive data. That doesn’t mean that — if you’re a developer and you want to put Facebook Login on your website, you can do that. If you want to get access to ask people for their religious affiliation, or their sexual orientation, for data that could be very sensitive, we want to make sure we have a clear relationship with those people.
Frenkel: We understood that Cambridge Analytica had reached out to Facebook and asked that its ban on the platform be reconsidered. Are you giving any thought to allowing Cambridge Analytica back in?
Zuckerberg: The first thing we need to do is conduct this full forensic audit of the firm, to confirm that they don’t have any data from people in our community and that they’ve deleted anything, including derivative data, that they might have. We’re working with the regulator in the U.K. on this, so our forensic audit was actually paused in the near term to make way for the ICO there to do their own government investigation. We’re certainly not going to consider letting them back onto the platform until we have full confirmation that there’s no wrongdoing here.
Roose: There were reports as far back as 2015 that Cambridge Analytica had access to this data set. Why didn’t you suspend them then?
Zuckerberg: So, we actually heard, I think it was at the end of 2015 — some journalists from The Guardian reached out to us and told us what you just said. And it was not just about Cambridge Analytica, it was about this developer, Aleksandr Kogan, who had shared data with them.
We took action immediately at that point. We banned Kogan’s app from the platform, and we demanded that Kogan and Cambridge Analytica and a couple of other parties that Kogan had shared the data with legally certify that they didn’t have the data, and weren’t using it in any of their operations. They gave us that formal certification. At the time, they told us they had never gotten access to raw Facebook data, so we made that decision.
Frenkel: In retrospect, do you wish you had demanded proof that the data had been deleted?
Zuckerberg: Yes. They gave us a formal and legal certification, and it seems at this point that that was false.
Again, we haven’t done our full investigation and audit yet so I can’t say definitively that they actually have data. I’ve just read all the same reports that you have, including in The New York Times, that says that journalists have seen evidence that they have the data, which is a strong enough signal for us to go on, and take action here.
That’s the basic driver behind us now needing to go and do a full investigation into any app that had access to a large amount of data before we locked down the platform policies in 2014. Just having folks tell us that they were using the data correctly, I think, does not satisfy our responsibility to our community to protect their data.
Frenkel: Are you actively looking at some of these dark web data brokers that have been in news reports recently, that say that other independent researchers are potentially trading in this data?
Zuckerberg: Yes, we’re investigating that too.
Roose: Are you worried about the #DeleteFacebook campaign that’s been going around? Have you seen meaningful numbers of people deleting their accounts, and are you worried that will be a trend?
Zuckerberg: I don’t think we’ve seen a meaningful number of people act on that, but, you know, it’s not good. I think it’s a clear signal that this is a major trust issue for people, and I understand that. And whether people delete their app over it or just don’t feel good about using Facebook, that’s a big issue that I think we have a responsibility to rectify.
Frenkel: We’re now heading into the 2018 midterms. Could you speak about what Facebook is going to do ahead of the 2018 midterms to make people feel more confident that the platform won’t be used this way again?
Zuckerberg: This is an incredibly important point. There’s no doubt that in 2016, there were a number of issues, including foreign interference and false news, that we did not have as much of a handle on as we feel a responsibility to have for our community.
Now, the good news here is that these problems aren’t necessarily rocket science. They’re hard, but they’re things that if you invest and work on making it harder for adversaries to do what they’re trying to do, you can really reduce the amount of false news, make it harder for foreign governments to interfere.
One of the things that gives me confidence is that we’ve seen a number of elections at this point where this has gone a lot better. In the months after the 2016 election, there was the French election. The new A.I. tools we built after the 2016 elections found, I think, more than 30,000 fake accounts that we believe were linked to Russian sources who were trying to use the same kind of tactics they did in the U.S. in the 2016 election. We were able to disable them and prevent that from happening on a large scale in France.
Last year, in 2017, with the special election in Alabama, we deployed some new A.I. tools to identify fake accounts and false news, and we found a significant number of Macedonian accounts that were trying to spread false news, and were able to eliminate those. And that, actually, is something I haven’t talked about publicly before, so you’re the first people I’m telling about that.
I feel a lot better about the systems now. At the same time, I think Russia and other governments are going to get more sophisticated in what they do, too. So we need to make sure that we up our game. This is a massive focus for us to make sure we’re dialed in for not only the 2018 elections in the U.S., but the Indian elections, the Brazilian elections, and a number of other elections that are going on this year that are really important.
Frenkel: The Times reported that [Facebook chief security officer Alex] Stamos will be leaving toward the end of this year. Is there a broader plan for how Facebook is going to structure security on its platform ahead of all these important elections?
Zuckerberg: Sure. One of the important things we’ve done is, we want to unify all of our security efforts. And you reported on a reorg around Alex Stamos, and I’ll say something about him in a second. He’s been a very valuable contributor here and was a really central figure in helping us identify the foreign interference with Russia. And I think he has done very good work, and I’m hopeful he’ll be engaged for a while here on that.
One of the big things we needed to do is coordinate our efforts a lot better across the whole company. It’s not all A.I., right? There’s certainly a lot that A.I. can do, we can train classifiers to identify content, but most of what we do is identify things that people should look at. So we’re going to double the number of people working on security this year. We’ll have more than 20,000 people working on security and community operations by the end of the year, I think we have about 15,000 now. So it’s really the technical systems we have, working with the people in our operations functions, that make the biggest difference.
The last thing I’d add on this. Take things like false news. You know, a lot of it is really spam, if you think about it. It’s the same people who might have been sending you Viagra emails in the ’90s, now they’re trying to come up with sensational content and push it into Facebook and other apps in order to get you to click on it and see ads. There are some pretty basic policy decisions we’ve made, like O.K., if you’re anywhere close to being a fake news site, you can’t put Facebook ads on your site, right? So then suddenly, it becomes harder for them to make money. If you make it hard enough for them to make money, they just kind of go and do something else.
Roose: Is the basic economic model of Facebook, in which users provide data that Facebook uses to help advertisers and developers to better target potential customers and users — do you feel like that works, given what we now know about the risks?
Zuckerberg: Yeah, so this is a really important question. The thing about the ad model that is really important that aligns with our mission is that — our mission is to build a community for everyone in the world and to bring the world closer together. And a really important part of that is making a service that people can afford. A lot of people, once you get past the first billion, can’t afford to pay a lot. Therefore, having it be free and have a business model that is ad-supported ends up being really important and aligned.
Now, over time, might there be ways for people who can afford it to pay a different way? That’s certainly something we’ve thought about over time. But I don’t think the ad model is going to go away, because I think fundamentally, it’s important to have a service like this that everyone in the world can use, and the only way to do that is to have it be very cheap or free.
Roose: Adam Mosseri, Facebook’s head of News Feed, recently said he had lost some sleep over Facebook’s role in the violence in Myanmar. You’ve said you’re “outraged” about what happened with Cambridge Analytica, but when you think about the many things that are happening with Facebook all over the world, are you losing any sleep? Do you feel any guilt about the role Facebook is playing in the world?
Zuckerberg: That’s a good question. I think, you know, we’re doing something here which is unprecedented, in terms of building a community for people all over the world to be able to share what matters to them, and connect across boundaries. I think what we’re seeing is, there are new challenges that I don’t think anyone had anticipated before.
If you had asked me, when I got started with Facebook, if one of the central things I’d need to work on now is preventing governments from interfering in each other’s elections, there’s no way I thought that’s what I’d be doing, if we talked in 2004 in my dorm room.
I don’t know that it’s possible to know every issue that you’re going to face down the road. But we have a real responsibility to take all these issues seriously as they come up, and work with experts and people around the world to make sure we solve them, and do a good job for our community.
It’s certainly true that, over the course of Facebook, I’ve made all kinds of different mistakes, whether that’s technical mistakes or business mistakes or hiring mistakes. We’ve launched product after product that didn’t work. I spend most of my time looking forward, trying to figure out how to solve the issues that people are having today, because I think that’s what people in our community would want.