by Jo Ling Kent and
Michael Cappetta / Sep.03.2018 / 6:38 PM ET
"We know we have to be ready for anything that happens, and that's why we've been building a physical war room," said Facebook's head of civic engagement.
Inside Facebook’s Menlo Park, California, headquarters, it is
all hands on deck to fight one of the toughest battles the social media giant
has ever faced. The mission is to protect Facebook's and Instagram’s billions
of users from more foreign interference in their news feed during the upcoming
midterm elections. During the 2016 presidential election, over 126 million Americans were exposed to incendiary
posts from Russia-linked accounts, pages, and ads.
Samidh Chakrabarti is the product manager of Civic Engagement at
Facebook. He is leading company-wide efforts to secure Facebook’s platform for
elections around the world, including the upcoming U.S. midterm elections. It’s
a directive that comes from CEO and founder Mark Zuckerberg, who has thus far
declined to say how much foreign interference is currently present on Facebook.
With less than two months to go, Chakrabarti said Facebook is
“much more effective than we used to be” and the entire company is “laser
focused on getting it right.” He also revealed new details on Facebook’s plans
to build a physical "war room" to coordinate a real-time response to
nefarious activity during the midterms.
Samidh Chakrabarti,
product manager of Civic Engagement, center, during a meeting at Facebook
headquarters in Menlo Park, California, on Aug. 30, 2018. Elijah Nouvelage / for NBC News
NBC News Business Correspondent Jo Ling Kent interviewed
Chakrabarti exclusively for a story on NBC Nightly News with Lester Holt. Here
is a condensed transcript of the interview with Chakrabarti, edited for length
and clarity:
JO LING KENT: We're just two months away from the midterm
elections. Former [Facebook] chief security officer, Alex Stamos, has come
forward saying that it is actually too late to protect the U.S. midterms.
What's your response to that?
SAMIDH CHAKRABARTI: Well, we've been at this for quite a while, and we've been mobilizing a huge cross-company effort. Really, every single corner of this company has mobilized to make sure that we're laser-focused and that we get it right. Our efforts are focused on four main areas. The first is around combating
foreign interference, the second is around blocking and removing fake accounts
on our platform, the third is around slowing the spread of misinformation and
fake news on the platform, and then the last area is around bringing an
unprecedented level of transparency to political ads on our platform.
KENT: What do you say to an individual out there who is
concerned about what's happening in the lead-up to the midterm elections, but
they also go to Facebook every single day to consume and to share?
CHAKRABARTI: I'd say we're working around the clock to make sure
that our platform is secure during times of elections. And, you know, we've
been making a lot of progress. As an example, in a six-month period, we
actually detected, blocked, and removed over a billion fake accounts before they
could do anything like spread fake news or misinformation. So that's one
example of the rapid progress that we've been making in this area.
KENT: What are some of the tactics and strategies that you're
using to make sure people can be safe on Facebook and get information that's
true and factual?
CHAKRABARTI: One example from how we're combating foreign
interference is that we basically have some of the best intelligence analysts
from around the world working to discover coordinated influence operations on
our platform. And we’ve made many announcements in the last several weeks about
how we've been removing tens of thousands of fake pages, fake profiles, groups,
and so forth that have been trying to engage in coordinated operations. So
we're just getting better and faster at detecting bad actors on the platform
and removing them.
Samidh
Chakrabarti, left, product manager of Civic Engagement, sits in the "war
room" during a meeting at Facebook headquarters in Menlo Park, California,
on Aug. 30, 2018.
Chakrabarti revealed new details on Facebook's plans to
coordinate a real-time response to nefarious activity during the midterms. Elijah Nouvelage / for NBC News
KENT: How are the "bad actors" now in 2018 different
than they were in 2015, '16, and ’17?
CHAKRABARTI: We're looking all around the world to make sure
that this same kind of playbook that bad actors used in 2016 is not used again.
We are staying ever vigilant, looking everywhere for this kind of activity as a
means of stopping it. And I think a lot of the work that we're doing, for
example, around political ad transparency, helps prevent this kind of behavior.
With political ad transparency, we're making it so that anybody who sees a
political ad on Facebook can see who is behind that ad and who paid for that
ad. I think that helps create a much more trustworthy environment for political
discourse on our platforms. …It is an arms race. And we're always working to
try to stay one step ahead. As an example, the bad actors out there have gotten
more sophisticated. They're better at hiding their location... So we also have
gotten better at finding when people are trying to hide their location.
KENT: President Donald Trump has come out criticizing big
technology companies, Facebook included. He said, "Facebook, they're
really treading on very, very troubled territory. And they have to be
careful." The president went on to say, "It's not fair to large
portions of the population." How do you interpret a statement like that
from the president of the United States criticizing Facebook?
CHAKRABARTI: The way that I think about it is I go back to the mission of Facebook. You know, we really do want to give everybody a
voice and bring the world closer together. And so what that means to us is that
we want to make sure our platform is a great place for people to express
themselves across the entire political spectrum. And we work to make sure that
our platforms can support that, because if they don't, then we're not actually
going to be able to realize our mission of bringing the world closer together.
We want all voices to be represented.
KENT: Is Facebook discriminating against conservatives? There's
a lot of concern on that front right now.
CHAKRABARTI: I think we just try to make sure that the platform is
a place that's agnostic of people's political views. And really there is a
thriving conversation taking place on Facebook. Any part of the political
spectrum that you look at, there are thriving conversations taking place on
Facebook.
KENT: How did you detect that activity coming out of Russia and
Iran? And how quickly were you able to bring it down?
CHAKRABARTI: I think one important thing to understand is that we're not doing this alone. We're just one small part of a much bigger puzzle. We've been working with governments around
the world, with security experts around the world, with civic society around
the world to share information about threats that we see. And we bring those
together and we put our best intelligence investigators on it to find that kind
of activity on our platform and take it down. So, it's only by working with
other people that we can solve these kinds of problems.
KENT: What is it like inside headquarters as you're detecting
this activity? You're deploying so much effort, yet forces continue to try to
violate the community standards.
CHAKRABARTI: I think it's just the reality of work in this
space. This is an area where there are determined and sophisticated
adversaries, who are always going to try to circumvent measures that we put in
place. And that's precisely why we've made such massive investment in this
space. You know, we've really grown our safety and security team from 10,000
people a year ago to 20,000 people today. And so that is the kind of commitment
that we're showing to this. And it's really to the point that we've even said
before, that it's going to impact our profitability, because we take our
responsibility so seriously that we're willing to make that level of
investment.
KENT: When you add 10,000 people to the security team, what
exactly are their roles? What are they doing?
CHAKRABARTI: They're doing a lot of things. This is a huge
cross-company effort that requires people of many different disciplines coming
together to solve these kinds of problems. And so we have people who are
trained intelligence investigators. We also have people who are some of the
best computational data scientists in the world, who can find needles in a
haystack using advanced artificial intelligence. And so those are the kinds of
roles that we all put together in order to do this, because, really, that's
what it takes.
KENT: How much do you work with your counterparts at Twitter and
Google and other platforms to coordinate a response to fight bad actors?
CHAKRABARTI: As an example, with the takedowns that we did just
a few weeks ago, we've been working with our industry partners on this,
exchanging information. And that has really yielded a lot of benefits. The
benefit that we see is we are able to get more information about particular bad
actors, and then we're able to take them off the platform. And we can reciprocally provide that kind of help to others in the industry.
"I think we are in a much better place than we were in
2016. But it is an arms race," said Facebook's head of civic engagement.
KENT: Facebook is building a war room, a "situation room,”
a rapid response team of sorts in the final weeks leading up to the election.
What is that going to look like? Why are you doing that?
CHAKRABARTI: We have many measures that we've put in place to
try to prevent problems: the political ad transparency, blocking fake accounts,
combating foreign interference, and preventing the spread of misinformation.
But we know we have to be ready for anything that happens. And so that's why we've
been building this war room, a physical war room [with] people across the
company, of all different disciplines, who are there. So, as we discover
problems that may come up in the hours leading up to the election, we can take
quick and decisive action.
KENT: Is Facebook a safer platform now in 2018 compared to the
lead-up to the 2016 election?
CHAKRABARTI: I believe we've been making very rapid progress in
all of these areas: Combating foreign interference, making sure that we can
block and remove fake accounts, stopping the spread of misinformation and fake
news on the platform, and then also bringing more transparency to political
ads. I think we are in a much better place than we were in 2016. But it is an
arms race. And so that's why we're remaining ever vigilant, laser focused to
make sure that we can stay ahead of new problems that emerge. This is going to
be a never-ending process and that's exactly why we're investing so much in
both people and technology -- to be as prepared as possible for the midterms.
KENT: There has been a very vocal critic out there [former Google design ethicist and co-founder of the
Center for Humane Technology, Tristan Harris] who said that
the 2016 election left Facebook a living, breathing crime scene. What's your
response to that for 2018? Will it not be the case?
CHAKRABARTI: In 2016, we saw new kinds of threats that we hadn't
seen before. And that's why we've been mobilizing this huge effort across the
company. Every single corner of this company is just remaining laser focused
and taking our role really seriously. To make sure that this can be a safe
place for political discourse.
KENT: Do you sense that the efforts now that Facebook is
deploying to fight disinformation and fake news are working?
CHAKRABARTI: I think we're much more effective than we used to
be, in this regard.