
Cyber Space is the newest podcast for members of CAFE Insider. Every other Friday, host John Carlin, the former head of the Justice Department’s National Security Division, will explore issues at the intersection of technology, policy, and law with leaders who’ve made an impact in the world of cybersecurity. 

In this episode, Monika Bickert, Facebook’s Head of Global Policy Management, joins Carlin for a deep dive on the dangers of cyberterrorism, the nuances of hate speech, the recent controversies surrounding political speech and misinformation on Facebook, and the company’s initiatives leading up to the November election. Bickert, a former federal prosecutor and cybersecurity expert, opens up about the complexities of Facebook’s community standards and how she, Mark Zuckerberg, and Sheryl Sandberg have navigated removing violent and misleading content.

REFERENCES & SUPPLEMENTAL MATERIALS

THE INTERVIEW: 

  • Bill Schulz, “Dove Hunting – Baited Fields Can Lead to Arrest,” AP News, 9/26/1987
  • Nellie Bowles & Michael H. Keller, “Video Games and Online Chats Are ‘Hunting Grounds’ for Sexual Predators,” The New York Times, 12/7/2019
  • Parmy Olson, “Murdoch’s MySpace Warns Users With Ad Campaign,” Forbes, 4/11/2006

CYBERTERRORISM

  • Zak Doffman, “Facebook Takes Action As Terrorists Found Using Platform To Recruit And Campaign,” Forbes, 7/25/2019
  • Davey Alba, Catie Edmondson & Mike Isaac, “Facebook Expands Definition of Terrorist Organizations to Limit Extremism,” The New York Times, 9/17/2019
  • Brian Fishman, “Crossroads: Counter-terrorism and the Internet,” Texas National Security Review, 2/2/2019
  • Statista data on the number of monthly active users on Facebook worldwide from 2008 to 2020
  • Facebook Community Standards
  • Davey Alba, “Facebook Bans Network With ‘Boogaloo’ Ties,” The New York Times, 6/30/2020
  • Deepa Seetharaman, “Facebook Removes QAnon Groups as It Expands Anti-Violence Policy,” The Wall Street Journal, 8/19/2020
  • Kim Lyons, “Facebook deactivates almost 200 accounts linked to hate groups,” The Verge, 6/7/2020
  • Karissa Bell, “Facebook just made a major change to how it polices content,” Mashable, 4/24/2018
  • Facebook blocks appeal form 

HATE SPEECH & CENSORSHIP

  • Facebook Community Standards: Hate Speech
  • Shirin Ghaffary, “Civil rights leaders are still fed up with Facebook over hate speech,” Vox, 7/7/2020
  • Elizabeth Dwoskin & Craig Timberg, “Facebook bans extremist leaders including Louis Farrakhan, Alex Jones, Milo Yiannopoulos for being ‘dangerous’,” The Washington Post, 5/2/2019
  • Timothy B. Lee, “Facebook’s porn filter has trouble distinguishing real breasts from bronze ones,” The Washington Post, 9/25/2013
  • Lee Rowland, “Naked Statue Reveals One Thing: Facebook Censorship Needs Better Appeals Process,” ACLU, 9/25/2013
  • Luke Garratt, “Fondant farewell: Facebook bans cake decorator’s logo of topless mermaids – because of the figures’ nipples,” DailyMail.com, 3/19/2014

POLITICAL INFLUENCE

  • Facebook COO Sheryl Sandberg’s second update about their civil rights audit 
  • Mike Isaac, “Facebook’s Decisions Were ‘Setbacks for Civil Rights,’ Audit Finds,” The New York Times, 7/8/2020
  • Katie Paul & Munsif Vengattil, “Facebook removed seven million posts in second quarter for false coronavirus information,” Reuters, 8/11/2020
  • Cecilia Kang & Sheera Frenkel, “Facebook Removes Trump Campaign’s Misleading Coronavirus Video,” The New York Times, 8/5/2020
  • Salvador Rodriguez, “Facebook says it’s gotten a lot better at removing material about ISIS, al-Qaeda and similar groups,” CNBC, 11/8/2018
  • Kari Paul, “Facebook faces advertiser revolt over failure to address hate speech,” The Guardian, 6/22/2020

MISINFORMATION

  • Facebook’s 2020 Community Standards Enforcement Report
  • Mark Zuckerberg’s speech at Georgetown University, 10/17/2019
  • Cecilia Kang & Mike Isaac, “Defiant Zuckerberg Says Facebook Won’t Police Political Speech,” The New York Times, 10/17/2019
  • Sheera Frenkel, “Facebook to Remove Misinformation That Leads to Violence,” The New York Times, 7/18/2018
  • Rebecca Heilweil, “Twitter now labels misleading coronavirus tweets with a misleading label,” Vox, 5/11/2020
  • Kjetil Malkenes Hovland, “Facebook Backs Down on Censoring ‘Napalm Girl’ Photo,” The Wall Street Journal, 9/9/2016
  • Deepa Seetharaman, “Facebook Employees Pushed to Remove Trump’s Posts as Hate Speech,” The Wall Street Journal, 10/21/2016
  • Erin Kelly, “Republicans press social media giants on anti-conservative ‘bias’ that Dems call ‘nonsense’,” USA Today, 7/17/2018
  • Alex Hern & Julia Carrie Wong, “Facebook plans voter turnout push – but will not bar false claims from Trump,” The Guardian, 6/17/2020
  • Facebook announcing the launch of their voting information effort, 6/16/2020
  • Facebook’s COVID-19 Information Center

John Carlin:

From CAFE, this is Cyber Space. I’m your host, John Carlin. Every other Friday, I’ll be exploring issues at the intersection of tech, law and policy with guests who’ve made an impact in the world of cybersecurity. My guest this week is Monika Bickert. She’s the Vice President of Global Policy Management at Facebook. She decides what’s hate, what’s not, what’s terror and what can be posted. Prior to joining the company in 2012, Bickert spent her career in the Justice Department, serving as a federal prosecutor in D.C. and Chicago and serving abroad in Thailand. In her role at Facebook, she’s in charge of the policies that govern the types of content that can be shared on the platform. She also leads the company’s efforts on counter-terrorism. Facebook of course is always making headlines, so I’m thrilled to have Monika Bickert join me on this program. Welcome, Monika, it’s great to have you on. When you were at Harvard Law School, what did you envision yourself doing?

Monika Bickert:

I really wanted to be in criminal law. I wanted to be doing trials. I wasn’t sure which part I wanted. And so I actually was interested in being either a federal defender or a federal prosecutor and I applied to both and I ended up becoming a federal prosecutor straight out of my judicial clerkship.

John Carlin:

And did you envision then that one day you would be working for a giant tech corporation?

Monika Bickert:

No. It was the furthest thing from my mind.

John Carlin:

And when you say furthest, were you interested in cyber issues? What did those look like for someone graduating around the turn of the millennium?

Monika Bickert:

I was interested in safety. And it didn’t really matter to me what the specifics of it were but I liked the idea of being part of keeping people safe and I also really liked the idea of being in the courtroom doing that.

John Carlin:

And at that time safety meant physical safety. Yeah?

Monika Bickert:

That was definitely the focus. In fact, even when I became a prosecutor, when you think about the most common crimes that we were dealing with, a lot of them were either street offenses or even when they were financial offenses, they were much less about the cyber aspects of it. But of course now when you think about the work that federal prosecutors are doing, almost everything has some sort of nexus to cyber. It’s just how we live now. And it’s also how crimes are committed now. So the landscape’s changed.

John Carlin:

So when you first started out, you went to be… Was your first job as an assistant United States attorney, a federal prosecutor?

Monika Bickert:

Well, my first job was a judicial clerkship, which was one year, and then I went through the Justice Department honors program to the public integrity section. So I was a federal prosecutor but I was focused specifically on public corruption. And then I went to the Chicago US Attorney’s Office.

John Carlin:

As I recall, did you do any details? At that time, around then, I went through the honors program too, but I did a detail to a local US Attorney’s Office to try cases.

Monika Bickert:

I did. I went to the D.C. US Attorney’s Office and did the misdemeanor rotation there.

John Carlin:

That was home.

Monika Bickert:

Yes.

John Carlin:

Although I did domestic violence. What was your most significant misdemeanor case that you recall?

Monika Bickert:

Probably, there was a White House fence-jumping case that I did back then that became sort of complicated for a misdemeanor. And I thought, gosh, this is the most complicated a misdemeanor can get, but I was wrong, because when I was a prosecutor in Chicago, every once in a while you have to cover the misdemeanor offenses that come in from federal parklands and that sort of thing. And I ended up having to cover a case that involved the illegal baiting of doves, which I learned all about, and ended up having a misdemeanor trial, which many of my colleagues came to watch, where frozen dove bodies were presented to the court.

John Carlin:

Wait a second, baiting of doves? So this is to capture them as a pet?

Monika Bickert:

I guess. So it turns out that there are strict rules about when you are able to bait a field that will draw doves to it so that you can then shoot them.

John Carlin:

I see.

Monika Bickert:

And this field had been baited before the season opened, and the fish and wildlife agents, who were working with the state officials, knew that this field had been baited. And so they went there on the first day of the season to see who showed up to hunt it. And those people got the misdemeanor.

John Carlin:

Did the doves cry?

Monika Bickert:

I heard what it sounds like when doves cry.

John Carlin:

Oh my gosh. Yeah. I’m going to have you sing that as our outtake. You ended up making a big switch. Was there a time when you were prosecuting, before you arrived at Facebook, where you ended up encountering some of the issues that you ultimately confront in your current job?

Monika Bickert:

Absolutely. When I was in Chicago, I was very drawn to the crimes against children work and ended up overseeing crimes against children case assignments and dispositions. And that already involved a lot of cases that were committed by people sharing images online; often they were using file servers or other ways to share mass quantities of illegal child exploitation imagery. Also, though, I had a case back then that involved an individual trying to lure minors by portraying himself as a young high school girl and getting some young boys to send him images of themselves. And then I also had a case that involved a ring of individuals who were connecting online and were committing sex crimes against children remotely over webcam for one another.

Monika Bickert:

There was actually… The bulk of that child exploitation work involved the use of the internet in the commission of the offense. And at the time it didn’t really occur to me that there was anything that could be done to root out that sort of behavior from the platform’s perspective. Now, of course, we have many, many ways of doing that, and if you look at the child exploitation imagery that we remove from Facebook, more than 99% of that, well over 99% of that, we find ourselves using technical tools. So the state of technology has just gotten better and the role that companies can play has become very clear.

John Carlin:

And before you ended up working on that type of issue at Facebook, there’s one other turn of your career where you go overseas and are working on unrelated issues through a program that was partly designed to try to have people in US embassies with law enforcement experience who could work on some of these new technology issues. Tell us a little bit about that.

Monika Bickert:

That was one of the most interesting jobs I can really imagine having. My late husband and I were both at the time prosecutors in Chicago, and we had the opportunity to go over to Thailand. We were based in the US embassy there along with another federal prosecutor who actually also now works at Facebook. And the three of us were responsible, not just in Thailand but in some other countries as well, for everything from training local law enforcement and lawyers who were involved in criminal justice on both sides, to working on things like extraditions and rule of law development. It was a wonderful experience because I got to really see how different systems work. For instance, if you look at the Lao legal system, there are elements of the criminal justice system that look similar to ours, but when you actually go up there and try to observe it, you realize how vastly different it is and what the expectations are.

Monika Bickert:

At least when I worked there, there was not an expectation if you were a criminal defendant that you would be present for your own trial or that you would have counsel or that the judge would even hear from witnesses, the judge might just read the police report. And so when we look at our system and then these other systems, it really helps build an appreciation for the importance of rights and due process and that’s something that I carry with me into the current job.

John Carlin:

Little did you know that you were preparing for the job that you would one day have by seeing the different ways people enforce these policies globally. But in your current job at Facebook, you are the head of both product policy and counter-terrorism. Tell me a little bit, how do those overlap?

Monika Bickert:

When I came to Facebook, I was working just in a straight legal role, and then I ended up taking over what most people call content policy, that is, the rules for what people can post on the site, what people can advertise on the site and so forth. And then around 2014, 2015, we saw a real increase by terrorists in trying to use social media for organized dissemination of propaganda. And of course, ISIS is the group that comes most immediately to mind, although they certainly weren’t and haven’t been the only player there. Because of that, we decided that we needed to have a more coordinated effort across the company to tackle that.

Monika Bickert:

And what I mean is, in the past we’d have the policy people, like my folks, writing the rules and overseeing the enforcement of those rules, and then you might have engineers who were working on ways of keeping recidivists that had been kicked off the site from coming back. You had the public policy teams, those folks who work on government relations, talking to governments about what issues were there but not necessarily relaying that to the policy team. And so we decided we’re going to have one head of counter-terrorism, and that was me, to bring all those elements together.

John Carlin:

Just taking one step back, why did you join Facebook in the first place? I know you loved being a prosecutor and working for the government. What made you decide to make the switch?

Monika Bickert:

Well, I was moving back to the US for personal reasons and I was approached about the… As I said, there were three of us who were over in Thailand who were prosecutors; one had moved back and had joined Facebook, and he actually approached me about checking out Facebook. And I came here and interviewed. And it really struck me as an interesting place to work because it was a lot of the same issues that I was working on as a prosecutor. But when you work as a prosecutor, you’re often working very deeply on a small number of cases, a comparatively small number of cases. And you get to know the people involved in those cases, both the witnesses and the defendants; perhaps you get to know them pretty, pretty well and you know a lot of details, but it’s a small number of cases. And when I interviewed at Facebook, what I saw was the potential to have impact across a much larger segment of the world.

John Carlin:

How many? How big is Facebook now?

Monika Bickert:

We’re well over two billion users. Across all of our services I think we’re up around three billion.

John Carlin:

That’s pretty big.

Monika Bickert:

Yeah. So we knew… And now look, when I joined Facebook, this was back in 2012, we had something like 2000 employees. And I remember when we were at 800 million users and then I remember when we hit one billion and everybody was amazed internally that we had gotten that big. And of course now we’re a lot bigger than that. But as we were growing along the way, one of the focuses has remained, how do you have that impact of doing what you can to keep people safe and create a safe environment, an environment that really fosters safety for people who are all over the world in very different settings. And one of the things that’s been great is to see… If you want to build the best safety mechanisms for a social media platform, you need resources.

Monika Bickert:

And as we’ve grown as a company, part of what that’s meant is that we now have dedicated engineering teams. We literally have 35,000 people that are working on safety and security across the company. And that includes a lot of engineers who are helping to design systems to root out bad behavior before it becomes a problem. For instance, they take down millions of would-be fake accounts every day before they have a chance to do anything on the platform. And so that’s been fun to watch as we’ve gotten bigger, just how those resources have led us to a place where we can have more of a safety impact.

John Carlin:

You’re talking about resources of a stunning scale, both in terms of the investments that you’re making in technology and also the number of people. And does that mean Facebook is too big, that you’re in charge of these decisions, or does it mean that other companies are too small because they don’t have the resources to tackle these issues?

Monika Bickert:

From a safety standpoint, there’s no question that our size helps keep people safer, and actually not just people on Facebook. As we’ve grown bigger, we’ve had the teams to build these tools and to learn from what we see on our platform, and we share these tools with the smaller companies. So actually all of social media benefits from what these bigger companies are developing and learning. That’s true in child safety, it’s true in terrorism, it’s true in other areas, from Microsoft sharing PhotoDNA, which has allowed a lot of companies to remove child exploitation imagery, to our terror propaganda removal technology, which we’ve shared with other companies. And we actually have a database of images that helps smaller companies remove that sort of content. You can’t build that sort of thing if you don’t have the size and the resources that we have.
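To make the shared-database idea Bickert describes more concrete, here is a minimal sketch of how a platform might check uploads against an industry-shared list of known violating images. It is purely illustrative: PhotoDNA and Facebook’s own matching systems use proprietary perceptual hashes that survive resizing and re-encoding, while this sketch uses a plain SHA-256 digest and a hypothetical in-memory hash set, so it would only catch exact copies.

```python
import hashlib

# Hypothetical stand-in for an industry-shared database of known violating
# image hashes. Real systems (e.g., PhotoDNA) distribute perceptual hashes
# that tolerate re-encoding; a SHA-256 digest only matches exact byte copies.
SHARED_HASH_DATABASE = {
    "placeholder_digest_1",
    "placeholder_digest_2",
}

def is_known_violating_image(image_bytes: bytes) -> bool:
    """Return True if the upload matches a hash shared across companies."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in SHARED_HASH_DATABASE

def handle_upload(image_bytes: bytes) -> str:
    # Known material is blocked before it is ever distributed on the platform.
    if is_known_violating_image(image_bytes):
        return "blocked"
    return "published"
```

The design point is that the expensive work of identifying the content happens once, and the resulting hashes can be shared so that smaller platforms get the benefit without building the detection capability themselves.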

John Carlin:

Facebook has a policy around dangerous individuals and organizations and that can lead as you’re just describing to the closing of accounts. When was that policy formed? And can you walk us through a little bit how it’s changed?

Monika Bickert:

Sure. That policy was actually in existence before I came into the job. I came in, I think, around 2013, when I switched over to this current role, and there were already very strong policies in place, put together by a team of mostly non-lawyers, by the way, including one that said that we don’t allow organizations that commit acts of violence or have a mission of propagating violence. And so your standard terror organizations, including those who are on the US list but also some that at that point had not yet been listed, like Boko Haram, were already banned from our services because of the violence that they commit offline.

John Carlin:

And so that has evolved, right, over the years. So originally it focused on terrorist organizations, some that are designated, and then this year we had the designation of a violent US-based anti-government network as a dangerous organization and it was banned from the platform. It uses the term boogaloo but it’s distinct, I guess, from a broader Boogaloo Movement. Tell me a little bit, was that a shift, or was it just an application of a policy to new facts?

Monika Bickert:

I think it shows how you have to be willing to adapt to changing tactics. So historically, when we thought about dangerous organizations, terror groups and hate groups, they tended to be very organized. There’s a central leadership structure when we talk about groups like ISIS; there’s actually very organized dissemination of propaganda. They have one language that they use and the terms mean one thing. When you look at some of the movements that we’re seeing now, they’re much more amorphous. Boogaloo, for instance, is a term that’s used by many different actors to mean many different things.

Monika Bickert:

And when we started looking at them, there was no sort of head of Boogaloo, if you will. There’s no Boogaloo website with a statement of what their ideology is. And so what we instead had to do was look at the way that that term is being used, and of course within the broader group of people who use that term, you do start to see groups and iconography and terms and common goals. And that’s really where we put our focus. So Brian Fishman’s team, the dangerous organizations team, ultimately got to the point where they were comfortable designating a group of actors who were using that term, and that allowed us to end up taking broad action against this Boogaloo Movement. But we acknowledged there are certainly people who could use that term in ways that have nothing to do with violence or encouraging violence.

John Carlin:

Yeah. I find that interesting, particularly since my perspective is one of tackling these issues from within government, and there was a stark divide between the legal tools that are available when you’re pursuing a terrorist group located overseas, a foreign or international terrorist group, where the group itself can be designated and then you can ban support, or material support, to the group, versus the applicable laws for domestic terrorism. And there’s been a lot of discussion over whether that should change as a matter of federal law, but to simplify slightly.

John Carlin:

I think when it comes to international terrorism, essentially you can ban the group and make it a crime to support the group. But for domestic terrorism, it’s conduct-based and there usually has to be some other crime. And obviously it’s very different to think about the use of the federal government’s system to bring a prosecution versus a private platform like Facebook and the way that it moderates speech. But I’m sure that there were internal debates and discussions as you moved closer to something domestic like you did here. Can you talk us through a little bit about the way you think about the difference between the two?

Monika Bickert:

Sure. And I should be clear, Boogaloo was not the first time that we had removed an organization that was operating domestically. We have for years designated organizations who commit violence within their own state, or hate organizations that are focused on their own state. I mean, we have more than 250 organizations that we’ve designated as white supremacist organizations, a number of them in the United States. And we treat them the same way we treat terror organizations, which means three things. It means that they can’t have a presence on our services. So if we see a page or an account or somebody purporting to be somebody from one of these organizations or to represent that organization, we’d take it down. And then it also means we don’t allow praise, and finally we don’t allow support. And so if you had somebody, for instance, post on Facebook, “I love the KKK. They’re a great group.” That’s something that we would remove regardless of who posted that.

Monika Bickert:

And that’s something that we’ve done for years. So that part is not new. Now, what is interesting, and this is again different, as you point out, from law enforcement, is that in some ways we have a lot less information. We basically just have open source information, what’s out there publicly, about what this group has committed when we’re looking at offline behavior. So if we want to look at who’s committed an act of violence, we’re not out there doing some sort of investigation out in the field; we’re looking to see when final determinations have been made publicly about who’s responsible. In that sense we’re very limited, but then we also can look at people’s online behavior.

John Carlin:

Can I just pause you one second there?

Monika Bickert:

Sure.

John Carlin:

So if you were to look for an international terrorist group like ISIS, it would be designated. It might be through a UN resolution. It might be through lists that are promulgated by the State Department. There are places where you can officially look to see who’s designated as the violent group. If it’s the KKK or another group, there isn’t such an official list coming from the United States government, so what do you use?

Monika Bickert:

When it comes to terror groups or violence we look… The US list we use just as a matter of law. So those groups are not allowed on the platform. But beyond that, we look to see if groups have proclaimed a violent mission or engaged in some sort of documented act of violence. When it comes to hate organizations, it’s a little different, we’re looking at a combination of whether they have a hateful mission proclaimed. So for instance, you might have a group that says our mission is to make sure that people of this race or this religion reign supreme and that these others have no place in our society or to root out and kill members of this group, that sort of thing of course would qualify them.

Monika Bickert:

But we also look for organizations who use iconography. For instance, let’s say they use the swastika as their icon. There are other sorts of flags and logos and other things that have the same sort of effect under our policies or let’s say that they regularly use hate speech or calls for violence against a group of people based on their protected characteristics like race or religion or let’s say they’ve committed… Their leaders have engaged in hate crimes and been convicted of hate crimes. Those are all sorts of things along with their on platform speech that would lead to us banning them. And when I say on platform speech, I mean like them using hate speech on Facebook.

John Carlin:

As a matter of process are you the decider? Who makes the decision ultimately whether someone gets banned?

Monika Bickert:

Most of the time, yes, but if there’s something that is particularly difficult or nuanced, or I know that it’s going to be very high profile, I’ll definitely flag it up the leadership chain, and these are the sorts of decisions that Mark Zuckerberg, our CEO, and Sheryl Sandberg, our COO, do want to be a part of. They want to make sure that they understand what we’re doing to keep people safe, and these policies are an important part of that.

John Carlin:

Would you say in 90-something percent of the cases it stops with you? And then, how many do you think go up that extra level on the chain?

Monika Bickert:

When it comes to content decisions overall, we’re making tens of thousands of decisions every day, so the vast majority are resolved before they get to me. When we are designating organizations, then they do come up to me, and as a regular course of business we would make the senior leadership of the company aware and answer any questions. And in unusual cases or cases that are really tricky, I would make sure to get their input first.

John Carlin:

And is there an appeal process, to use our old system, or akin to the court system? Can someone say, “Hey, you got it wrong. The KKK is misunderstood. I want to be off the list.”

Monika Bickert:

We do have appeals. Now, historically we offered only account-level appeals, so this sort of thing: my account’s been removed and I don’t think it’s fair. And then a few years ago we began offering appeals to everybody at the post level. In other words, you post something on Facebook, we remove it, you have the right to appeal and have us take a second look. That’s one of the things that’s been tricky, though, during the time of COVID, because we are operating with fewer content reviewers in the office right now, just because we need to make sure those reviewers are safe. So we started sending them home in March. Most of them are still not back in the office. And so now we have appeals available at the account level, but for that post level, it’s much harder for us to offer that right now.

John Carlin:

One group that’s getting a lot of attention right now is Antifa, and there’s a big public debate about what that group even is and how much of a threat it really poses. And then, different but related, there are gangs like MS-13. Could you just walk through what your current policies are with those two groups and how you tackle those issues?

Monika Bickert:

Sure. We don’t allow any groups that are perpetrating violence. And we list this actually in our community standards, where we say, if there are criminal gangs or other criminal organizations involved in these crimes, and there’s a list of crimes, then we’ll remove them from the site. And sometimes people will report those pages or groups to us; other times we will use our own technical tools to find them ourselves. And that includes groups like Antifa. If we see, for instance, an event that is organized where they’re encouraging people to engage in violence, that’s something we would remove. We also would remove events or other activity if it’s something that’s organized for a hateful purpose. So violence and hate are really the cornerstone of why we’re going after groups like this.

John Carlin:

We’ve talked a lot about groups; let’s switch the focus a little bit to individuals. So, banning high-profile personalities. I know you’ve put out in a statement that you’ve always banned individuals or organizations that promote or engage in violence and hate, but it seems like there’s been an increase in taking actions against certain individuals, individuals who have high profiles, including the Nation of Islam leader Louis Farrakhan, the conspiracy theorist Alex Jones, Milo Yiannopoulos and others. Is there a switch in the way that you’re handling individuals?

Monika Bickert:

No, not in terms of the policy, although there has been an increase in the resources over the years that we’ve allocated to this. So we now have… I told you that we started having that cross-functional group focused on terrorism prevention efforts across the company a few years ago; well, we now have more than 350 people who are working to counter the use of our platform by terror organizations and hate organizations. So we just have a lot more people that are focused on this now. We’re also more public than we used to be.

John Carlin:

Can I push you a little bit just on the number side and then switch a bit, because I think you had said you had around 30,000 or 35,000 people focused on public safety and security issues, and of those, though, only around 150 to 300 of them are focused on policing platform content?

Monika Bickert:

No, let me be clear. So we’ve got more than 35,000 people who are focused on safety and security across the company, that includes everything from our content reviewers. These are people who as stuff gets reported, it comes into them, they look at it and assess it against all of our policies and remove it. That’s more than 15,000 people.

John Carlin:

That’s bigger than the FBI, by the way, just for those listening at home. Well, go on.

Monika Bickert:

That also includes our engineers who are working on building the tools to find this stuff. It includes some lawyers who are assessing whether or not something needs to be referred proactively to law enforcement. And then it also includes some specialists. And within that group of specialists, we have that dangerous organizations team, and that includes more than 350 people that are focused on understanding the way that hate organizations and terror organizations work. They also maintain close relationships with experts out in the field, including academics and relevant government organizations, to make sure that they are initiating that designation process and removing those groups.

John Carlin:

And you recently announced that you removed some 40% more terrorism content in the second quarter than in the first. You were just saying a little bit that maybe that’s attributable to technology and resources, but-

Monika Bickert:

A lot of that is technology. So every six months, and now we’re moving to quarterly, we put out a report that shows how much content we removed from the site in certain categories, and that includes terrorism but also things like bullying and harassment or hate speech or fake accounts. And then we say, here’s how much of it we found ourselves before people reported it to us. And with terrorist content, our numbers have jumped in terms of what we’re removing. We remain at a very high proactive rate, meaning that well over 99% of what we remove, we find before anybody reports it to us. The majority of that content is propaganda, and the sort of posts that technology is good at catching. Technology’s good at catching things like exact matches or near matches of content that is being shared for wide distribution. It’s much harder for us to catch things that are being created in an individual way, that are new posts. With terrorism, most of it is the former.

John Carlin:

Because I remember you saying that you weren’t sure technology would ever be a good fix here because, unlike for instance child pornography, where you can create a library of images, and once there’s a bad image out there it’s a bit-for-bit identical image and then you can search for it and block it, that speech was so much more diffuse that there might not be a technological solution. But it sounds like you’ve made technical progress on that. Could you walk us through how you handle that challenge?

Monika Bickert:

Yes. Now, I would still say that speech and text is trickier because of the context. With imagery, and a lot of terror propaganda now is imagery, often we’re not using fancy things like artificial intelligence; we’re using image matching software to catch the most commonly shared known terror propaganda images. When we look at text, it has been very hard to build systems that can find words that an individual is using together in ways that we haven’t seen before, but we have gotten a lot better at that. So if you look at hate speech, for instance, I remember telling people three or four years ago, I would say technology is not really any good against hate speech and I don’t think it’s really going to be. And now 95% of the content that we are removing for violating our hate speech policy, so that’s a lot of the text stuff, we are finding with our technology before people report it to us.

Monika Bickert:

But I will say most of that still has to go over to real people to look at it and assess it before we remove it, because you can have somebody using, say, a racial slur to attack somebody, but you can also have somebody using that same slur and saying, “Boy, somebody called me this today. I think it’s so horrible. We should never use this word.” And our systems might not know the difference, and so we flag that kind of content. It gets sent over to our reviewers who then make that assessment.

John Carlin:

And so does that mean in every instance, before you make a final determination, humans take a look at what the algorithm has spotted?

Monika Bickert:

No. We sometimes are able to use technology to make the full decision and that would certainly include things like links to say known pornography sites or matches with an ISIS terror propaganda video. We don’t need a person to look at that. Now with hate speech, we often do but there are certain times that hate speech could be, say, a meme that has a caption across an image that is hateful and our systems can recognize that and just remove it.
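As a rough illustration of the triage Bickert outlines, the sketch below routes a post three ways: an exact match against known propaganda is removed automatically, a very confident classifier score can also be actioned without review, and ambiguous text, such as a slur that might be quoted in condemnation, goes to a human reviewer. The hash set, thresholds, and score are hypothetical placeholders, not Facebook’s actual pipeline.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical set of hashes of known terror propaganda media.
KNOWN_PROPAGANDA_HASHES = {"known_propaganda_hash"}
AUTO_REMOVE_THRESHOLD = 0.98   # illustrative thresholds, not real values
HUMAN_REVIEW_THRESHOLD = 0.60

@dataclass
class Post:
    text: str
    media_hash: Optional[str] = None

def route(post: Post, hate_speech_score: float) -> str:
    # Exact matches with known propaganda need no human review.
    if post.media_hash in KNOWN_PROPAGANDA_HASHES:
        return "remove_automatically"
    # Very confident classifier hits (e.g., a hateful caption on a known meme)
    # can also be removed directly.
    if hate_speech_score >= AUTO_REMOVE_THRESHOLD:
        return "remove_automatically"
    # Ambiguous cases, such as a slur quoted in order to condemn it,
    # are queued for a reviewer who can weigh the surrounding context.
    if hate_speech_score >= HUMAN_REVIEW_THRESHOLD:
        return "send_to_human_review"
    return "leave_up"
```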

John Carlin:

And to distinguish, when we talk about a 40% increase in removal of terrorism content, is that different than hate?

Monika Bickert:

Yes, it is.

John Carlin:

Organization or hate speech?

Monika Bickert:

For purposes of that report and I’m generalizing a little bit here but mostly when we talk about terror content, we’re talking about propaganda for terror organizations. And when we’re talking about hate speech, it’s a person making a comment that attacks or degrades other people or a group of people based on say their race or religion or gender or nationality.

John Carlin:

So who wins in the case of a tie? Just curious. So let’s say it’s the KKK, and they both are a group that advocates violence in some circumstances and they also have racist writing and imagery. Are they hate or terrorism? Do they double count?

Monika Bickert:

If we’re designating a group, a terror organization or a hate organization, there’s no functional difference. They can’t be on the sites and we can’t… I’m sorry, they can’t be on the site and we don’t allow anybody to praise or support them. So it really doesn’t matter. But if an organization has engaged in violence, they will be marked as a violent organization.

John Carlin:

And one thing, just getting into the details of it a little bit, that I found fascinating is that you have a list of prohibited language. So for instance, comparing a group of people to insects, along with protected characteristics like physical appearance, to tell users what isn’t allowed. And as you’re describing, I can see why it would be difficult for reviewers to otherwise identify hate speech. But it’s just interesting that there’s this list of explicit examples. Who comes up with the list?

Monika Bickert:

John, it’s a fascinating process. I have learned so much from doing this over the past, whatever it’s been, seven years, eight years. When you think about writing laws, those laws can be interpreted in regulations. They can be implemented by fact-finders who get to talk to witnesses and read reports and get all kinds of context. Our reviewers do not have that benefit. We have more than a million pieces of content a day reported to us, and then we’re also finding additional content with our technical tools. And so our reviewers basically just have the context that is contained in that post or on that page or whatever has been sent to them for review. They don’t get to go talk to people. And so we have to write review guidance for them that is incredibly granular. We also have to make sure that we’re getting consistent decisions whether the reviewer is seated in Germany or India or Texas or wherever. And so there’s no room for subjectivity.

Monika Bickert:

And so that’s why when you look at our community standards and most of this is just out there public for people to see. When we say hate speech, it doesn’t say, if you say something really mean about somebody based on their race or religion, it says, if you say any of the following types of attacks and then it goes into things like you were describing, for instance, comparing people to scum or filth or comparing them to insects. We develop that list, those definitions through a lot of engagement with experts outside the company. And then of course, we look at research as well about how terms are being used on our services.

Monika Bickert:

But a lot of this is through relationships with literally hundreds of organizations who are focused on these types of issues and they can tell us what they’re seeing online and what their constituents are worried about. And then we’ll assess that on our services. When it comes to things like comparing people to insects that’s historically, if you look at genocides and violence against minority groups, often that’s the sort of language that you see before that, comparing people to rats or to cockroaches. And so that’s something we don’t allow.

John Carlin:

It’s a rough job to be the person doing the content moderation, just looking at all of the content that people think is terrible in the first place, along with images. I remember back when we were at Justice and Attorney General Ashcroft had been appointed and wanted to set up an obscenity taskforce.

Monika Bickert:

I remember that too.

John Carlin:

And I talked to folks at the FBI at the time, and no one wanted to be on the FBI team. They had to go look at the images. It was terrible. And it caused mental health issues, and people even have that issue sometimes when they’re dealing with not just obscene material, which is even worse in some ways because it’s a combination of everything terrible, but also child pornography, which in and of itself is terrible to look at. I noticed that Facebook ended up in a settlement with some of its content moderators that included providing mental health services. How do you deal with helping people whose job it is to look at the worst of the worst?

Monika Bickert:

That’s something that we’ve been focusing on since I came to the company, and I do draw on my government experience, as you do, in thinking about this. When I was a prosecutor, I worked on a lot of the sex offenses against children cases. And those images are so tough to look at that, as you know, the law enforcement officials who do it have to go through checks to make sure that they’re still okay. And so when I came to Facebook, one of the things I wanted to make sure that we had in place was counseling and resources. And then also, and this has been an important part of our program, training managers to spot when people are not doing well, because they may not spot it themselves. And so now, if you look at the way that we provide support to our content reviewers, they all have access to counseling, and that’s onsite.

Monika Bickert:

It’s also offsite. We have 24-hour access to counseling for them. Also, when they’re going through the job training, and this is really important, we need to make sure they understand the kind of work that they might be doing. And of course we use technology to try to minimize the times that our reviewers have to look at upsetting graphic content, but removing it is fundamental to the work that we do, and so that work does need to be done. So we try to give people a lot of clarity around the work that they’ll be doing so they can make sure that this is something that they’re actually comfortable doing. And then if they’re not, we provide pathways for them to get into different types of content review. In other words, you can say, I don’t want to do this, I don’t want to work on graphic imagery, and you can be reassigned.

Monika Bickert:

So those are the sorts of things that we’ve worked on and improved over the years and offer now at our sites around the world. John, this is also an interesting issue for us. You asked earlier about appeals, and I said that we had sent our content reviewers home. A lot of the work that we need people to do is stuff that really can only be reviewed in the office, either for privacy reasons or because the content is particularly sensitive. And so it’s been a challenge for us, and I know for other social media companies, during the time of COVID to make sure that our reviewers are safe. That’s been our top priority and that’s why we sent many of them home. But it also means that it is harder for us to review that content. And so it’s meant for us leveraging more technology and reducing the amount of appeals that are available. There’ve been some tough decisions and trade-offs there.

John Carlin:

Look, we have to talk about it, because inevitably, particularly as you move more towards political speech rather than speech around violence or terrorism, it becomes more subjective. What’s the worst mistake you’ve made, or an example of a mistake that gets reversed on appeal, where people are trying to apply standards but end up blocking something that really doesn’t meet your rules?

Monika Bickert:

I hate to be boring, but I’m not sure I could pick a worst. Honestly, it’s painful every time. We have both situations where we’ve removed something and then we look silly for having removed it. One example that comes to mind, and this is not particularly serious by the way… This is years ago, somebody shared a picture of a cake that they had made and our system recognized it as nudity and removed it. And there were articles making fun of us for that. Or there was a bronze statue in a park and the ACLU… This is actually how I met some of my counterparts at the ACLU. The ACLU shared an image of the statue in this park and our technology recognized it as a nude person in the park and removed it.

Monika Bickert:

So that was a fun conversation for me. I was pretty new in the job then. So we have things like that with false positives, where we remove something that we shouldn’t have. And then of course we have things that we miss, and the most painful ones for me are when people write in and they say things like, this was a death threat against me, or this person called me this really horrible thing, and this is the message I got back from Facebook. And then it’s a message from us saying, “Thanks for reporting this. It doesn’t go against our community standards.” And it’s clearly just a mistake, but that is something that’s painful every time. I do think we’ve gotten a lot better over the years, and that’s in large part because we’ve hired so many more people and because our technical systems have gotten better too, but at our scale, we are always going to make mistakes and that’s hard.

John Carlin:

I know you’ve gotten additional resources since you started; both the company has grown and you’ve gotten a lot of additional resources to try to combat these issues. Partly that’s a result of a lot of outside pressure, and there’s been particular pressure lately around hate speech and some of the more complicated issues that we’ve been discussing, and whether or not Facebook’s doing enough to enforce its policies and also whether the policies themselves should be more robust. I know the company self-commissioned a civil rights audit. Tell me a little bit about what your takeaways were from the findings of that civil rights audit and what you thought of that process.

Monika Bickert:

I learned a lot from it, and it was certainly worth doing… And I’m so glad that you said, I know you self-commissioned that, because it’s funny, but in all the headlines a lot of people have sort of missed the fact that this was something that we undertook voluntarily because we wanted to make sure that we were doing all we could in this area. And so did the report have some recommendations on how we could improve? Absolutely, but that was sort of the point. And it’s been the past two years that we’ve been working with the auditors. And so along the way, my team and other teams at Facebook have been having conversations with them and implementing things that they’ve said. So there have been incremental improvements along the way. And just to name a couple of those: we’ve had a policy in place for years where we’ve removed content that could suppress the vote by misleading people about how or where they could vote, just sort of the basics.

Monika Bickert:

And by working with the civil rights auditors, we’ve expanded the sort of content that we’ll remove. We now, for instance, remove content that threatens that people will be arrested if they show up to vote or deported if they show up to vote. We remove content that encourages people to bring weapons to polling places. We applied our voter suppression policies to census participation. So if people were trying to mislead people about how to participate in the census, we would remove that as well. And then they also looked at our enforcement and they had concerns about hate speech false positives.

Monika Bickert:

What I mean by that is, let’s say it’s an activist who’s saying this is a slur that is being used to target our community and they post that on Facebook and we remove it then that really hurts that activist and her community. And so we have been working on a series of pilots to combat those sorts of hate speech false positives. And then we’ve also expanded our policies in other areas like around white nationalism. That was an announcement that we did over a year ago but it was very much in partnership with the civil rights auditors. So a lot of the value in doing an audit like this is in those conversations that happen along the way.

John Carlin:

That hate speech false positive issue is particularly interesting. It seems like a difficult one to solve with tech; does it just require additional human resources, or is there some way to modify a technical solution so that you can include both context and content?

Monika Bickert:

The technology has gotten better. And our engineers, especially using artificial intelligence, have developed classifiers that are pretty good at understanding some of the context, but at the end of the day we really do need people to look at a lot of this, and even the people will make mistakes. I mean, sometimes, and I remember this as a company, I remember us making a mistake early on in the terrorism setting where one of our terrorism experts that we talk to outside of the company called me up and said, “You guys just removed the biggest anti-ISIS site in a country in the Middle East.” And indeed our reviewers had looked at it, and they were seeing all this ISIS propaganda with captions underneath it explaining why this was wrong and what people should be doing to fight against it, but to the reviewer… The reviewer opened it up, saw that it was filled with ISIS propaganda and removed it.

Monika Bickert:

And so in the area of hate speech, sometimes understanding what the captions are and how language is meant to be used could be really tricky. For instance, somebody shares a video let’s say and in the video somebody uses a racial slur. It’s pretty easy if the caption says, “This is horrific, watch this.” But what if somebody doesn’t share it with a caption and what if it’s a news media organization but they share it and it just says, “This was a speech that was given.” And there’s no more context, that’s the sort of area where it gets really hard and tricky for us and for our content reviewers.

John Carlin:

And is that the goal of… I noticed you’re funding an independent, or what’s been described as independent, external oversight board. Are those the types of more difficult decisions where the board will make a call about whether content should or should not be removed, kind of on external appeal?

Monika Bickert:

Yes, it is. And it’s been described as independent because it is independent. Basically, what it is is a group of… Now, Facebook does set it up and fund it, but we’re funding it through an independent trust. And this is a group of, right now, 20 oversight board members. They will add more to their ranks, but right now there are 20 of them, and they are from around the world and-

John Carlin:

Who selects them?

Monika Bickert:

There are co-chairs of the oversight board and they, along with Facebook input, will select these members. Once the members are in place, we cannot override their decisions and we cannot terminate them or somehow get rid of them from the board because we don’t like their decisions. So they can make the decisions they want on the pieces of content that are sent to them, and users can send content to them.

Monika Bickert:

So for instance, we remove your post, you appeal to us. We say, well, we’re still going to keep it down, we think it violates, you can then appeal to the board directly. And they have the discretion to hear your appeal. The other thing that we can do which I’m personally excited about is Facebook can send content to the board. So let’s say we have a decision that’s particularly controversial, we could actually send that to the oversight board and say we think that in this case you and not us should make that decision. Whatever decision the oversight board makes they have to make publicly. In fact, they’ll issue a decision that explains their thinking and then we have to implement it. And we’ll also respond to it publicly.

John Carlin:

This oversight board did get criticized by members of the House. They were concerned that the board and its members might be ill-equipped or ill-empowered to meaningfully improve the behavior of the company, and they used a provocative phrase in this letter, saying the board may act simply as a smokescreen behind which Facebook executives will maintain ultimate control. What do you say in response to that letter?

Monika Bickert:

Well, we’ve been pretty clear that the board will have… The board decides, we have no input. The board decides whether or not they will hear a user’s appeal and they decide what happens to that piece of content. We don’t have the ability to make the final decision or to override that.

John Carlin:

Another push lately has been an advertiser boycott, where a number of major brands said they wanted to pull their ad money from Facebook, starting, I think, with The North Face, and then it just seemed like there was an avalanche of other companies joining in around, in particular, what they viewed as a proliferation of white supremacist content, along with incitements of violence against protestors and widespread voter suppression efforts. Do you think that their criticisms were fair?

Monika Bickert:

If the goal of the boycott is to draw awareness to the issue of hate speech online and to try to stop it, we are fully on board with stopping hate online. There’s obviously nothing that we would like more than to be able to remove all of it from our services. Users don’t want it, advertisers don’t want it, we don’t want it, there are of course challenges in enforcing that policy for all the reasons that we’ve discussed but still if you look at the numbers, we remove more than any other company. The European Commission just put out a report as they do regularly on hate speech removals and we as a company performed the best in terms of removing hate speech from our services. And we also just put out our community standards enforcement report a week or so ago where these were our Q2 numbers.

Monika Bickert:

So these are from long before the boycott, but this is April through June. And you can see our hate speech numbers: we removed in that quarter 22.5 million posts for violating our hate speech policies, and 95% of those we found ourselves before anybody reported them to us. So can you find hate speech online? You can. And will we have things that get through the system? We’re working hard to make sure that we don’t, but sure. We’re not going to be perfect here, but we’ve made vast improvements and we’ll continue to.

John Carlin:

We talked about it a little bit already, but on AI and machine learning, what are the prospects when it comes to hate speech, where it becomes more complex? Do you think we’ll continue to see progress? Have we reached the maximum at which technology can help and where you need humans? Where do you foresee us a couple of years out?

Monika Bickert:

I’ve learned a lot here because I used to be a skeptic on this but I’ve been converted by Guy Rosen and his wonderful team of engineers. And I think they’re making tons of progress in this area and they’re going to keep on doing it. And so what I mean by that is things that are textual posts where you really need a lot of context to understand, we are using reviewers to label things, to help inform our technology and the technology is just getting better and better. So it’s that combination that back and forth with people and with machines that I think is going to lead us to get a lot better.
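A minimal sketch of the reviewer-in-the-loop pattern Bickert describes, assuming scikit-learn is available: posts that human reviewers have already labeled train a simple text classifier, and the classifier’s uncertain scores would be routed back to reviewers. The example data, model choice, and threshold are illustrative only; production systems use much larger labeled sets and far richer language models.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical posts already labeled by human reviewers
# (1 = violates the hate speech policy, 0 = does not).
reviewer_labeled_posts = [
    ("people of that group are filth and don't belong here", 1),
    ("someone called me that slur today and it was horrible", 0),
]
texts, labels = zip(*reviewer_labeled_posts)

# Reviewer decisions become training labels for a simple TF-IDF + logistic
# regression model; the same loop applies to much larger models.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(texts, labels)

# Score a new post; mid-confidence scores would go back to human reviewers.
score = model.predict_proba(["a new post to evaluate"])[0][1]
print(f"estimated probability of violating the policy: {score:.2f}")
```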

John Carlin:

Part of the reason why you might be getting attention like the boycotts would be, it seems like Mark Zuckerberg has had to switch a view, or is struggling with some of these issues as well, in terms of, what are you? What is Facebook? Are you a social media company? Are you a publishing company? And what should you prioritize? And for a period of time, it really sounded like he was prioritizing the First Amendment over some of these concerns about content, and now has shifted. Is that a fair way to describe it?

Monika Bickert:

No. If I would unpack some of the narratives we’ve been seeing in the US a little bit, what occurs to me is this: over the past however many years I’ve been doing this job, most of the criticism that we get in the area of content policy is around enforcement. Meaning, everyone kind of agrees that the lines are where they should be, maybe you have little disagreements about what exactly is hate speech or what exactly is terrorism or bullying, but everyone knows it’s bad and everybody pretty much agrees we should remove it. And then they’re upset and frustrated when we’re not good at removing it, as are we, by the way. But most of the media coverage that you’ve seen over the years has been about enforcement challenges. What’s different now is that in the United States, people really disagree on where the line should be in the first place.

Monika Bickert:

And that includes things like what private companies should do around misinformation, or what sort of transparency should be added to advertisements or political pages. And literally, on some of these issues, when you ask people, and I've done this, not really during the time of COVID but before COVID, I would literally be in a room of people and ask them to raise their hands: how many of you think that a company should make decisions about what is true and false and remove it, and how many of you think they shouldn't?

Monika Bickert:

And people are very split on these issues. So what we've tried to do, both through Mark and through the posts that we put up in our newsroom, and through the speaking that I and others do, is be very public about what our values are. And our basic value is that we want to be a place for expression unless something is going to make people unsafe. So we list expression, safety, dignity, privacy, and authenticity as our values. And those are really the foundation for all of the rules that you and I have discussed in so much detail.

John Carlin:

Yeah, let's move on, because this is a really fascinating, difficult set of issues when it comes to labeling misinformation and also political speech. And as you say, Mark Zuckerberg gave a speech at Georgetown and said, "I don't think most people want to live in a world where you can only post things that tech companies judge to be 100% true."

Mark Zuckerberg:

And even when there is a common set of facts, different media outlets often tell very different stories, emphasizing different angles and aspects of the story. There's a lot of nuance here. And while I certainly worry about an erosion of truth, I don't think most people want to live in a world where you can only post things that tech companies judge to be 100% true.

John Carlin:

Misinformation is very different from terrorist speech or hate speech, and harder to regulate. And it seems like you started to move in this space with an approach not necessarily of taking it down but of using warning labels. And just as you're moving into this area, you have a particularly hard case study in COVID-19, with people putting out different information about the disease, what might be true or not true, and how do you evaluate that. So tell me a little bit about what the process is, how you try to label, and how you deal with the fact that there's such a hot debate right now, including from the President of the United States, over what is and is not true?

Monika Bickert:

This is probably the most difficult area for us. And for one reason really, which is that you can't give people a set of rules in advance about what's true and what's false. You can define hate speech; people might not agree with your definition, but you can say to your content reviewers and you can say to your users, here's how we're defining it, and this is what we're going to remove. You can't do that with what is true and what is false when there's new information out there every day. And so it is by nature more difficult.

Monika Bickert:

What we have done, and it's been fascinating to be a part of this because no company had tried anything like this, we were the first to do it and we have the most robust program: in early 2017, we started building a series of relationships with independently certified fact-checking organizations, and we started out with a handful. We now have more than 70 such organizations around the world that we partner with. And what it means is, if people are reporting content as false, or if our systems are detecting, by the way that content is being shared, that it might be false, then we will send that information to these-

John Carlin:

Sorry, can you explain that?

Monika Bickert:

Sure.

John Carlin:

How would sharing information help you determine that it’s inaccurate?

Monika Bickert:

Let's say that people tend to share a certain article and then tend to remove it, and their friends are posting comments saying, "That's a hoax, you goofball, didn't you know that?" That behavior of sharing and then retracting might be something our systems could recognize. So for technical reasons, but also based on user reports, we could send content to these fact-checking organizations, and they also have the ability to just proactively say, we're going to fact-check this thing that's on Facebook. If they rate something false or partly false, if they use one of our labels, then we will apply that label as a screen that sits over the piece of content. People have to click through it to get to the content underneath. And then we also link people offsite to the fact-checked information, so people can get the real story.

Monika Bickert:

And that's something that we began but have expanded. There are also several categories of misinformation that we will remove. We don't just label it and reduce its sharing or distribution; we will actually take it off the site. And that includes misinformation where there's an immediate safety risk. So let's say it's misinformation that could contribute to riots in Sri Lanka, that's an example of something we've removed, or it's something that could contribute to the risk of somebody getting COVID-19. And then, sorry, we also remove misinformation around voting and census suppression. Those are areas where other companies, for instance Twitter, are applying labels to some of that type of content; for us, that's content that we will actually remove from the site.

Monika Bickert:

And when we do that, we do it in partnership with safety organizations, with electoral officials, or with the World Health Organization, whatever organization has the information to allow us to make an accurate determination about whether something is true or false. And COVID-19 has really been an interesting place to see these policies at work. In the past few months, the past three months I believe, there have been 98 million posts that we have labeled, that's with the third-party fact-checking label, and whose distribution we've reduced. And then there have been an additional seven million posts that we've actually removed from the site for sharing what I would call harmful COVID misinformation. That would be something where, if somebody believed the misinformation, it could lead them to engage in dangerous behavior. So for instance, if you say there's a cure for COVID, don't worry about it, or did you know that if you have blonde hair you're immune and can't get the virus, that sort of thing, which we've seen around the world, certainly not just in the United States, we will remove from the site.

John Carlin:

I was fascinated, there's a quote from Guy Rosen. I'm always interested in labeling. It's kind of like teenagers: if you label something adult content, I think they're more likely to go try to watch that movie or see that video. But I noticed a quote from Guy Rosen, who said that when people saw the warning labels on COVID-19 misinformation, 95% of the time they did not go on to view the original content. That seems pretty successful.

Monika Bickert:

Yes, I would agree with that. Although, look, I also want to point out the other side, which is that part of that is because you're just putting friction into somebody's experience. And it's a reason that we have to be so careful with how we do this. For instance, we have labels that we will put over certain graphic content saying, this is really sensitive and may be upsetting, so only see this if you want to click through. And that does really reduce the number of people who will see something. But if you talk to, say, an animal rights group, they'll say, "Well, that's terrible. We really need people to see this content, we need them to take animal abuse seriously, and you're keeping people from seeing it by putting friction into their experience." So there are two sides to that, but yes, I do think the labels are proving to be fairly successful against the sharing of misinformation.

John Carlin:

What was the decision like in terms of a post from President Trump around COVID and labeling it as misleading?

Monika Bickert:

There was a video that we removed recently. It was a post from President Trump, and it contained a statement from a doctor saying that children are basically immune to the virus. We have in our policies a list of things that we hold as contradicting information from the WHO and the CDC, and that includes saying that groups of people are immune to the virus. So that policy applies to President Trump just as it does to anybody else. In fact, there have been a couple of other presidents who've had COVID-related misinformation removed from their Facebook presence, including the President of Brazil and the President of Madagascar.

John Carlin:

So in this instance you had a clearly defined standard that you could use; this was in contravention of the standard and it gets removed. I know you also have the newsworthiness policy as it relates to politicians, and I was curious to push a little bit on how that plays out, because that's another one of these very difficult issues. Tell me, what is the newsworthiness policy?

Monika Bickert:

The newsworthiness policy, and I should be clear, this is really not at its heart about speech from politicians. It's about making sure that if there's speech by anybody on the site that is about something important to public discourse, we'll try to leave it up. We will still remove it if there's a safety risk, but we'll try to leave it up if we can. We put that policy in place back in September of 2016, following the Terror of War image being posted on Facebook. That's the image of the little girl in Vietnam, wearing no clothing, running away from a napalm attack. We removed that image because, perhaps not shockingly, we don't allow people to post nude images of prepubescent minors.

Monika Bickert:

But then we did restore that image, because we said, well, look, we know that this woman consents to the sharing of the image. It's a very publicly known image. It's also being used to raise awareness about atrocities. And so we left it on the site, and we put in place a newsworthiness policy that said we will consider the value to public discourse of leaving content up. When we apply that policy, which we do regularly, it means we'll leave up content that otherwise violates our policies. It is almost always content that is shared by people other than politicians: it's museums with exhibits that involve nudity, or it's news organizations sharing graphic content about something that has happened in their area. There have been only a very small number of cases that have involved speech from politicians, but there the standard would be the same. If there's something where there's a safety threat, we would still remove it.

John Carlin:

Well, let's talk about one of the calls that you made that was a difficult one, back when President Trump had a Muslim ban video.

Donald J. Trump:

Donald J. Trump is calling for a total and complete shutdown of Muslims entering the United States until our country’s representatives can figure out what the hell is going on.

John Carlin:

It was not taken down and not viewed as hate speech. And I think you at that time explained that that's because, upon analysis, you viewed it as advocating for a policy position on immigration as part of a newsworthy political debate. Is that the sort of decision that would now go to the outside board?

Monika Bickert:

It could. It certainly could. Either we could send it or people could appeal it directly. So we didn't make a newsworthiness exception there, although we know that the comment was certainly newsworthy. I mean, it was one of the most talked about things in the country at that time, but we did not find it to violate our hate speech policies, and yes, people could appeal that to the oversight board.

John Carlin:

And we're heading into an election. So in case you weren't having enough fun being attacked by both sides, you've been attacked, including at recent hearings, by conservative groups saying that you consistently have a liberal bias in making decisions on what to leave up or take down. We're now heading into the 2020 election, and I noticed you recently added a voting info label to President Trump's posts claiming that mail-in voting may lead to a corrupt election. What led to that decision, and what are you doing to prepare for your role in the 2020 election?

Monika Bickert:

Oh my gosh. Where do I start? This is something that occupies a good portion of our time right now. We've focused on elections for years, and there are more than 100 elections around the world that we have to think about and make sure that we're dealing with all of them, but certainly the upcoming US presidential election is a focus for us. And we now have a team that is dedicated to working on all the different aspects of making sure that we're doing our part to help the election go smoothly. That includes things like our approach to misinformation and voter suppression, which we just talked about, taking down fake accounts or influence operations, which we can certainly talk more about. And, this is maybe the most important thing, getting people accurate information about the election and what's going on. And this is really going to be important in the US election because of COVID.

Monika Bickert:

Because with COVID, we're going to see a lot more use of mail-in ballots or absentee voting. We may see delayed election results. It's just going to be a different kind of election, and people are really going to need to be able to get accurate information. So when you asked about the informative label that we put on the president's post, that's actually something that we're doing across the site. Whenever we have posts where people are talking about voting, we're directing them to a new center that we just launched, the Voting Information Center. You can find it if you just go to the search function on Facebook or Instagram and type in voting; you'll be taken to the site. And this is a very bold effort. I think it is sort of the biggest voting information effort in US history.

Monika Bickert:

Our goal is to help people register to vote, we're targeting helping more than four million people register, and also to get people to volunteer to help with polling locations. And you'll see when you go to the site, you can check whether or not you're registered to vote. You can register to vote. You can also get information from local authorities about the details of how to vote in your area. So this morning I went on the center just to check it out and see what's new, and there were posts from the California attorney general. Those would of course be different if you were to look; you'd see posts that are relevant to you in your area.

John Carlin:

And so, to characterize it a little bit, is that a way then of avoiding stepping into the difficult controversies of trying to figure out whether statements are true or not true or propaganda when it comes to voting, and instead directing people to a site which is vetted to make sure that the information is accurate?

Monika Bickert:

The important thing is making sure that people have accurate information, and we're not going to be able to parse what is true and what is false, especially when you think about how local some of this is. One of the things we discussed was, what if people say, "Oh, at this particular polling place, the line is two hours long, or it's really unsafe." There's not going to be a great way for us to have accurate information that would really allow us to make those sorts of decisions. So what we can do instead is just direct people to the most up-to-date sources where they can get that information. And like I said, for any post about voting, we're going ahead and applying that link so people will be taken to the center.

John Carlin:

That makes sense. That's one of the things I worry about most: if I were a nation state or a terrorist group that wanted to disrupt US elections, I would put out that type of misinformation right before people go to vote.

Monika Bickert:

Yeah. It's something we're really focused on, too, and also right after the election. This may be an interesting election because of the potential delay in the announcement of results, given how many ballots may be absentee or mail-in. So we're focused on the lead-up to the election, we're also focused on what happens after, and we're working really closely with election officials to make sure that we're being mindful of all the risks out there. I will say, with this Voting Information Center, which I really do hope people will check out, we are drawing on the experience of launching our COVID Information Center. We built that center so people could get up-to-date information from health authorities. I wasn't sure if it would be wildly successful or not, but it has been. We've connected more than two billion people to accurate information, and more than 600 million of them have clicked through to get accurate information about COVID. So hopefully the Voting Information Center will have the same kind of effect.

John Carlin:

Monika, it's been great having you here today, and it's clear that you are sitting at the center of something new. I mean, we're wrestling with issues related to how to learn about a global pandemic and how to vote, in an age of overwhelming information and sometimes misinformation, through a company that's never existed on this scale before, with over two billion people. It's been wonderful having you here today, and I appreciate you talking us through how you approach these issues.

Monika Bickert:

Thanks so much. I really appreciate the opportunity to talk about it. I hope if there’s anything people take away from this it’s that even though we may not do everything perfectly, we’re really committed to getting it right and doing our part. So thank you.

John Carlin:

Cyber Space is presented by CAFE. Your host is John Carlin. The Executive Producer is Tamara Sepper. The Senior Producer is Adam Waller. The Senior Audio Producer is David Tatasciore, and the CAFE team is Matthew Billy, Nat Wiener, Sam Ozer-Staton, David Kurlander, Noa Azulai, Jake Kaplan, Calvin Lord, Jeff Isenman, Chris Boylan, Sean Walsh and Margot Maley. The music is by Breakmaster Cylinder. Today's episode was brought to you in collaboration with Brooklyn Law School's BLIP Clinic. Special thanks to Amanda Cadisch, Isabella Augusto, Alexa Panteliodis and Megan Smith for their help with research.