Defying Digital Hate
For women, abusive messages from strangers are simply the cost of using social media platforms. But according to Imran Ahmed, Founder & CEO of the Center for Countering Digital Hate, there is both hope and recourse. This week on This is Critical, Imran joins Virginia for a rousing and empathetic conversation about how to navigate – and eventually, end – this online “tax on women.”
Read a transcribed excerpt below, and listen to the episode here.
Virginia: So, like a lot of people in these halcyon days of catfishing, trolling, and cyberstalking, I get hit with online harassment campaigns, and not infrequently.
Now I am not Amber Heard, I am not Nancy Pelosi or Hillary Clinton or Christine Blasey Ford. I'm not at the center of some nationwide debate, and most people have no idea, have never heard my name—but I'm another schmo that gets this stuff all the time.
I’m trying to be cool about it, but I can't tell you how stressful it is, how it wears you down to get hateful message after hateful message, not just about me, but about my family. How it feels to start calling the FBI, because people found my address and threatened to come to my actual house. Then there's always the possibility that I can get SWATed, which has happened to friends of mine. That's when someone calls the cops and gives out your address as a place where abuse or crimes are being committed, so that they send a SWAT team, which ties up the police and you, and you spend the rest of the day explaining why you didn't do anything.
I find this exhausting, but I also have found that it's something, sadly, that I have gotten used to in years as a journalist. Fortunately, my guest today does not think we have to get used to this.
Imran Ahmed is the founder and CEO of the Center for Countering Digital Hate in the US and the UK.
Imran, welcome to This is Critical.
Imran: Thank you. It's really good to be here with you.
Virginia: Tell us about the organization you founded: The Center for Countering Digital Hate.
Imran: I founded the organization almost six years ago. In fact, it was June 16th, 2016, the event that triggered it: the assassination of my colleague at the time, Jo Cox, who was a member of parliament in the UK, a mother of two, and she was killed by a far right terrorist who'd been radicalized online during the EU referendum. And you know, my awareness that the slogans the man was shouting as he killed her were taken straight from the internet, straight from misinformation and hatred flowing unabated in digital spaces—digital spaces on platforms that we were all familiar with. These were Facebook groups. These were things that we thought were, you know, knitting circles and people interested in, I don't know, Transformers or Marvel movies. And actually these were spaces where antisemites felt they could radicalize and inform each other. And CCDH since then has grown. One of the things that's been really powerful for me is that lots of people have said that our analysis of the problem and our advocacy on behalf of victims' groups—at the time it was looking specifically at white supremacist terrorism—but we started working with women's groups on misogyny, on the undermining of sexual and reproductive rights. We're working at the moment with LGBT groups on hatred and disinformation around gay people, around trans people. And what we’ve seen is there’s a solid core to the problem. And the solid core to the problem is that in digital spaces, there appear to be no rules, even though there are stated rules: when we all sign up to a platform, we all sign up to community standards. The problem is that they're not enforced. And rules without enforcement mean that there are no rules in reality anyway, so we've done a ton of work on finding ways to make platforms feel they have to enforce the rules that we all agree to when we sign up to them.
Virginia: So I have a personal interest in this. Over my many, many years on the internet, I’ve been pretty relentlessly targeted with digital hate. I haven't had it as bad as some people, but I’ve had it worse than most people. I've had a limitless number of rape and gas chamber threats.
So when Trump was elected, I got a lot of antisemitic hate. My son wears a yarmulke, we’re associated with Jewish institutions. So I set my location to Germany, even though I stayed in the US—and I recommend this to anyone. You can set your location to whatever you choose on Twitter—and I chose a town in Germany that I’d only heard of, have never visited. And I ended up under the rubric of protections you get in Germany. It's sort of this little-known way to avoid hate speech, because there are huge sanctions on Facebook and Twitter if they let right-wing hate stand. And I’ve also had Tucker Carlson’s followers come after me.
So I’m intimately familiar with all of this, sadly, but I don’t know as much about this from a research perspective, and I’d love to hear what your organization has learned.
Imran: Well, I mean, let me start by saying, I'm really sorry to hear that you've not just suffered abuse for a long time, but that you've got to the stage now where you've had to develop really complicated strategies and tactics for dealing with it. You are someone that's witnessed the normalization of hate being expressed in a way that it hasn't openly been in offline spaces for a long time. But in online spaces, people feel empowered, because of a lack of consequences for behaving in a racist or misogynist way, to get away with it. And if you face no consequences, then you believe that you are acting with impunity, and that's really, really dangerous. We know that it has an offline impact as well, because those people feel empowered to then take actions offline, believing themselves able to act with impunity, believing that the lack of action gives them succor. And that's bad.
Virginia: Thank you for just saying those simple things, because with it being so normalized, I have to find ways to stop myself from thinking that way. I don't wanna fall into the illusion that everybody is threatening me with violence, but we'll talk about that.
Imran: I mean, the point that you make about shifting your location to Germany is a reflection of your incredible awareness of the legislative frameworks around the world: the fact that Germany's [Network Enforcement Act], a law that prohibits hate speech online, gives you the right to a hate-free experience online, something that the community standards already say they will provide. But in reality, we all know that in practice that never happens. They don't enforce their standards.
It's a reflection of your capacity to sort of see the opportunity there. But it's a chilling reminder that you only feel safe because you are protected by the German laws. Under US law, a Jewish woman would not feel safe on social media.
I speak to people who talk about the dynamics of power, abuse, and safety in creating spaces in offline geographies. So things like street lights are really powerful for making people feel safe when they're walking home from the train at night. Online, enforcing the rules that are already there is how you essentially turn on the street lights. The problem is that Facebook don't want to turn on the street lights because it costs too much money. Which is the same reason local authorities give: they say, we're gonna have partial night lighting. Well, you know, Facebook's doing the same thing, but with digital spaces they've turned all the lights off, and at night, every awful person out there feels empowered, and normal people like you or me don't feel safe.
Listen to the full episode here.