Feminists have been warning the world about ‘incels.’ Ignoring online misogyny has deadly consequences.

Hate speech is not simply words, it’s violence

Perspective by Aditi Natasha Kini
May 1, 2018 at 12:31 p.m. EDT

A few days after last Monday’s devastating Toronto van attack, a Facebook post by alleged attacker Alek Minassian came to light, indicating he may have been an “incel,” or an involuntary celibate. The r/incels community on Reddit seems to have started in July 2016. United by a grand pessimism that their genes and unattractiveness will prevent them from ever having sex, “incels” bond over complaints about women and revenge fantasies, while also ostensibly supporting one another. The community quickly gained around 40,000 followers over the next 1½ years, until Reddit banned it on Nov. 7, 2017. But it moved elsewhere, it started another subreddit, it continued to grow.

If Minassian was indeed radicalized in this community to hate “normies” — mainstream people who the incels presume have sex — then how can we prevent the rampage in Toronto from inspiring other real-life attacks? On “incel” forums, the Toronto rampage has inspired both fear that exposure will shut this community down — and excitement that the “incel rebellion” has begun. If Minassian was inspired by 2014 Santa Barbara, Calif., shooter Elliot Rodger, as Minassian’s Facebook post would indicate, who’s to say who will in turn be inspired by Minassian himself?

Feminists have been warning the world about incels and men’s rights activists for a while. The Southern Poverty Law Center recently added the extremism of male supremacy to the ideologies it tracks, noting that it is fundamental to the “racist ‘alt-right’ and in many ways served as its ‘gateway drug.’” When I wrote in November about how such groups indoctrinate men into misogyny, I was doxed (that is, my personal information was shared on the Internet) and harassed, perhaps ironically proving my point.

The Red Pill and other “pickup artist” forums treat women as objects in a predatory game. “Men’s rights activists” who decry feminism do not actually face any oppression from women — no matter that one former Google employee cast “diversity” efforts as special treatment, or that his followers saw his subsequent dismissal over a memo he wrote as a violation of his free speech rights. MRA groups function on an idea of essential biological differences that dehumanizes women: The Google engineer thought women were biologically unsuited for his chosen work, while incels think women (often called “femoids”) are biologically wired to have sex only with mainstream men they refer to as “Chads.” They think that they themselves are biologically destined for celibacy. Men’s rights activism is a reaction to feminism — not a push for equality but a tactical move to undermine women’s rights.

Because other gender minorities, including trans and nonbinary people, would confound this issue, incels leave queer people out entirely, saying that they have “easier access” to sex. For their claim of victimhood to work, sex has to be a binary — otherwise, it would be exceedingly clear that these men aren’t advocating for their own rights but actively working against other people having equal rights. (I like to call this “protagonist anxiety” — a generalized, violent fear that you may not be the only main character in the narrative.)

On the surface, MRA groups seem innocuous, almost progressive: support groups for men! Men (and everyone else) should be able to find community to talk about their inability to find partners — silencing that would only contribute to toxic masculinity. But the foundation these groups are built on is flawed. Men, as a group, are not oppressed. Incels act as if women choosing not to have sex with them is systemic persecution, but of course it isn’t. Their groups find common ground in a shared sense of entitlement to women’s bodies. By dehumanizing women, incels create an echo chamber where their casual misogyny can be amplified. (Ironically, the term was coined by a woman who was trying to understand loneliness.)

Censoring incel and other “manosphere” groups is tricky territory: People have the right to free speech and free association. Free speech on the Internet is a much-championed cause. But we need to examine the power dynamics of said freedoms, and the effects of speech.

We need to broaden our definition of hate speech, examine the radicalizing effects of unfettered free speech, and consider the role of the Internet in fostering violent ideation.

Look at what happens when the far right finds itself confronted by speech it doesn’t like: A Nazi group burns swastikas outside a small Georgia town that dared to take a stand against its rally. Conservatives launch an online attack to rob a professor of her livelihood and reputation for saying she didn’t care about Barbara Bush’s death. Women are banned from Facebook for saying men are trash. A young man is radicalized into hatred that knows no aim, a hatred for a mainstream culture that seemingly rejected him and denied him sexual access.

If Minassian’s links to incel groups do turn out to be a factor in the attack, will we finally address the growth of misogynist groups online? The Charlottesville violence last summer led several Internet businesses to disavow white supremacists, prompting people to question whether tech companies are abusing their power to suppress freedom of speech. The First Amendment does not apply to private companies — but these companies are so powerful that their decisions do have political consequences. PayPal began to prohibit donations that “promote hate, violence and intolerance.” GoDaddy and Google refused to host the Daily Stormer, a neo-Nazi website.

Some in power read any crackdown on hateful speech as just liberal politicking. On Thursday, the House Judiciary Committee held a hearing on “Filtering Practices of Social Media Platforms” as part of an investigation into whether social media “hates” conservatives. Republican politicians have been accusing Facebook and other platforms of censoring right-wing users. But as The Verge points out, “censorship” is a fuzzy term, because in many cases, users are not banned outright for far-right leanings. The exception, according to The Verge, is “genocidally racist far-right political movements” including Atomwaffen, a supremacist group that has been linked to several murders. In cases of violence that seem digitally inspired — like conspiracy theorist Lane Davis, who killed his father, or Elliot Rodger and his misogynist manifesto — the task of addressing online indoctrination may seem insurmountable, fraught with questions of freedoms and empty threats, so mental health becomes a convenient scapegoat instead. This reduces violence to an individual problem, and it ignores the culture that made these behaviors acceptable.

So how do you address the complexities of the “bad influence” of the Internet?

Radical accountability is where we start. Giphy was recently booted by Snapchat and Instagram after an explicitly racist GIF outraged users — and Giphy responded by cleaning out its database, signifying a shift in how we hold social media platforms accountable. Even as Facebook testifies before Congress and Twitter cracks down on bots, users need to ask the companies making money off our data to publicly answer for offensive content that’s been flagged to them. Not for legal but for moral reasons. That kind of accountability would start with public human moderation — expert and humane. Moderators could publicly address questionable content by sharing their reasoning with users, much as many subreddit and Facebook group moderators do, since small subcommunities inherently call for more accountability. The burden of reporting content should not fall only on users — Facebook recently sent out a survey implying that it was users’ job to identify pedophilia — but “call-out culture” did lead to the racist GIF being taken seriously. Our participation in online culture is thus increasingly vital.

Virulent fantasies online are not simply ideation, and definitely not “just” an opinion. For us to “fix” the problem of misogynists indoctrinating, organizing and inspiring future rapists, murderers and terrorists online, we have to come to a consensus to prioritize the rights of women and gender minorities. We’ve done that in other cases: Social media platforms have addressed terrorism before, as when Twitter suspended hundreds of thousands of Islamic State accounts.

Misogyny is terror. Dehumanization enables it. The space between freedom of speech and hate speech — the thin legal sliver and broad moral ground — is one where our future safety lies. It is the battleground for what our society and our morality will look like. Hate speech doesn’t exist in a vacuum. It’s not simply words, it’s violence.

Aditi Natasha Kini is an essayist and multimedia artist based in Brooklyn.

This essay originally appeared in The Washington Post.