The Southern Poverty Law Center began in 1971 defending mostly African-American victims of the Ku Klux Klan, but it now tracks the activities of extremist groups of all kinds: neo-Nazis, white nationalists, anti-Muslim and anti-LGBT groups, neo-Confederates, and more. Its 2018 annual report counts 1,020 “active hate groups” in the United States, an all-time high. Fourteen of them are in Massachusetts.
What are we to do with these radical cells that spew hatred in marches and in manifestos online? After the massacre of Muslims in New Zealand mosques last week — telegraphed on the Web and live-streamed on Facebook — pressure on social media companies to shut down such accounts is intense. The Washington Post’s thoughtful media critic, Margaret Sullivan, wrote a blistering column Friday, saying the mosque assailants “depended on the passive incompetence” of social media platforms to carry out their plot.
But we are learning it isn’t easy to put the Internet genie back in the bottle. Over the weekend, Facebook said it had removed 1.5 million videos of the New Zealand attack, but that clearly wasn’t enough to stop trolls from uploading repackaged clips; archived versions remain available. President Recep Tayyip Erdogan of Turkey even showed part of it at a campaign rally to shore up support within his Muslim majority. Short of shutting down all search capabilities, which might be a reasonable response after particularly heinous violent acts, sanitizing the Web could be a futile pursuit.
The Internet is also littered with the second cousins of extremists: conspiracy theorists fixated on vaccines, fluoride, radio frequency chips, you name it. They may seem benign compared to violent hate groups, but their views are still unhinged, untrue, and a menace to public health and social comity. The rise of anti-vax postings, whose claims have been thoroughly debunked by rigorous science, has been linked to new measles outbreaks in the United States. For the first time, the World Health Organization has listed “vaccine hesitancy” among the top 10 global health threats, right up there with dengue fever and HIV. Should YouTube, Facebook, and the rest also suppress them?
This isn’t strictly a First Amendment issue, because Facebook and the other platforms are private companies, not government entities restricting speech. Still, the growing power of a few communications giants — a recent Pew study found 68 percent of the population now gets news from social media, even though more than half expect much of it to be false — raises legitimate concerns. We have to ask: Do we really want these tech behemoths deciding what is disfavored opinion, based on public outrage or, worse, risk to their profits?
Even if it were possible to eliminate all violent, hateful rhetoric from social media, there is still an entire universe of books, websites, and documentary-style films to flog these odious views. You don’t have to be a card-carrying member of the ACLU to see it would be dangerous, as well as impractical, to try to expunge them all.
A concerted effort to muzzle extremist views could also backfire, fueling the sense of grievance and persecution that drives the alienated young men who make up most hate groups. “This anger is heightened as tech companies deplatform extremists, which further frustrates those convinced they are being driven to extinction,” wrote an analyst in the Southern Poverty Law Center’s report.
Part of why the issue is so difficult is that social media is a relatively new beast, and we haven’t yet decided what kind of forum it is. Is expressing an anti-vaccine or anti-Muslim view on Twitter like yelling “fire” in a crowded theater, which would argue for restraint, or is it closer to just ranting on the town common? The promise of the Internet, after all, was the robust democratization of information and points of view. It clearly needs more policing, but once you wade into the marketplace of ideas, you have to expect someone will try to sell you rotten apples.
At a time when distrust of expertise is rampant and social media accelerate hoaxes of all kinds, the impulse to restrict fanatical speech is understandable. But, with a few exceptions, it’s generally better to combat loathsome speech with more speech. Advertisers could vote with their wallets; readers with their comments or (lack of) clicks. The worrisome alternative could be driving the hate groups further underground to the darkest corners of the dark Web, where even the Southern Poverty Law Center can’t find them.

Renée Loth’s column appears regularly in the Globe.