The man who stormed the Tree of Life synagogue building on Oct. 27, 2018, murdering 11 congregants in the midst of Shabbat prayer, was an active user of the social media site Gab. His Gab bio said that “jews are the children of satan,” and his banner image was an unambiguous reference to a white supremacist meme. His final post, just prior to the massacre, read: “Screw your optics, I’m going in.”
In the weeks and months following the Pittsburgh synagogue shooting, many pundits urged social media sites to crack down on racist, violent and anti-Semitic accounts. The Anti-Defamation League implored social media companies to clarify their terms of service to address hateful content — or at least make it harder to find online — and to not allow hateful content to be monetized for profit.
“There are 24/7 rallies online,” Jonathan Greenblatt, CEO and national director of the ADL told the Chronicle one year following the Pittsburgh synagogue shooting. “With just a few clicks you can literally find what was previously unspeakable. Social media has become a breeding ground for bigotry. Some of these businesses, like Facebook, have taken some steps … YouTube has adopted some important measures, but they need to do much, much more.”
Following the Jan. 6 Capitol riots, which left five people dead, the social media giants took serious steps against thousands of accounts they deemed to be potentially dangerous. Twitter suspended more than 70,000 accounts linked to the QAnon conspiracy theory, whose followers believe Donald Trump is secretly saving the world from a cabal of Satanic pedophiles and cannibals, and who traffic in anti-Semitic tropes. Adherents to QAnon were among the pro-Trump mob that stormed the Capitol.
Trump was permanently suspended from Twitter “due to the risk of further incitement of violence,” Twitter announced after the riot. Other social media platforms, including Facebook and Instagram, also suspended Trump’s accounts, as well as the accounts of some pro-Trump attorneys and other associates.
In the wake of the suspensions from mainstream social media, many conservative and far-right voices moved to Parler — a platform that bills itself as a pro-free speech alternative, and where scores of extremists lurk as well.
Then Apple and Google removed Parler’s app from their stores, and Amazon cut off its web hosting.
According to a 2019 study by New York University researchers, cities with a higher incidence of a certain kind of racist tweets reported more hate crimes related to race, ethnicity and national origin. “This trend across different types of cities (for example, urban, rural, large, and small) confirms the need to more specifically study how different types of discriminatory speech online may contribute to consequences in the physical world,” said Rumi Chunara, an assistant professor of computer science and engineering at NYU who led the research.
Shutting down social media accounts and sites, though, does not halt the spread of violent rhetoric, anti-Semitism or other hateful ideologies, according to law enforcement experts. In fact, the removal of these accounts from popular platforms — or the platforms themselves — is problematic.
“When you limit these types of accounts, what happens is the folks who are using these various platforms to communicate will simply jump to another platform,” said Shawn Brokos, director of community security for the Jewish Federation of Greater Pittsburgh. “We see that all the time in law enforcement.”
While the movement from one platform to another “doesn’t hamper investigative abilities,” she said, “it certainly makes it more challenging.” She analogized tracking extremists moving from one online platform to another to a game of “whack-a-mole.”
“But that’s what law enforcement has to do,” she said. “They have to pick up from one area and go to the other area. But they do it and they do it well.”
From a security perspective, there is no benefit to removing extremists from social media, according to Brokos.
“Absolutely not,” she said. “There is no amount of limiting or shutting social media down that will stop the anti-Semitic ideology that is out there. And some would argue that limiting it feeds exactly into that anti-government, anti-authority and anti-Semitic ideology.”
Kathleen Blee, a professor of sociology at the University of Pittsburgh who has conducted in-depth studies on white supremacism, and who is a member of Dor Hadash Congregation — one of the three congregations attacked during the shooting at the Tree of Life building — agreed there are “downsides” to moving people off sites that are relatively easy to monitor and where people understand that they are being monitored, which can have some moderating effect.
“It’s moving people into these end-to-end encrypted — and really the cesspool of the internet — sites that are just vehicles for the most horrific white supremacist, violent, anti-Semitic, anti-immigrant views,” Blee said. “So, that’s a problem, obviously. And it’s hard to monitor what individuals are doing on them — it’s not hard to monitor them in the aggregate, but it is hard to pin anything to an individual user.”
As of last week, use of apps favored by extremists had skyrocketed, Blee said, with Signal up 677% and Telegram up 146%.
“That’s a problem,” Blee said. “These places are slippery. And Telegram and Signal are very much open to hosting these kinds of very violent, white supremacist conversations. So there are some downsides.”
On the other hand, Blee said, when more open sites, like Parler, close down, there is usually some attrition.
“For one thing, some people will not want to gravitate from the level of what was being expressed on Facebook or even Parler, to the next step toward violence,” she said. “And you are also going to lose some people because, as you get into some of these, they become more and more difficult to access and require more technological knowledge.”
Some extremists silenced on mainstream social media will gravitate to the darknet, a part of the internet hosted within an encrypted network and accessible only through specialized anonymity-providing tools, said Brokos.
“A lot of the darknet is used for criminal purposes: the sale of drugs, the sale of illegal goods, the sale of weapons,” Brokos said. “And if you look at any one of those ‘marketplaces,’ you will find all sorts of anti-government, anti-authority type ideology. In my opinion, from a security perspective, it goes far beyond Twitter and Facebook and these clearnet platforms.”
Anti-government groups linked to anti-Semitism emboldened
Another downside to moving users off mainstream platforms is that it “empowers these anti-government extremists and furthers their cause that the Zionist government is out to get them,” Brokos said. “For these racially motivated violent extremists, there is this inherent belief that there is a Zionist government that is trying to control everybody and that the Jews are behind a lot of that. That ideology does exist.”
In fact, that ideology may have motivated the Pittsburgh synagogue shooter.
“What seems to have happened with him very much fits the pattern we see in other kinds of racially motivated violence,” said Blee, who stressed she has no information about the shooter beyond what has been reported by the media. That pattern consists of three stages: identification of a threat; identification of a target responsible for that threat; and a sense of urgency.
“First, there’s a sense of some enormous existential threat out there,” Blee explained. “If you think of the 1980s and ’90s, when the white supremacists became significant in this country, the existential threat was banking and farm foreclosures — it was the beginning of the militia movement and really the resurgence of anti-Semitism in a very public way. That was the existential threat: Jews held a stranglehold over the economy and were ruining the lives of white farmers was kind of the message there.”
These days, the existential threat is more commonly posed as white genocide or “the Great Replacement Theory — that whites will become the minority and lose power,” Blee said.
After an existential threat is perceived, the next precursor to racially motivated violence is identifying the person or group responsible for that threat, she continued.
“In the Pittsburgh shooting, the threat was white genocide and the target was George Soros — so there you have an amplification by politicians of the same message that’s being spread on Gab and by other white supremacists online.”
To white supremacists, “George Soros” signifies “Jews,” Blee said, “and they all understand that. George Soros is to white supremacists what Rothschild was a couple decades ago. Probably most of these people couldn’t tell you who George Soros is — just an image that stands in for Jews writ large, Jewish control.”
After identifying the threat and the target, the third stage is a “sense of urgency,” Blee said.
“That’s the final trigger. You can’t just wait around and mobilize yourself for the threat, you have to act now. That’s the message, and so that’s why that message of invasion, that people are about to come over the border, that is a dangerous thing.”
“It’s pernicious in any form,” Blee said. “When it’s happening on the internet all over the place, when it’s amplified in public, when there is an echoing of what’s happening on places like Gab and what’s showing on TV, that’s particularly dangerous.”
While removing extremist voices from mainstream social media will do little to shore up security, shutting down Trump’s use of social media “as a megaphone” in the days after the Jan. 6 riots and before the inauguration was important, Blee said.
“I think in the short run, that outweighs everything,” she said. “He was clearly providing an accelerant to these conversations and actions.”
The Anti-Defamation League also endorsed Trump’s ban from social media after the Capitol riots and, in fact, called for Facebook, Twitter and other social media giants to permanently remove Trump from their platforms the afternoon of Jan. 6.
Those seeking to “spread fraudulent or completely debunked claims that undermine our democracy, and encourage mobs of people to overrun an election and to storm our nation’s Capitol, have no right to do so on social platforms,” James Pasch, regional director of the Anti-Defamation League’s Cleveland office, which serves Pittsburgh, told the Chronicle.
“I would argue that social platforms not only have no obligation to amplify those voices, they actually have a moral and ethical obligation — and in some cases a legal obligation — to stop such incitement to violence,” Pasch said. “A full ban on Trump’s social media accounts has an immediate effect of preventing him from spreading misinformation to Americans. And in turn, it prevents him from inciting further damage to our democracy and the process that needs to be in place for a peaceful transition for our country.”
While the ADL feels strongly that everyone should have the right to express themselves, Pasch added, “incitement of violence is not a protected right anyway. And freedom of speech does not mean freedom of incitement to violence. So we view it as a step in the right direction.”
Lies v. incitement
But there is a distinction between posting lies on social media and posts that incite violence, said Bruce Ledewitz, a professor of Constitutional law at Duquesne University School of Law. And it is dangerous, he said, when powerful private companies become the arbiters of speech.
“Progressives think it’s great when the NFL threatens Texas and North Carolina over a transgender bill that doesn’t protect transgender rights, but one day, the NFL will threaten California and New York if they don’t lower their corporate income tax,” said Ledewitz. “It’s too much power in the hands of private companies and I don’t trust private businesses. I don’t want them deciding speech issues. I don’t want companies to withdraw investment because they don’t like somebody’s politics.
“In the end, that’s got to hurt marginal people,” Ledewitz continued. “Power is power. There is just too much power in the hands of private companies like Twitter and Facebook. And that’s more of an anti-trust issue, but the fact that they have so much influence on how Americans communicate is not a good thing.”
The other problem Ledewitz has with social media regulating speech is the broader issue of suppression.
“In Europe, it’s a crime to deny the Holocaust,” Ledewitz said. “In America, it is protected speech. So where does the Holocaust get denied? Europe. In other words, suppressing ideas generally doesn’t work. The law tries to distinguish between suppressing ideas, which is unconstitutional under the First Amendment, and punishing actual incitement to violence. I think it’s a good distinction. A lot of what the [Pittsburgh synagogue] shooter was reacting to was actual incitements to violence, and that is criminal and has always been criminal, and remains criminal under the First Amendment. But what we’re talking about now is the propagation of lies. And I don’t think the way you deal with that is by suppression. I think you expose it.” PJC
Toby Tabachnick can be reached at email@example.com.