Substack says it will not ban Nazi or extremist speech

Under pressure from critics who say Substack profits from newsletters that promote hate speech and racism, the company’s founders said Thursday they would not ban Nazi symbols and extremist rhetoric from the platform.

“I just want to make it clear that we don’t like Nazis either – we wish no one held those views,” Hamish McKenzie, co-founder of Substack, said in a statement. “But some people do hold those and other extreme views. Given that, we don’t think that censorship (including through the demonetization of publications) solves the problem; on the contrary, it makes it worse.”

The statement came weeks after The Atlantic reported that at least 16 Substack newsletters had “overt Nazi symbols” in their logos or graphics, and that white supremacists had been allowed to publish on the platform and profit from it. Hundreds of newsletter writers signed a letter opposing Substack’s position and threatening to leave; about 100 others signed a letter supporting the company’s stance.

In the statement, Mr. McKenzie said that he and the company’s other founders, Chris Best and Jairaj Sethi, concluded that censoring or demonetizing publications would not solve the problem of hateful rhetoric.

“We believe that upholding individual rights and civil liberties by subjecting ideas to open debate is the best way to strip bad ideas of their power,” he said.

This stance sparked waves of outrage and criticism, including from popular Substack writers who said they were uncomfortable working with a platform that allows hateful rhetoric to fester or thrive.

The debate renewed questions that have long plagued tech companies and social media platforms about how content should be moderated, if at all.

Substack, which takes a 10% cut of revenue from writers who charge for newsletter subscriptions, has faced similar criticism in the past, particularly after allowing some writers to use transphobic and anti-vaccine language.

Nikki Usher, a communications professor at the University of San Diego, said many platforms are facing the so-called “Nazi problem”: if an online forum is available long enough, extremists will show up there at some point.

Substack is establishing itself as a neutral content provider, Professor Usher said, but that also sends a message: “We’re not going to try to control this issue because it’s complicated, so it’s easier not to take sides.”

More than 200 writers who publish newsletters on Substack signed a letter opposing the company’s hands-off approach.

“Why do you choose to promote and allow the monetization of sites that traffic in white nationalism?” the letter said.

The authors also asked whether part of the company’s vision of success included giving a platform to hateful people, such as Richard Spencer, a prominent white nationalist.

“Let us know,” the letter said. “From there each of us can decide if this is still the place we want to be.”

Some popular writers on the platform have already vowed to leave. Rudy Foster, who has more than 40,000 subscribers, wrote on Dec. 14 that readers often tell her they “can’t stand paying for Substack anymore” and that she feels the same way.

“So here’s to a 2024 where none of us will!” she wrote.

Other writers defended the company. A letter signed by about 100 Substack writers said it is best to let writers and readers moderate content, not social media companies.

Elle Griffin, who has more than 13,000 subscribers on Substack, wrote in the letter that while “there is a lot of hateful content on the Internet,” Substack has “found the best solution yet: giving writers and readers free speech without surfacing that speech to the masses.”

She argued that subscribers receive only the newsletters they sign up for, so they are unlikely to encounter hateful content unless they seek it out. That is not the case with X and Facebook, Griffin said.

She and others who signed the letter in support of the company emphasized that Substack is not actually one platform, but thousands of individualized platforms with unique, curated cultures.

Alexander Hellene, who writes science fiction and fantasy stories, signed Ms. Griffin’s letter. In a post on Substack, he said that a better approach to content moderation is to “take matters into your own hands.”

“Be an adult,” he wrote. “Block people.”

In his statement, Mr. McKenzie, co-founder of Substack, also defended his decision to host Richard Hanania, president of the Center for the Study of Partisanship and Ideology, on Substack’s podcast “The Active Voice.” The Atlantic reported that Mr. Hanania had previously described black people on social media as “animals” who should be subject to “more policing, incarceration and surveillance.”

“Hanania is an influential voice for some in U.S. politics,” McKenzie wrote, adding that “it is useful to know his arguments.” He said he had not been aware of Mr. Hanania’s writings at the time.

McKenzie also argued in his statement that censoring hateful ideas only spreads them.

But research in recent years suggests the opposite is true.

“Deplatforming appears to have a positive effect in decreasing the spread of far-right propaganda and Nazi content,” said Kurt Braddock, a communications professor at American University who has studied violent extremist groups.

When extremists are removed from one platform, they often move to another, but much of their audience does not follow them, and their income eventually declines, Professor Braddock said.

“I can appreciate someone’s dedication to free speech rights, but free speech rights are dictated by the government,” he said, noting that companies can choose the type of content they host or ban.

While Substack says it does not allow users to incite violence, that distinction can be murky, Professor Braddock said, because racists and extremists can skirt the line without openly crossing it. Their rhetoric can still inspire others to violence, he said.

Allowing Nazi rhetoric on a platform also normalizes it, he said.

“The more they use the kind of rhetoric that dehumanizes or demonizes a certain population,” Professor Braddock said, “the more acceptable it becomes for the general population to follow it.”