Opinion | Social media outlets need to deplatform white supremacists

By Delilah Bourque, Senior Staff Columnist

Milo Yiannopoulos is $2 million in debt. The former Breitbart News editor lost a book deal with Simon & Schuster in 2017 after a podcast came to light in which Yiannopoulos defended sexual relationships between adults and 13-year-old children. When his financial backer Matthew Mellon died in April of last year, the investment dried up, and Yiannopoulos was forced to lay off the entire staff of his company, Milo Entertainment Inc., because no one else was willing to back him.

Now, Yiannopoulos has resorted to making Christian rock music and selling his possessions via Instagram. He’s even banned from entering Australia after making comments attacking Islam following the Christchurch shooting.

All this came after Yiannopoulos was permanently banned from Twitter in July 2016. His views are nothing short of extreme, and one thing above all has forced him to face the consequences of spreading them: taking away his platform.

Yiannopoulos’ recent misfortunes show that deplatforming works to stop the spread of hateful views. Social media companies like Google, Facebook and Twitter have already contributed to a reduction in ISIS attacks by taking away extremists’ voices on their platforms, and they need to step up the same actions against white supremacists and bigoted provocateurs.

Twitter took measures in early 2016 to remove content that threatens or promotes terrorism, particularly content from ISIS. Before then, only child pornography was automatically flagged by Twitter’s systems for human review. Twitter suspended about 125,000 accounts over eight months in 2015 for violating its terms of use, which it updated to explicitly prohibit content that supports or encourages terrorism.

A study by George Washington University showed that while Twitter was removing ISIS content from its platform in 2016, white supremacist accounts had seen a 600-percent increase in followers since 2012. The study concluded that Twitter’s deliberate efforts to remove Islamic State content made it harder for ISIS to reach new audiences. Meanwhile, white supremacists, who were not specifically targeted by content-review algorithms, saw their audiences grow. Twitter still does not filter content supporting white supremacy on a large scale.

Richard Spencer, the white nationalist who organized the 2017 neo-Nazi rally in Charlottesville, Virginia, that ended with the death of a counterprotester, lost his Twitter account in 2016. Though he subsequently regained access to Twitter, Spencer lost his verification, the blue checkmark on a user’s profile that indicates an account authentically belongs to a public figure.

In a 2017 tweet thread, Twitter Support explained its decision to revoke some verifications and reevaluate the verification process.

“Verification has long been perceived as an endorsement,” one of the tweets read. “We gave verified accounts visual prominence on the service, which deepened this perception. We should have addressed this earlier, but did not prioritize the work as we should have. This perception became worse when we opened up verification for public submissions and verified people who we in no way endorse.”

Though he remains on Twitter without verification, Spencer has since canceled a controversial college tour, which he blamed on counterprotests from the leftist coalition Antifa.

“When they become violent clashes and pitched battles, they aren’t fun,” Spencer said of organized white nationalist rallies like the one in Charlottesville in a March 2018 YouTube video. “I really hate to say this, and I definitely hesitate to say this. Antifa is winning to the extent that they’re willing to go further than anyone else, in the sense that they will do things in terms of just violence, intimidating and general nastiness.”

Though Spencer blames groups like Antifa for the violence, the 2017 Charlottesville rally ended with a member of the alt-right being convicted of the murder of counterprotester Heather Heyer. Unite the Right 2, a follow-up rally held in Washington, D.C., last year, drew only about 25 people.

Though no single factor can definitively explain the decrease in rally attendance, reducing the reach of organizers like Spencer correlates with a reduction in public support for white supremacy.

The reduction of content supporting the Islamic State on social media has coincided with a reduction in terrorist attacks worldwide. A report from the Department of Homeland Security found that ISIS carried out 23 percent fewer attacks and caused 53 percent fewer deaths in 2017 than in 2016.

The online presence of white supremacist movements has likewise inspired real-life violence. The man charged with shooting and killing 50 Muslim worshippers in Christchurch, New Zealand, on March 15 wrote a 74-page manifesto against Muslims and immigrants that included references to online memes, like those propagated by Spencer and Yiannopoulos. The shooter allegedly posted the manifesto to 8chan, an online message board with no content moderation, and livestreamed the shooting on Facebook.

Social media is often used as a recruitment tool to draw vulnerable people into extremist movements. Taking bigoted voices off platforms like Twitter and Facebook has been shown to slow the momentum of major figures in hateful movements.

Nazi sympathizers and white nationalists like Spencer and Yiannopoulos each faded further into the background once they were cut off from larger audiences. It’s time for Twitter, Facebook, Google and others to take the same measures against all white supremacists and stop their views from spreading further.