
Politicians call for crackdown on social media 'toxic swamp' after mosque shooting

Australia's top politicians say tech companies are not doing enough to stop extremist content, with one saying Facebook has "gone missing" on hate speech.

Claire Reilly
A man sits outside the Masjid Al Noor mosque in Christchurch, New Zealand, following the mass shooting.

Marty Melville/Getty Images

Facebook has been accused of "going missing" when it comes to fighting hate speech and playing an "unrestricted role" in terrorist attacks, following two mass shootings at mosques in New Zealand on Friday.

The comments come from Australian Prime Minister Scott Morrison and the country's opposition leader, Bill Shorten, who haven't held back in criticizing the role technology companies have played in amplifying extremist views.

Both politicians warn that the internet has given a home to the kind of white supremacist hate speech espoused by the alleged mosque shooter, an Australian national, saying tech companies must do more to stamp it out.

In a letter to Japanese Prime Minister Shinzo Abe ahead of the upcoming G20 summit in Osaka, Morrison said internet technologies are playing an "unrestricted role" in the spread of extremism, and that world leaders must lay out "clear consequences" not only for those who carry out terrorist attacks, but also "for those who facilitate them." 

The letter was also sent to New Zealand Prime Minister Jacinda Ardern, who has said she plans on discussing the issue "directly with Facebook."


But while Morrison did not mention the likes of Facebook, Twitter and YouTube by name, his political opponent, Labor leader Shorten, was more forthright.

"A platform like Facebook goes to potential advertisers and says, 'We know everything about the users of Facebook, we can tell you everything so that you can geo-target and you can market to them,'" Shorten wrote in an op-ed for the Herald-Sun newspaper. "Well, if that's your business model, fair enough; but you can't go missing when it comes to hate speech."

"Social media is a marvelous tool that has the potential to empower us, but too often it resembles a toxic swamp where wrongdoers can hide and where evil is nurtured," he added.

The comments follow a terrorist attack in New Zealand on Friday, when a gunman entered a mosque in central Christchurch and shot worshipers while they prayed, livestreaming the shooting on Facebook. The attack, which also involved a second shooting at another Christchurch mosque, claimed 50 lives. The alleged attacker, Brenton Harrison Tarrant, was an Australian national.

As New Zealand counts the cost of the deadliest mass shooting in the country's history, attention has turned to the role the internet and social media played in the attack.

While Facebook and Twitter deleted the alleged attacker's social media accounts within hours of the attack, footage of the shooting spread quickly. The roughly 17-minute live clip was downloaded from Facebook and reuploaded across the internet on sites such as YouTube, with some users editing out the more graphic content in an attempt to circumvent censors.

In a statement, a spokesperson for Twitter said the company was "committed to working and cooperating with governments around the world, particularly as it relates to safety and wellbeing" and that it has "rigorous and rapid response processes in place" for emergency situations.

Facebook has previously said it deleted 1.5 million versions of the video within the first 24 hours of the attack. 

But Morrison is calling for a tougher approach to weeding out extremist content on the internet, saying technology firms have a "moral obligation to protect the communities which they serve and from which they profit."

He added that social media companies, content service providers and gaming platforms all had a part to play to keep communities safe.

"We know that violent extremists use the internet for recruitment, radicalisation and to carry out their evil acts," the prime minister's letter reads. "That they will continue to try to use any means their disposal does not mean governments and technology firms should abrogate their responsibilities to keep our communities safe."

Facebook did not respond to a request for comment.

Originally published March 18 at 5:32 p.m. PT.
Update on March 18 at 9:27 p.m.: Adds comments from Australian opposition leader Bill Shorten.
Update on March 19 at 4:02 p.m.: Adds comment from Twitter.