The legislation, developed in the wake of the Christchurch mosque shooting and passed in the Australian parliament on Thursday local time, targets social networks with hefty fines and even jail terms for company executives if they fail to take down "abhorrent violent material" shared on their platforms.
The laws are unique in that they target social networks and internet service providers, rather than the original users uploading violent material, and set out individual liability for material shared on an entire platform.
Imprisonment for executives
Under the laws, social networks, internet service providers and "content service providers" are required to ensure the "expeditious removal" of violent material, and to notify the Australian Federal Police "within a reasonable time" after becoming aware of its existence. Abhorrent violent material is defined as audio or visual material that includes footage or audio of torture, terrorist acts, murder, rape or kidnapping.
The penalties include up to three years' jail time for individuals and fines of up to 10 percent of a company's annual turnover.
The laws are designed to stop the kind of widespread sharing of violent videos and images that occurred during the Christchurch mosque shooting. During that terrorist attack, the suspected shooter live-streamed footage of the violence on Facebook, which was then widely shared across social networks.
Australian Attorney-General Christian Porter described the laws as "most likely a world first" in terms of legislating the conduct of social media networks.
"There was a near unanimous view among Australians that social media platforms had to take more responsibility for their content… and the law should prevent them from live streaming or playing acts of the most horrendous violence," he said.
The Attorney-General also called out specific platforms including 4Chan, Facebook and YouTube, saying specific individuals at social media companies could be targeted under the laws.
"If you look at an organisation like 4Chan, which is a hosting service, that was created, owned and operated by an individual," he said. "With respect to the larger platforms like Facebook, YouTube, there could be instances where an individual is so complicit with the reckless availability of violent material that they would be individually liable."
Law 'does nothing to address hate speech'
The laws have been slammed by companies including Facebook, Google and Twitter.
In a statement, the Digital Industry Group Inc (which includes the above companies as well as Amazon and Verizon Media) said the legislation was rushed and "does nothing to address hate speech."
"No one wants abhorrent content on their websites, and DIGI members work to take this down as quickly as possible," the statement read. "But with the vast volumes of content uploaded to the internet every second, this is a highly complex problem that requires discussion with the technology industry, legal experts, the media and civil society to get the solution right -- that didn't happen this week."
Digital rights groups also condemned the swift passage of the legislation, which was introduced to the Senate late on Wednesday and passed through the House of Representatives without debate in a matter of hours.
In a statement, Digital Rights Watch said the laws were "poorly designed" and "hastily drafted," and could encourage online companies to "constantly surveil internet users."
"Forcing companies to regulate content under threat of criminal liability is likely to lead to over-removal and censorship as the companies attempt their best to avoid jail-time for their executives or hefty fines on their turnover," the group said.
The laws do not define what constitutes a "reasonable time" for the take down of violent material. When pressed in a doorstop interview on Thursday, Attorney-General Porter pointed to the Christchurch shooting video but did not elaborate on the definition of timeliness.
"I can't precisely say what would have been the point of time at which it would be reasonable [to take the video down]," he said. "[But] it was totally unreasonable that it should exist on their site for well over an hour without them taking any action whatsoever."
For its part, Facebook previously said that the live-stream of the shooting had fewer than 200 viewers during the live broadcast and that the video got roughly 4,000 views before it was taken down. The company then removed 1.5 million videos of the attack within the first 24 hours, 1.2 million of which were blocked at the point of upload.
4Chan has been contacted for comment.
Originally published April 4, 1:22 p.m. AEDT