Politicians push for faster cull of online terrorist content
European leaders will suggest new targets for internet giants to remove extremist material at a UN meeting on Wednesday.
Katie Collins, Senior European Correspondent
The meeting comes only a week after the UK suffered its fifth terrorist attack of 2017, in which 30 people were injured in an explosion on the London Underground. Politicians accuse groups like Daesh, another name for the Islamic State, of using the internet to radicalize followers and teach them how to carry out attacks.
The British government, along with the EU and other political institutions, has repeatedly called on internet companies to do more in the fight against terrorism. For their part, the companies have consistently tried to step up by funding research, establishing anti-extremism programs and putting artificial intelligence to work at identifying content automatically.
Twitter announced on Tuesday that in the first half of this year it had removed 299,649 accounts for the promotion of terrorism, 75 percent of which were pulled down before their first tweet.
"By funding experts like ISD, we hope to support sustainable solutions to extremism both online and offline," Kent Walker, senior vice president for policy, legal, trust and safety at Google.org, said in a blog post. "We don't have all the answers, but we're committed to playing our part."
Walker will represent Google, as well as an industry body founded in June called the Global Internet Forum to Counter Terrorism, at the UN meeting on Wednesday. Along with representatives from Facebook and Microsoft, he will hear British Prime Minister Theresa May call for a new target of one to two hours for removing terrorist material from the web -- the window in which most such material is disseminated.
"Terrorist groups are aware that links to their propaganda are being removed more quickly, and are placing a greater emphasis on disseminating content at speed in order to stay ahead," the British prime minister will say.
Earlier this year the EU established a code of conduct, signed by the internet companies, under which they promised to pull down illegal content within 24 hours of its being posted. Last week the European Commission was reported to be considering legislation on the issue if it wasn't satisfied with the progress made by spring 2018.
"Industry needs to go further and faster in automating the detection and removal of terrorist content online, and developing technological solutions which prevent it being uploaded in the first place," May will say on Wednesday. "This is a global problem that transcends national interests."