Facebook's apps were flooded with violent content and misinformation last week as supporters of President Donald Trump stormed the US Capitol, according to a report Thursday from The Wall Street Journal that offers an inside look at how the social network grappled with content decisions amid the deadly riot on Jan. 6.
User reports of violent content reportedly jumped more than 10-fold on the morning of the Capitol riot, and reports of false news were nearly four times higher than recent daily peaks, according to internal documents seen by the Journal.
Facebook also reportedly saw views jump significantly for users in "zero trust" countries, a sign of potential foreign manipulation.
By midafternoon on Jan. 6, Facebook reportedly made the decision to classify the US as a "temporary high-risk location" for political violence, triggering emergency measures to limit content that could lead to real-world violence. The next move the company made that day was blocking Trump from posting for 24 hours on Facebook and Instagram. The following day, Facebook went much further, blocking Trump on both sites "indefinitely," or for at least two weeks.
Twitter, Snapchat, YouTube and other sites have also banned or suspended Trump from their platforms over concerns the president's remarks could incite more violence before or after Joe Biden's inauguration as the next US president on Jan. 20. The FBI and Capitol Police have reportedly warned that armed protests are being planned across the US and in Washington, DC, in the lead-up to the inauguration.
Facebook didn't immediately respond to a request for comment.