Twitter, Facebook lock down Trump after social media-fueled riot in D.C.

Supporters of President Trump climb the west wall of the U.S. Capitol on Wednesday in Washington.
(Jose Luis Magana / Associated Press)

The violent mob that stormed the U.S. Capitol on Wednesday seeking to prolong the presidency of Donald Trump took shape on social media.

Facebook, Twitter and their social media peers spent Trump’s term in office lurching from one crisis to another, scrambling to revise their policies on misinformation, hate speech and incitement to violence in response to ever-escalating challenges from the White House and prominent figures and organizations that support the president.

Wednesday presented another test: a rally, planned largely on their own platforms and promoted by the president, to protest the supposed theft of the presidential election and disrupt the final certification of the electoral college vote.

Trump supporters gather in the U.S. capital to protest the certification of President-elect Joe Biden’s electoral college victory over President Trump.

But when that rally turned violent — as experts in online extremism warned it would — and spilled into the Capitol building itself, the social media giants appeared as unready as ever. Although they closed out the day by taking their strongest enforcement actions ever, including temporary locks on Trump’s Twitter and Facebook accounts, critics say the companies’ pattern of tentative half-measures helped precipitate a crisis for U.S. democracy.

“Blame for the violence today will appropriately fall on Trump and his enablers on Capitol Hill and in right-wing media,” said Roger McNamee, a tech investor and early advisor to Facebook Chief Executive Mark Zuckerberg. “But internet platforms — Facebook, Instagram, Google, YouTube, Twitter, and others — have played a central role.”

McNamee’s argument that internet platforms amplify hate speech, disinformation and conspiracy theories, while only selectively enforcing their terms of service, resonated with others, who called for the platforms to ban the president.

Alex Stamos, a Stanford professor and former Facebook chief security officer, said policies the platforms have relied on in the past no longer suffice. “There have been good arguments for private companies to not silence elected officials, but all those arguments are predicated on the protection of constitutional governance,” Stamos wrote on Twitter. “Twitter and Facebook have to cut him off.”

A few hours later, Twitter and Facebook decision makers came to the same conclusion. Trump, after encouraging the crowd to protest at the Capitol, returned to the White House and posted a video reiterating his false claims of election fraud while urging his supporters to remain peaceful.

At first, Twitter added a label to Trump’s video, noting in small letters that Trump’s claims were “disputed” and that retweets and likes on the post would be restricted “due to a risk of violence.” But later in the afternoon, the company deleted the post containing the video entirely, along with two other posts from Wednesday, marking the first time the platform had fully deleted anything posted by the president.

Just after 4 p.m., the company took an additional step, announcing that the president’s personal Twitter account would be locked for 12 hours and that the lock would be lifted only if he deleted the three posts the company had already removed from public view. If he refused to delete them, his account would remain locked indefinitely, the company said.

Other social media platforms took different tacks. YouTube took down the video. “As the situation at the United States Capitol Building unfolds, our teams are working to quickly remove livestreams and other content that violates our policies, including those against incitement to violence or regarding footage of graphic violence,” YouTube spokesman Farshad Shadloo said in a statement.

Shortly after that, Facebook also took down the video. “We removed it because on balance we believe it contributes to rather than diminishes the risk of ongoing violence,” tweeted Guy Rosen, vice president of integrity at Facebook. Later in the evening, the company announced that it was suspending the president’s page for 24 hours.

In a blog post, Rosen said the company’s response would include searching for and removing content that supported the storming of the Capitol, called for armed protests, called for protests violating the 8 p.m. curfew in Washington, D.C., or attempted to “restage violence tomorrow or in the coming days.”

Rosen outlined a slate of recent and planned measures aimed at combating the spread of hateful and violence-inciting groups and content.

But the Wednesday event had been organized on social media platforms for months. One Facebook group, Red State Secession, was run by an organization that explicitly called for a revolution on Jan. 6. Facebook finally shut the group down Wednesday afternoon, after BuzzFeed reporter Ryan Mac brought it to light on Twitter.

“It was only a matter of time before extremism cultivated online made the leap into the real world,” McNamee said.

Online organizing has been used in the past to plan right-wing violence in Michigan and Wisconsin, and McNamee noted that social media were used to plan violent counter-protests during last summer’s wave of protests against racist policing in cities such as Minneapolis; Louisville, Ky.; and Portland, Ore.

Angelo Carusone, president of the nonprofit watchdog organization Media Matters for America, said Facebook has allowed groups circulating harmful lies to grow unchecked for years. “It’s created an enormous amount of damage,” he said.

The tech platform’s single biggest failure was allowing the spread of QAnon, the conspiracy theory that burst into the mainstream in early 2020, Carusone said. At one point during Wednesday’s occupation of the Capitol, the Arizona QAnon figure Jake Angeli was photographed standing behind the vice president’s desk at the head of the chamber, clad in his typical costume of face paint and animal furs.

“Everyone saw it, and they didn’t do anything to prevent their own algorithm and systems from growing these communities,” he said. “We are going to feel the effects of that for quite some time.”
