Silicon Valley’s platforms are relieved to see Election Day slip into the past and believe they did a much better job than in 2016 at deflecting foreign meddling and disinformation, even as critics point out fresh failures and President Trump’s refusal to concede lays new challenges in their path.

Driving the news: With online polarization deepening after a close election, the CEOs of Facebook and Twitter will face hostile Senate questioning Tuesday from both sides of the aisle.

The big picture: Tech companies took unprecedented steps to curb the spread of election misinformation and are breathing a sigh of relief that they averted their biggest nightmare — a repeat of 2016’s foreign-disinformation debacle.

But a different test came in the days after the polls closed, as the companies faced constant judgment calls over posts from political leaders, most prominently Trump himself, claiming victory without basis and charging fraud without evidence.

Before and after Election Day, the platforms found themselves reacting, adapting and sometimes improvising new rules on the fly, despite their lengthy preparations.

Facebook
  • Shortly after Election Day, while votes were still being tallied, Facebook announced new steps, including the temporary demotion of posts containing election-related misinformation and limits on the distribution of some election-related live streams.
  • The company faced criticism for being slow to respond to groups and events that proliferated on the platform claiming the election was being “stolen” by Democrats through vote-by-mail fraud.
  • Along with Google, Facebook confirmed last week that it would extend its political advertising ban to cover a longer post-election period during which one side was challenging the results. That means these bans will remain in place as millions of dollars start flowing into two Georgia Senate runoffs in early January that will determine which party controls the Senate.
Twitter
  • Twitter, which was the first online platform to ban political ads, was widely seen as taking the toughest measures against misinformation.
  • The company said in an election post-mortem post that approximately 74% of the people who viewed problematic tweets saw them after it had applied a label or warning message.
  • It also estimated a 29% decrease in shares via “quote tweets” of these labeled tweets, due in part to a warning prompt it placed on them.
YouTube
  • Google-owned YouTube allows “discussion of election results and the process of counting votes.” Under that policy, the online video giant left up videos from news channels like OANN that have pushed false stories on election fraud and postal ballots.
  • YouTube says its panels linking to Google’s election results, featured on the YouTube homepage, on videos and in searches, were shown “billions of times.”
  • “The most popular videos about the election are from authoritative news organizations,” said YouTube spokesperson Ivy Choi.
  • According to Choi, when users search for election-related content, 88% of the videos shown in their top-10 results in the U.S. come from “authoritative sources.”
Snapchat

Unlike other platforms, Snapchat did not have to rush out a number of new policies in the weeks before the election, because its policies already covered most of the cases that its rivals rushed to handle.

  • It also didn’t need to make any adjustments to its political ad policies, because the company fact-checks all of its ads.
  • Over 30 million Snapchat users engaged with its voter programs, which helped people register to vote or make a plan to vote, and more than 1.2 million registered to vote through them.
  • Snapchat, by design, doesn’t have the same type of publicly-facing viral news environment that Facebook and Twitter have, shielding it from some of its peers’ woes.
TikTok

The short video-sharing platform has tried to remain apolitical, but it ended up struggling to contain election-related misinformation anyway.

  • Shortly after Election Day, TikTok took action against several election-related hashtags, like #RiggedElection, but left others up, TechCrunch reported. The platform also took down a number of popular videos touting unsupported election-fraud allegations after a report by Media Matters.
  • TikTok didn’t have election-related statistics to share, but spokesperson Jamie Favazza said the company would provide a recap of its election integrity efforts “in the future” and is still removing misleading information and videos, as well as accounts that incite violence.
Labeling

The platforms’ chief remedy for unsupported claims of election fraud was to attach labels to posts pointing users to authoritative information.

But, but, but: Misinformation experts say that labeling policies did not deter people from sharing knowingly false information.

  • Platforms applied labels to false information, including from the president, but in many cases did not apply more “friction” to prevent sharing, meaning labeled content still spread wildly, said Alex Stamos, former Facebook CISO and head of the Stanford Internet Observatory.
  • “This should not be happening,” Stamos said. “It’s the weakness in the protections the companies have in place. The label isn’t doing anything.”
  • A lack of aggressive down-ranking and limitations on re-shares will continue to be a significant issue, he said.

The bottom line: Under normal circumstances, election-related debate is an asset of democracy, and the last kind of content that a tech platform would block. The combination of a deeply polarized nation and a chief executive prone to tweeting falsehoods pushed these companies’ systems beyond anything they were built for.

  • Ten days after the election, misleading or deceptive new narratives continue to bubble up online, according to experts from the Election Integrity Partnership: misleading charts and statistics about voting, anecdotes about dead voters, and false storylines about voting-machine “glitches” affecting election outcomes.

What’s next: Twitter CEO Jack Dorsey and Facebook CEO Mark Zuckerberg will testify before the Senate Judiciary Committee on Tuesday.