Such a united front wouldn’t just be symbolic. “If you had the platforms together making a statement of their values, then when they take action, it creates a permission structure for reticent platform executives to make difficult decisions quickly,” David Kaye, former United Nations special rapporteur on freedom of opinion and expression, told the editorial board. Such a move would also be a strong public signal of the gravity of the moment.
There’s precedent for this type of collaboration. In 2016, Facebook, Twitter, Google and Microsoft came together to combat extremist content. The companies created a shared database that uses unique digital fingerprints to flag videos, pictures and memes promoting terrorist activity and ideologies. Domestic political disinformation poses different challenges from terrorist threats, but both are urgent matters of national security.
A public, transparent effort from the platforms would offer additional accountability for those spreading disinformation in the weeks and months after the election. “Be very clear and publish a database the public can access,” Mr. Kaye urged the platforms. “Say, ‘These are the accounts we took action against. Here’s why.’ It doesn’t need to be a legal opinion, just a list. If there are privacy concerns, they can redact names.”
There are very few tools now for parsing how messages spread across social media. Three days after the 2016 election, Facebook purchased the best one, a tool called CrowdTangle, which tracks online engagement with social media posts. It remains the most effective available method to understand what is popular on the platform, though Facebook argues that tracking engagement is not a reliable indicator of how many people saw a post. At this pivotal moment for American democracy, Facebook owes it to the American public to provide metrics to evaluate that claim.
Facebook, Twitter and Google will most likely argue that they’re doing plenty of this work behind the scenes. In a recent interview, Nick Clegg, Facebook’s head of global affairs, said the company was war-gaming election night scenarios. “There are some break-glass options available to us if there really is an extremely chaotic and, worse still, violent set of circumstances,” he said. But Mr. Clegg stopped short of offering specifics.
Such vagueness is worrisome, especially since Mr. Clegg admitted that “any high-stakes decisions will fall to a team of top executives” like Mr. Clegg and Facebook’s chief operating officer, Sheryl Sandberg, with the chief executive, Mark Zuckerberg, “holding the right to overrule positions.”
These platforms have consolidated power to control the flow of information to billions of people. The power to judge which content is harmful to democracy on election night rests with a handful of tech executives. That Mr. Zuckerberg, the ultimate arbiter at Facebook, is accountable to no one, including his company’s board, is even more alarming.