Facebook says it may curb content on US Election Day
Facebook plans to restrict content to its users on November 3 under certain circumstances, according to a media report.
Facebook said it may aggressively restrict content if the US presidential election sparks intense or violent unrest.
Nick Clegg, Facebook’s head of global affairs, said the company already had plans in place to handle a range of outcomes, including widespread civic unrest or the political complications of in-person votes being counted more rapidly than mail-in ballots, which will play a major role in this election because of the Covid-19 pandemic.
Facebook, like other social networks, has tried to head off concerns about misinformation and potential calls to violence around the presidential election. In early September, the company said it would stop accepting political ads a week before Election Day and would promote its own Voter Information Center with reliable information on how to vote.
Clegg said that Facebook already has some break-glass options available if chaos or violence erupts.
Clegg didn’t discuss what those options were. However, he mentioned the company’s past use of measures, deployed in countries with civic instability, to significantly restrict the circulation of content. A source revealed that the company had worked through some 70 possible election outcomes and ways to respond to them, relying entirely on its own staff.
Facebook did not delve into the details of its plans for election-related content control, as malicious actors might use that information to proactively work out how to “play the system.”
Clegg referred to periods of unrest in parts of Sri Lanka and Myanmar when the company took unprecedented action to curtail the spread of misinformation on its platform.
During those periods of unrest in Sri Lanka and Myanmar, Facebook took actions such as reducing the reach of content shared by rule-breakers and restraining the distribution of borderline content that was sensationalist but did not breach its hate speech rules.
Clegg added, “We have acted aggressively in parts of the world where there is real civic instability and we are equipped to do that again.”
Facebook said that it will also place an informational label on posts that cast doubt on the election’s outcome or prematurely declare victory. This issue could crop up because much of the voting will take place by mail due to the ongoing pandemic, and because President Trump has made baseless claims that mail-in votes are fraudulent.
The proposed actions, which would probably go further than anything the platform has done before, come as the social media group is under immense pressure to lay out how it plans to combat election-related misinformation, voter suppression, and incitement of violence on Election Day, November 3, and in the post-election period.
It also comes amid concerns that President Trump himself could take to social media to contest the result or call for violent protests, triggering a constitutional crisis.
Beyond fighting a rising tide of misinformation from foreign and domestic operatives, experts warn, Facebook must prevent the platform from being used to foment violence around the vote. There are 42 days left until the US presidential election on November 3.
However, Facebook has often failed to restrict content that promotes misinformation or violence. A recent media report revealed that QAnon conspiracy theorists thrived on the platform despite attempts at a crackdown. Its focus on Myanmar came after military officials used Facebook to provoke genocidal violence against the Rohingya minority group; company CEO Mark Zuckerberg later called this an operational mistake.
Figures reveal that the company has helped 2.5 million users register to vote in the upcoming US presidential election through Facebook, Instagram, and Messenger. Facebook reported 2.7 billion monthly active users worldwide at the end of June this year.