Facebook taking more steps to target misinformation ahead of U.S. election

WATCH: ‘Anarchists, rioters’ on plane — Trump echoes months-old Facebook conspiracy theory – Sep 1, 2020

With just two months left until the U.S. presidential election, Facebook says it is taking more steps to encourage voting, minimize misinformation and reduce the likelihood of post-election “civil unrest.”

The company said Thursday it will restrict new political ads in the week before the election and remove posts that convey misinformation about COVID-19 and voting. It also will attach links to official results to posts from candidates and campaigns declaring premature victories.

“This election is not going to be business as usual. We all have a responsibility to protect our democracy,” Facebook CEO Mark Zuckerberg said in a post on Thursday. “That means helping people register and vote, clearing up confusion about how this election will work, and taking steps to reduce the chances of violence and unrest.”

Facebook and other social media companies are under scrutiny over how they handle misinformation, given that President Donald Trump and other candidates have posted false information, and given Russia’s interference in the 2016 presidential election and its ongoing attempts to interfere in U.S. politics.

Facebook has long been criticized for not fact-checking political ads or limiting how they can be targeted at small groups of people.

With the nation divided, and election results potentially taking days or weeks to be finalized, there could be an “increased risk of civil unrest across the country,” Zuckerberg said.

WATCH: U.S. congressman challenges Zuckerberg over response to Facebook’s handling of misinformation

In July, Trump refused to publicly commit to accepting the results of the upcoming election, as he scoffed at polls that showed him lagging behind rival candidate Joe Biden. That has raised concern over the willingness of Trump and his supporters to abide by election results.

Under the new measures, Facebook says it will prohibit politicians and campaigns from running new election ads in the week before the election. However, they can still run existing ads and change how they are targeted.

Posts with obvious misinformation about voting policies and the coronavirus pandemic will also be removed. On Messenger, Facebook’s messaging app, users will be able to forward articles to a maximum of five people at a time. The company will also work with Reuters to provide official election results and make the information available both on its platform and through push notifications.

WATCH: Zuckerberg pushes back against accusations Facebook’s purchase of Instagram was illegal

After being caught off-guard by Russia’s efforts to interfere in the 2016 election, Facebook, Google, Twitter and other companies put safeguards in place to prevent it from happening again. Those include taking down posts, groups and accounts that engage in “co-ordinated inauthentic behaviour” and strengthening verification procedures for political ads. Last year, Twitter banned political ads altogether.

Zuckerberg said Facebook had removed more than 100 networks worldwide engaging in such interference over the last few years.

“Just this week, we took down a network of 13 accounts and two pages that were trying to mislead Americans and amplify division,” he said.

But experts and Facebook’s own employees say the measures are not enough to stop the spread of misinformation — including from politicians and in the form of edited videos.

Facebook had previously drawn criticism for its ads policy that cited freedom of expression as the reason for letting politicians like Trump post false information about voting.
