Russian bots and trolls amplified both sides after Florida school shooting

Two girls crying after the viewing service for Gina Montalto, one of the victims of the shooting, in the Kraeer Funeral Home in Coral Springs. EPA/CRISTOBAL HERRERA

Russian-linked influence networks on Twitter boosted hashtags on both sides of the U.S. gun control debate in the aftermath of the Parkland, Fla., school shooting on Feb. 14, and have been doing so ever since, analysis shows.

Earlier this week, they also amplified a conspiracy theory that teenage survivor David Hogg was a “crisis actor” and part of an elaborate hoax.

On the day of the attack, they started promoting #floridashooting and #guncontrolnow as hashtags, and linking to a story that originally tied accused gunman Nikolas Cruz to a neo-Nazi organization in Florida.

By Feb. 16, they were also promoting #falseflag, a tag used to argue the shooting was faked, at about the same frequency as #gunreformnow, a gun-control tag. On Feb. 18, as a gun-rights backlash against calls for gun control gained momentum, they started promoting #2adefenders, a gun-rights tag.

By Friday, nearly two weeks after the shooting, they were promoting “Broward County” (where the shooting took place) and “Broward.” They also started promoting “Wayne LaPierre” (head of the National Rifle Association, who on Thursday launched an attack on gun control advocates, saying that they “… hate the NRA, they hate the second amendment, they hate individual freedom”) and “LaPierre.”

They were also still promoting stories like this one, which questioned the authenticity of victims’ accounts of the shooting.

The data was gathered by Hamilton68, a project of the German Marshall Fund think-tank, which tracks about 600 troll and bot accounts it says are “linked to Russian influence operations.” The accounts mostly tweet about Ukraine and Syria, but sometimes amplify the extreme right in the U.S.

It provides a real-time look at what messages Russian-linked accounts are promoting on any given day, and what debates they are trying to influence.
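
Conceptually, a dashboard like this is straightforward: collect tweets from the monitored accounts, then rank the hashtags they use over a recent window. Here is a minimal sketch of that counting step in Python; the data model and function names are illustrative assumptions, not Hamilton68’s actual implementation.

```python
# Hypothetical sketch of Hamilton68-style hashtag tracking. Assumes tweets
# from a fixed set of monitored accounts have already been collected; the
# Tweet model and trending_hashtags() are illustrative, not the project's
# real code.
from collections import Counter
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Tweet:
    account: str           # handle of the monitored account
    posted_at: datetime    # when the tweet was published
    hashtags: list         # lower-cased hashtags found in the tweet

def trending_hashtags(tweets, window_hours=48, top_n=10):
    """Rank hashtags used by monitored accounts within the recent window."""
    cutoff = datetime.now() - timedelta(hours=window_hours)
    counts = Counter()
    for tweet in tweets:
        if tweet.posted_at >= cutoff:
            counts.update(tweet.hashtags)
    return counts.most_common(top_n)

# Toy example: accounts pushing tags from both sides of the debate at once.
sample = [
    Tweet("acct_a", datetime.now(), ["guncontrolnow", "floridashooting"]),
    Tweet("acct_b", datetime.now(), ["falseflag", "2adefenders"]),
    Tweet("acct_a", datetime.now(), ["falseflag"]),
]
print(trending_hashtags(sample))
```

A real tracker would also need a vetted account list and a baseline of normal hashtag usage so it could flag unusual spikes, which is the harder part of the problem.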

The pattern is several years old. Sometimes the aim seems to be to amplify a particular side of a debate; sometimes, to stoke fear and divisiveness more generally.

Hamilton68 also documented activity on Russian accounts around the mass shooting in Sutherland Springs, Texas, last November, and RT, formerly Russia Today, ran a fabricated story claiming the gunman belonged to Antifa.

In May 2016, Russian propagandists used social media to orchestrate both a white-supremacist rally in Texas and the protest against it, at a total cost of about US$200, CNN reported.

And in 2015, racially charged protests at the University of Missouri descended into panic as fake reports spread that the Klan was attacking demonstrators and had severely beaten a child.

The report, which was given credibility after the student body president retweeted it, was traced to a now-defunct Russian troll account, @FanFan1911. It was amplified by a combination of bots and real people who were taken in.

“The plot was smoothly executed and evaded the algorithms Twitter designed to catch bot tweeting,” U.S. Air Force Lt.-Col. Jarred Prier wrote in an analysis. “The narrative was set as the trend was hijacked, and the hoax was underway.”

“National news networks broke their coverage to get a local feed from camera crews roaming Columbia (Missouri) and the campus looking for signs of violence. As journalists continued to search for signs of Klan members, anchors read tweets describing shootings, stabbings and cross burnings.”

Shortly afterward, the account started tweeting in German, spreading rumours about Syrian refugees in Europe, Prier wrote. And early in 2016, it switched back to English to tweet about the U.S. election.

Twitter eventually suspended it.

WATCH: A Florida school shooting survivor is one of many criticizing certain websites and fringe social media channels for calling him “fake” after a video claimed he was a “crisis actor,” but he said the theories are helping draw attention to calls for better safety at schools and gun control.
  • This week, YouTube removed videos published by Infowars’ Alex Jones accusing Parkland, Fla., school shooting survivor David Hogg of being a crisis actor. CNN reports that Jones is “two strikes” away from being banned from YouTube (which sounds to us like an open invitation to use up the second strike). But BuzzFeed reporter Charlie Warzel points out on Twitter that “Infowars has been doing this stuff on YouTube for years. To say it has just ‘one strike’ bc of a video that got a lot of attention is disingenuous … right now they’re playing some moderation logic contortion game that makes it seem like they’re doing something but really they’re not.”
  • A fake scare about contaminated turkey bought at Walmart timed for American Thanksgiving in 2015 has been traced to Russian online troll networks, the Wall Street Journal reports. Why would they bother? “Experts say it is as if the Russians were testing to see how much they could get Americans to believe.”
  • Wannabe Internet detectives named a Michigan man as the killer of a Virginia woman in Charlottesville, Va., in 2017. (He once owned the Dodge Challenger used in the attack, but sold it years before.) Now, he’s suing a number of people, including Gateway Pundit’s Jim Hoft.
  • In National Review, writer Kevin Williamson laments how far faked narratives of one sort or another are accepted in the mainstream right. “We should be ashamed of ourselves if we come to accept this kind of dishonesty in the service of political expediency,” he argues. “If conservative ideas cannot prevail in the marketplace of ideas without lies, they do not deserve to prevail at all.”
  • In a Medium post, Jonathan Albright points out that Google’s autocomplete search suggestions are easily gamed, with often-toxic results. (Example: “the kkk is” suggests autocompletes of “a christian organization,” “here to stay,” “christian,” and “awesome.”) “Why does this matter? It matters because this is information pollution at the most critical interface: search. Google is the knowledge portal for most of the world,” he writes.
  • In a tweetstorm last week, Facebook vice-president Rob Goldman seemed to be trying to calm the waters over the platform’s role as a medium for Russian trolls during the 2016 U.S. election. But he was promptly retweeted by U.S. President Donald Trump, and had to qualify some of his statements. “Mr. Goldman’s tweetstorm was unintentionally revealing,” Kevin Roose writes in the New York Times. “It showed that, years after hostile foreign actors first began using Facebook to wage an information war against the American public, some high-ranking officials within the company still don’t understand just how central Facebook was to Russia’s misinformation campaign, and how consequential the company’s mistakes have been.”
  • “The social media companies, including Facebook as well as Twitter, YouTube and Reddit, really do bear a part of the responsibility for the growing polarization and bitter partisanship in American life that the Russians, and not only the Russians, sought to exploit,” Anne Applebaum writes in the Washington Post. “The Facebook algorithm, by its very nature, is pushing Americans, and everybody else, into ever more partisan echo chambers — and people who read highly partisan material are much more likely to believe false stories.”