A quick look at both fake stories shows the value of reverse image search services like Google Image Search and Tineye. The photo in the February story shows men burning something that might or might not be an American flag in Cairo in 2012, while the May version shows a protest involving a mock burial of former U.S. president Barack Obama in Bangladesh in 2012.
It’s worth taking a moment to compare the real photo (above) to the cropped version on the New York Evening, a fake news site:
Whoever repurposed the image saw fit to crop ‘COFIN OF OBAMA’ out of the photo, a detail that, left in, might at least have confused their readers’ sympathies.
Fake news purveyors seem to have a greater appetite for inflammatory acts committed by Muslims than the real world can actually supply.
A London street party to celebrate a Pakistani cricket win in 2009, for example, was cynically repurposed and misrepresented as Muslims celebrating April’s terrorist attack in Paris.
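Checks like these are easy to start programmatically. As a minimal sketch (the function name is mine, and the endpoints are the classic public search URLs, which the services may change over time), here is how a suspect photo's URL can be turned into reverse image search links:

```python
from urllib.parse import quote


def reverse_search_urls(image_url: str) -> dict:
    """Build reverse image search URLs for a publicly hosted image.

    These are the long-standing public endpoints for TinEye and
    Google's search-by-image; both accept the image URL as a
    percent-encoded query parameter.
    """
    encoded = quote(image_url, safe="")
    return {
        "tineye": f"https://tineye.com/search?url={encoded}",
        "google": f"https://www.google.com/searchbyimage?image_url={encoded}",
    }


# Example: check where else a suspicious news photo has appeared.
urls = reverse_search_urls("https://example.com/photo.jpg")
print(urls["tineye"])
```

Opening either link in a browser shows earlier appearances of the image, which is usually enough to reveal that a "breaking news" photo is years old and from somewhere else entirely.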
In fake news news:
- The British election ended in a way that can’t have pleased anybody, except possibly the Ulster Unionists. In the last week or so of the campaign, we saw several detailed takes on how the parties used campaign advertising on social media:
- Marketing Week has a perceptive breakdown of Conservative Facebook advertising and the way it was microtargeted by constituency and cultural profile of the voter (though you have to read past “My money is on the Conservatives to win this week’s general election.”) This kind of digital marketing, Mark Ritson points out, operates in ways that are opaque to opponents, or the media.
- At the BBC, Mark Ellison tries to trace how digital advertising worked in the campaigns in Scotland, while cautioning that “… ads are extremely difficult to monitor and regulate because they appear on an individual’s news feed rather than on public pages. Ultimately, only Facebook knows who is being shown what.” Ellison talks to developer Louis Knight-Webb, who was trying to trace the parties’ digital footprints by crowdsourcing reports from Facebook users. It’s cumbersome, but there’s no other approach.
- And Buzzfeed U.K. has a roundup of disinformation, misinformation and general deceit during the campaign, which included mocked-up front pages of non-existent newspapers (apparently this is a thing in British elections.)
- This week, Qatar’s state news agency broadcast and published online reports that the country’s leader had made comments supportive of Iran. It poured gasoline on already difficult relations between the Gulf state and its neighbours, but the reports were an ambitious hoax that involved hacking the agency’s own platforms: its social media feeds and the news ticker on its TV broadcast. (“The TV station affected had terrible security in place,” Motherboard reports.)
- As NATO garrisons become more established in the Baltic states, there is uneasiness about disinformation from next-door Russia. The Center for European Policy Analysis looks at propaganda linked to an 18-day Estonian military exercise in May, while stopfake.org analyzes propaganda aimed at ethnic Russians in Estonia.
- Rebel Media’s Ezra Levant will have to pay an $80,000 defamation award to a Saskatchewan man after the Supreme Court refused to hear his appeal this week. Canadian Lawyer has the background of the case here.
- Connect Safely, a resource site for teens, educators and parents about digital life, offers a sensible guide to educating young people about fake news. Teens are asked to step back and analyze how their emotions are being manipulated by a given narrative; that is good advice at any age.
- A lawyer for a Kansas man charged in an alleged plot to massacre Somalis who came there to work in a meatpacking plant is offering a defence based on fake news. “Robinson tried to put the government’s case in the context of his client’s concerns about the state of the country: fears of an invasion across the Mexican border by troops from China or Cuba, a belief United Nations tanks would roll into western Kansas, and the lies allegedly told by a government informant that local Somalis were financing violence in the Middle East,” the AP reports.
- In the New York Times,
- Buzzfeed’s Charlie Warzel takes a different view, dismissing what he calls “The Great Bot Crisis.” Twitter does have a problem with fake news and propaganda, he says, but it’s mostly spread by live humans: “… real people, not bots, who understand the power they wield to gin up outrage, frame conversations across the Internet and mainstream media, and attempt to win the battle of the narrative. And the network supporting these voices is boundlessly enthusiastic and relentless in its output.”
- The Nieman Lab has a weekly roundup of fake news news. This week: British and French voters seem less likely to share fake news than American ones, an Oxford University study shows.
- The emergence of fake news was followed by the emergence of fact-checking organizations. Maybe inevitably, fake fact-checkers have started to appear. Poynter looks at FactCheckArmenia.com, a site affiliated with the Turkish government that is still re-litigating the Armenian genocide, which began in 1915. Poynter traces “… trails that connect the Fact Check Armenia project directly to the Ankara government.”
- Techcrunch looks at the self-reinforcing loop that Facebook’s algorithm creates for users. “Facebook has become a feedback loop which can and does, despite its best intentions, become a vicious spiral,” Jon Evans writes. “At Facebook’s scale, behavioral targeting doesn’t just reflect our behavior, it actually influences it.”
- The Lawfare Blog reflects on the Citizen Lab report on Russian hacking that came out in May, in which researchers looked at a case where real documents were stolen from a reporter and altered before publication, a practice they called ‘leak tainting.’ “For civil society to connect the people with their government, (NGOs) must have and maintain the people’s trust, and that’s why the Russians hacking and subsequent tampering of the stolen data is so threatening,” Susan Landau writes. “Do the same to Sierra Club or the League of Women Voters — steal email and unpublished reports and publish modified, falsified versions — and the trust that citizens give to those organizations dissipates.”
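The self-reinforcing loop Jon Evans describes in the TechCrunch piece can be illustrated with a toy model. This is emphatically not Facebook's actual ranking system; the topic names, boost factor, and engagement probabilities below are all illustrative assumptions. It only shows the mechanism: when engagement boosts a topic's ranking weight, and higher weight means more exposure, a slight initial preference snowballs into a feed dominated by one topic.

```python
import random


def run_feed(rounds: int = 100, seed: int = 7) -> dict:
    """Toy model of an engagement-driven feed.

    Two topics start with equal ranking weight. Each round, one topic
    is shown with probability proportional to its weight; if the
    simulated user engages, that topic's weight is boosted. The
    slight initial preference for one topic gets amplified.
    """
    rng = random.Random(seed)
    weights = {"politics": 1.0, "sports": 1.0}
    # Illustrative assumption: the user engages a bit more with politics.
    engagement_prob = {"politics": 0.6, "sports": 0.4}

    for _ in range(rounds):
        topics = list(weights)
        total = sum(weights.values())
        # Exposure is proportional to current ranking weight.
        shown = rng.choices(topics, weights=[weights[t] / total for t in topics])[0]
        if rng.random() < engagement_prob[shown]:
            weights[shown] *= 1.5  # engagement feeds back into ranking

    total = sum(weights.values())
    return {t: w / total for t, w in weights.items()}


shares = run_feed()
print(shares)  # the feed drifts heavily toward one topic
```

The design point is the multiplicative boost: because exposure depends on weight and weight grows with exposure, the model exhibits the "vicious spiral" Evans describes — behavioral targeting that influences behavior rather than merely reflecting it.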