
Russian spokesperson’s TV comment referenced 2007 Photoshop fake as real

swarmandal.com, FreakingNews.com

“Speaking of lobbying,” Russian Foreign Ministry spokeswoman Maria Zakharova began.


(They had been speaking about lobbying, more or less; Zakharova was appearing earlier this month on a Moscow-based TV network linked to the regime, and was talking about charges against two people linked to Donald Trump’s 2016 campaign for, among other things, failing to register as foreign agents.)

“Speaking of lobbying, recall the fantastic and eye-popping photographs when they received bin Laden in the White House. This is classical lobbying in the very literal sense of the word.”

Well, no, actually, we don’t, for the good reason that it never happened.

Zakharova was apparently referencing a decade-old Photoshop fake appearing to show Osama bin Laden shaking hands with Hillary Clinton (the “at the White House” part comes later).

The Photoshopped image had come eighth in a Hillary Clinton Photoshopping competition back in 2007 on freakingnews.com, which can only be described as a site that goes in for Photoshopping competitions. At the time, Clinton was running for the Democratic presidential nomination against an ambitious young senator named Barack Obama.


The anonymous person who put it together used two images: a photo of the Indian-American musician Shubhashish Mukherjee meeting Clinton in 2004, pretty obviously in a hotel.

The other element was an undated image of Osama bin Laden.

The final result is below. Compare details of the turban and beard:

The doctored image had languished in obscurity for a decade or so until Russian Twitter accounts picked it up last year. (Here’s a screenshot the BBC saved.)

h/t the BBC, the Washington Post and Snopes.


(Snopes points out that Condoleezza Rice was the victim of a similar hoax involving a different photo.)

In fake news news:

  • The @TEN_GOP Twitter account (which purported to be a Republican account from Tennessee) was a “heavyweight voice on the American far right,” the Digital Forensics Research Lab says – when Twitter pulled the plug back in July, it had over 130,000 followers. It wasn’t run from Tennessee, but from the Internet Research Agency, thousands of miles away in St. Petersburg, Russia. @TEN_GOP was much more successful at passing itself off as actually American than some other similar efforts, and was retweeted by several people in Trump’s inner circle, which seemed to polish its credibility. DFRLab looks at its strategy in detail.
  • The pro-Brexit Twitter account @DavidJo52951945, aka “David Jones,” accumulated 102,000 followers before it was set to private following a story in the Times asserting that “hundreds of thousands of Twitter users are reading pro-Kremlin propaganda by a suspected Russian troll posing as a Ukip supporter.” The Times story is paywalled, but you can read the Independent’s version, which points out that despite an enormous social media presence, it’s impossible to track down “David Jones” in person. Tweets followed the hours of the working day in Russia, and “… the intensity of messaging has coincided with key points of Russian government interest.”
  • Why is milk good for you? Google’s featured snippets will explain. Why is milk bad for you? Google’s featured snippets will explain that, too. Is Barack Obama Muslim? Absolutely. Google’s featured snippets have the details. “Because Google’s algorithm seeks answers that closely match users’ questions, its responses often reflect how a question is framed,” the Wall Street Journal explains. “That can lead to different answers to similar questions, and contribute to confirming biases.” In other words, the algorithm is returning answers that are set up to be satisfying, rather than correct.
  • ProPublica reporter Julia Angwin explains what it was like to be targeted by an ‘email bomb,’ easily launched for minimal cost to the attacker and disruptive for her. “In recent years … a depressing realization has taken hold: The internet is fragile and easily exploited by hackers, trolls, criminals, creepy corporations and oppressive governments,” she writes.
  • In Foreign Policy: a look at the massive, sudden and revolutionary way that Facebook has changed politics in many Asian countries. The story suggests, not for the first time, that hate propaganda spread on Facebook played an important role in fueling the Rohingya genocide in Burma. “‘Move fast and break things’ was Facebook’s mantra for developers until 2014, signaling the twin totems of speed and aggression that animate many programmers and venture capitalists in the U.S. tech industry. Yet it’s a lot less appealing when the things being broken are people.”
  • Russia seems to have tried to influence the outcome of the 2016 U.S. election through social media, but could it actually have worked? Precisely targeted in just the right places, it’s reasonably possible, University of Pennsylvania professor Kathleen Hall Jamieson argues. “In 2016, the extent and virality of Russian Web content was great enough to plausibly affect the outcome of an election decided in three states by about 78,000 votes.”
  • Facebook, Twitter and Google often come under scrutiny, but we shouldn’t ignore Instagram, Jonathan Albright argues in a Medium post: “… Instagram — a service larger than Twitter and Snapchat combined — should be seen as a major influence, targeting and engagement hub for the spread of political propaganda.”
  • In the Guardian, Claire Wardle and Hossein Derakhshan tease out one way fake news spreads: the need for engagement and social connection. “The act of sharing is often about signalling to others that we agree with the sentiment of the message, or that even if we don’t agree, we recognise it as important and worth paying attention to. We want to feel connected to others, and these mini-performances allow us to do that.” They also question ‘fake news’ as a term, because 1) it’s been weaponized as a response to any unwelcome message (not a great argument, since that could happen to any other term) and 2) it’s too imprecise to describe what we’re actually trying to understand (which is true).
  • The potentially enormous political forces that social media has set in motion have only been widely discussed for the last year and a half or so. The Times (the other one) talks to Renee DiResta, a tech entrepreneur who has been studying them for a lot longer than that. “I think we are at this real moment, where as a society we are asking how much responsibility these companies have toward ensuring that their platforms aren’t being gamed, and that we, as their users, aren’t being pushed toward disinformation,” she reflects.