Facebook accused of using ‘sentiment analysis’ to target kids feeling ‘stressed’ or ‘anxious’

The social media giant has come under fire for a study that reportedly monitored teens' emotions. Iain Masterton

Facebook has been accused of tracking users’ emotions and sharing that information with advertisers, according to an internal document leaked to The Australian newspaper on Monday.

The social media giant allegedly used “sentiment analysis” to enable curated advertising aimed at kids as young as 14 when they’re feeling “stressed,” “defeated,” “anxious,” “overwhelmed,” “nervous,” “stupid,” “silly,” “useless” and “a failure.”

READ MORE: Selfies are putting young women at risk for depression, anxiety

According to the article, by monitoring posts, comments and interactions, the social media giant is able to determine users’ emotional state. Some say that information could then be shared with advertisers, who could tailor their ads to prey on users’ vulnerabilities.

This particular study focused on teens and young adults in Australia and New Zealand. The information culled included relationship status, location, number of friends and the number of times a user logged on (although this is information the company regularly gathers and sells to advertisers).

In this instance, The Australian said that Facebook was able to gather information on people discussing “looking good and body confidence” and “working out and losing weight.” The article also stated that the social media giant could determine how emotions were communicated throughout the week.

“Anticipatory emotions are more likely to be expressed early in the week, while reflective emotions increase on the weekend,” the document said. “Monday to Thursday is about building confidence; the weekend is for broadcasting achievements.”

WATCH BELOW: Study finds many teens taking breaks from social media

Facebook has responded to the allegations with a statement: “The analysis done by an Australian researcher was intended to help marketers understand how people express themselves on Facebook. It was never used to target ads and was based on data that was anonymous and aggregated.”

According to a Facebook Canada representative who spoke to Global News, the study in question, which was conducted a year ago, didn’t follow the social media company’s protocol for gathering user insight.

The company states that its top motivation for research is to maximize the benefits of the user experience and mitigate its downsides. It also says it does not offer advertisers the ability to target users based on their emotional state.

While the practice of reviewing online user information reeks of Big Brother, it’s something that has been going on for some time, says Lisa Montenegro, president and founder of Digital Marketing Experts in Toronto.

“The reality is, when you talk about Facebook and Google, they have information on us already,” she says. “And ad targeting can be positive or negative.”

She says it’s not surprising that Facebook might veer in this direction, because the social media company has been losing the coveted teen demographic to newer platforms like Snapchat. However, she says, whether the practice is harmful comes down to how responsibly it’s used.

“If Facebook knows that teens are experiencing low self-esteem and flood their feeds with ads for cosmetics or diet products, that’s dangerous. But if they put up ads for the Kids Help Phone or counselling services, that would be positive,” she says.

READ MORE: Facebook hiring 3,000 workers to catch and remove streaming violence

In fact, this is no different from the ads that run during TV commercial breaks (after all, that’s how soap operas got their name); it’s just a more sophisticated step forward, Montenegro says.

“Facebook has to approve ads before they’re shown and they typically won’t approve one that’s considered dangerous. They have a responsibility and they know it,” she says.

This isn’t the first time Facebook has come under fire for audience research projects. In 2012, the company conducted a week-long experiment to see whether it could alter users’ emotions by placing positive and negative content on their homepages. It concluded that the content had the ability to alter users’ feelings through “emotional contagion.”

Facebook apologized and subsequently changed its research guidelines.
