FaceApp pulls ‘ethnicity’ filter after online backlash

On Wednesday, FaceApp released a new filter that allowed users to change their ethnicity. Hours later, the company removed the filter after massive criticism. Alex Nichols. Photo: @Lowenaffchen/Twitter

Hours after releasing a new filter that allows users to change their race, FaceApp pulled the feature after massive criticism.

The smartphone app uses artificial intelligence to modify selfies in a number of ways, such as making subjects look younger or older — or even like a member of the opposite sex.

On Wednesday, the company released a new filter that allowed users to modify a picture to fit one of four categories: Caucasian, Asian, Indian or Black.

READ MORE: FaceApp apologizes after backlash over ‘racist’ skin lightening filter

This involved lightening and darkening a user’s skin as well as altering facial features and hair texture.

The company initially released a statement arguing that the “ethnicity change filters” were “designed to be equal in all aspects,” according to The Guardian.


WATCH: Is this Snapchat filter racist?


“They don’t have any positive or negative connotations associated with them,” the company’s CEO Yaroslav Goncharov said. “They are even represented by the same icon. In addition to that, the list of those filters is shuffled for every photo, so each user sees them in a different order.”

Some people quickly took to Twitter to criticize the filter, calling it racist.

Goncharov responded, saying “the new controversial filters will be removed in the next few hours.” Less than 24 hours after launch, FaceApp removed the filter, and it is no longer available to users.


READ MORE: Yik Yak App raising concerns at high schools

In April, the company also came under fire when its “hot” filter automatically lightened people’s skin. Goncharov later apologized for the feature, which he said was a side effect of the app’s “neural network.”
