The free-to-download app allows users to artificially age themselves in photos. To do so, the app uploads your photo to a remote server that uses artificial intelligence to predict your future appearance.
However, the app comes with a few potentially nasty surprises, including a user agreement that grants its Russia-based developer, Wireless Lab, a perpetual licence to use everything you upload to the app.
“You grant FaceApp a perpetual, irrevocable, non-exclusive, royalty-free, worldwide, fully paid, transferable sub-licensable licence to use, reproduce, modify, adapt, publish, translate, create derivative works from, distribute, publicly perform and display your user content and any name, username or likeness provided in connection with your user content in all media formats and channels now known or later developed, without compensation to you,” the app’s terms of service say.
The developer also says it may continue to store your uploaded content even after you’ve deleted it.
FaceApp founder Yaroslav Goncharov says his company doesn’t sell or share user data with any third parties.
“We might store an uploaded photo in the cloud. The main reason for that is performance and traffic: we want to make sure that the user doesn’t upload the photo repeatedly for every edit operation,” Goncharov said in a statement to People.
“Most images are deleted from our servers within 48 hours from the upload date.”
However, Rozita Dara, assistant professor at the school of computer science at the University of Guelph, believes there's still a chance your data could be accessed, stored and used at a later date.
Dara worries the developer may not be fully disclosing the extent of its data collection, which could go well beyond the one image you upload to the app. As she put it: "digital data collected is forever."
“Your interests, your political views, your personal views… you never know where your data will end up in five or 10 years from now.”
According to Dara, the possibilities are endless.
“It’s very hard to know which partners they’re working with and for what purpose,” she said.
This information could be stored and linked to a photo of you, along with your name and other identifying details, leaving you vulnerable to identity theft, a threat that concerns Dara.
“As an individual, you want to have power,” Dara said. “When somebody has lots of information about you… your power is limited.”
She’s also worried that updates to the app could change how the app works and what else it has access to on your phone.
FaceApp wouldn't be the first company tied to the "misuse" of data. For example, a third-party personality quiz app harvested Facebook user data that was later provided to Cambridge Analytica, a data firm hired by the Trump campaign during the 2016 election.
How it works
With data to inform them, algorithms can become a fast and effective way to reach large portions of the population.
“Data is basically a gem in the hands of anyone who wants to train their machine-learning algorithm,” said Periklis Andritsos, assistant professor in the faculty of information at the University of Toronto.
Algorithms need to learn what to look for, and they do so through data consumption.
“It’s the same way that we learn,” said Andritsos. “We go back in the past. We read text, we understand concepts and then, by having these models in our heads, we apply whatever we know to new pieces of information. Whenever we see something that is blue, we know it’s blue because we’ve seen blue before.”
In the context of FaceApp, Andritsos is most concerned about the “visual aspect” of it.
Technically, the developer of FaceApp could use the data the app collects to create other, less secure programs. It could then sell that technology to third-party partners or use it for something even more malicious.
“The sky’s the limit,” said Andritsos.
Common ways our data is used
Our data can be used in a variety of ways; the most common is to "teach," or train, artificial intelligence.
“They may collect data to give you back the service, something personalized for you,” Dara said. She uses the example of online advertising, which often uses your search history to serve you ads for products you’re already looking for.
It's also common for large companies like Facebook simply to use your data to profile you as a consumer. Profiling is the process of researching and understanding specific groups of people, their interests and their preferences.
Less common, but not unheard of, is when a developer or data collection agency sells your information to a third-party partner: a company you have never interacted with, whose policies for how your data is used you don't know.
Ann Cavoukian, former information and privacy commissioner for the province of Ontario, warns that this could lead to identity theft or worse.
"Your facial image is your most sensitive biometric," she said. Another example of a biometric is your fingerprints, which are unique to you.
“Your face is extremely sensitive, and if it’s accessed by third parties without your knowledge or consent, it can cause havoc in your life,” Cavoukian said. “The most obvious thing is they can steal your face and engage in activities and charges that will be billed to you.”
Once your identity has been stolen, it’s nearly impossible to reverse the impact on your life.
“It could affect your credit score, you won’t be able to purchase real estate” and more, Cavoukian said.
Even if your data isn’t used directly against you, it could be used against other vulnerable groups around the world.
This is a major concern of David Shipley, CEO of Beauceron Security.
“Your data could be used to train AI technologies, which are then sold to countries like China and the Philippines,” Shipley said. “In turn, those technologies could be used to target vulnerable groups, such as LGBTQ people.”
In 2017, the Guardian reported that researchers had built an AI system that could predict a person's sexual orientation from a photo of their face.
“Like any new tool, if it gets into the wrong hands, it can be used for ill purposes,” Nick Rule, an associate professor of psychology at the University of Toronto, told the Guardian.
“If you can start profiling people based on their appearance then identifying them and doing horrible things to them, that’s really bad.”
Advice from a big data expert
First and foremost, Dara said users should "fully understand" what an app does, whether it's a new one or an established platform like Facebook or Twitter, before installing it on their device.
This, unfortunately, requires closely reading the terms and conditions of use, which can be lengthy and full of legal jargon.
However, once you know what the app is supposed to be doing, you’re better prepared to notice if and when the developer has started doing something nefarious.
And finally, never use the same password twice.
“Bank websites, service providers, social media apps… they should all be very strong and unique,” Dara added.
Cavoukian strongly advises against downloading FaceApp.
“Don’t use it. Just walk away from it,” she said.
— With files from Josh K. Elliott