Following the largest Immigration and Customs Enforcement (ICE) raid in over a decade last week, dozens of people were arrested at an Amazon Books store in Manhattan, N.Y., on Monday morning for protesting the business relationship between the e-commerce giant and the immigration authority.
As ICE raids have ramped up over the past few months, the Seattle-based tech giant has come under fire for its history of selling cloud services to ICE through a third-party company called Palantir, which helps the immigration authority track and target immigrants.
A number of contracts between major tech companies and law enforcement have come to light over the past few years — including Amazon’s facial-recognition surveillance program, known as Rekognition, and Google’s AI military drone project — all of which have been received skeptically by consumers.
Protesters occupied the store specifically to oppose the company’s relationship with ICE, and Amazon employees have called on the company to terminate these contracts.
An investigation commissioned by the activist organizations Mijente, the National Immigration Project and the Immigrant Defense Project last year found that Amazon has played a central role in providing infrastructure to ICE and the U.S. Department of Homeland Security’s programs.
Regulation not strong enough to prevent data misuse
According to Jake Laperruque, senior counsel with the Project on Government Oversight, a non-partisan independent watchdog, there is no clear regulation compelling companies with both consumer-facing and government-facing business interests to keep these markets separate.
“Individual users could be drafted, effectively, into some sort of law enforcement service by having their use of one product feed the use of other products that are going to those services,” he said.
Michael Bryant, executive director of the Canadian Civil Liberties Association, says companies like Amazon owe a duty of disclosure to their massive consumer bases.
“Amazon is trying to play both sides here, and they are cultivating this clientele, knowing full well consumers aren’t going to want to be sharing a parking lot with a bunch of immigration enforcers,” he told Global News.
In order to encourage transparency around this issue, Laperruque suggests governments should be proactive in drafting clear disclosure rules.
“We need strict rules about transparency,” he explained. “It’s extremely frustrating when companies — Amazon being, notably, one of them — seem to design contracts and create secret requirements that stretch beyond their legitimate powers and disrupt the public record.”
In addition, both Laperruque and Bryant agree that major companies with massive civilian customer bases need to restrict the contracts they hold with law enforcement agencies — or have the breadth of these contracts restricted by law.
“I think that the big technology companies are going to have to temper their greed and make choices about who they’re doing business with,” Bryant said. “Do they want to be surveillance capitalist leaders to the detriment of consumers?”
Potential for product overlap
In addition to the lack of transparency around the use of data, Laperruque suggests that in the absence of regulation, there’s a risk of companies like Amazon creating consumer products that dually serve the interests of law enforcement.
He gives the example of the company working with police departments to distribute its Ring video doorbell. According to a number of reports from Motherboard and Wired, Amazon is working with police to convince people not only to buy the device but also to sign up for its neighbourhood watch app. Police can then request footage from the doorbell by asking users of the watch app, Motherboard reported.
“These kinds of situations are when you start to get overlap in terms of the interest,” he explained.
Laperruque expresses concern that footage from the app could be used for other purposes, such as identifying individuals on police wanted lists.
“Is there potentially a conflict of interest in terms of who the company is trying to serve? I think those problems are just going to augment when you start to actually do things like integrate facial recognition into Ring doorbells or use facial-recognition systems to automatically call the police,” he added.
Neither of these proposals has been put forth by Amazon or police groups, but the possibility has human rights experts concerned, he said. Amazon did, however, come under fire last year for its Rekognition program, which sells facial-recognition technology to police departments.
The American Civil Liberties Union (ACLU) accused the e-commerce giant of selling “authoritarian surveillance” technology and demanded the partnership be terminated.
“That gives the government a really far reach into people’s lives,” Shankar Narayan, director of the Technology and Liberty Project with the ACLU, previously told Global News. “We already know about the challenges that facial-recognition technologies have.”
Beyond privacy and human rights violations, Laperruque also notes that documented flaws with Amazon’s facial-recognition technology may result in police misidentifying individuals.
This past April, at least 25 prominent artificial intelligence researchers, including experts at Google, Facebook and Microsoft, signed a letter calling on Amazon to stop selling its facial-recognition technology to law enforcement because of its biases against women of colour.
What can consumers do to protect themselves?
While experts and advocacy groups continue to call on regulators to draft transparency laws, consumers can take action to prevent their data from becoming caught up in one of these side projects.
Laperruque explained that most of the products we use in our day-to-day lives come with opt-out rules or offer ways to delete data. Amazon, for example, allows consumers to opt out of “human review,” in which staff listen to stored Alexa voice recordings to improve the assistant’s AI.
Apple and Google have either paused or allowed consumers to opt out of the human review function in different capacities around the world.
In addition, Bryant encourages consumers to “vote with their feet” by boycotting Amazon’s services “if they don’t like the connection between their cloud storage and the owner’s values.”