AI and sexual violence online: Why advocates are concerned amid N.B. rise in sextortion cases

WATCH: As the use of artificial intelligence reaches mainstream audiences and becomes increasingly accessible to the public, concerns around its misuse – particularly in causing sexualized violence online – are mounting. As Megan King reports, New Brunswick RCMP are warning the public of a rise in sextortion cases, not only in the province but nationwide. – Feb 9, 2024

As the use of artificial intelligence reaches mainstream audiences and becomes increasingly accessible to the public, concern is growing about its misuse.

Of particular concern is how AI can contribute to sexualized violence online.

“Young people don’t get opportunities to talk about it, but these types of things are happening to them,” said Kaitlynn Mendes, Canada Research Chair in Inequality and Gender.

Having an online presence is increasingly a part of social culture, especially for young people. Intimate image sharing is becoming more common, and some may take advantage of that level of access.

“When someone extorts you, grooms you, either threatens or actually shares your intimate images without your consent, that’s a violation of those really important rights,” said Mendes, speaking from London, Ont. “We want young people to know that they shouldn’t feel ashamed or blamed because of what happened to them.”

New Brunswick RCMP are warning of a rise in sextortion across the province.

“It’s not just happening here,” said RCMP spokesperson Cpl. Hans Ouellette. “It’s happening across the country and possibly worldwide. Please contact us as soon as you feel that you have been a victim of this type of crime. There are ways that we can take down these pictures from the internet, and there’s also ways for us to track these individuals — even across the world.”

WATCH: RCMP urging parents to talk to teens about sharing private images online

Meanwhile, intimate image laws are playing catch-up. Some provinces have recently updated their civil statutes to explicitly include “altered images.”

Dalhousie University law professor Suzie Dunn pointed to British Columbia as an example.

“They have this civil tribunal where you could just go online and you could just say that, ‘There’s nude images that have been released about me, I’d like to get a court order to get it taken down,'” Dunn explained. “You submit the information, it’s inexpensive, it’s easy. You get a court order and then you can go to the social media companies or the pornography companies and say, ‘Hey, I’ve got a court order, you need to take this down.'”

Dunn said it’s also important for governments to create accessible resources for getting images taken down.

“Law is only one tool in the toolbox. But, ideally, we would just get people to know that, if you get an intimate image, keep it to yourself. If you see it, don’t spread it and hopefully tell other people to do the same thing,” said Dunn.

At the same time, advocates are reminding parents to speak to their children and inform them of their rights, while also encouraging them to ask for help.

But too often, said Andrea Gunraj from the Canadian Women’s Foundation, that responsibility falls on women.

“We have to stop putting the responsibility on the user and the person who is at high-risk — often women, often gender-diverse people, particularly young people, racialized women, Indigenous women,” she said. “We have to instead look at making our systems better, our user experience better and putting more responsibility on tech companies themselves.”

WATCH: Taylor Swift deepfake images: Why people are concerned over pornographic AI photos

— with a file from Global News’ Rebecca Lau 
