Editor’s note: The number of images has been removed from this article at the request of the school division, which did not have authorization to release that information.
A Winnipeg school division is reeling after students at a local high school reported that sexually explicit, altered photos of students were being shared online.
The Louis Riel School Division first received notice of what it said were doctored photos of an unknown number of students at College Beliveau on Dec. 11. On Thursday afternoon, the division sent a letter to parents updating them on the situation and the ongoing investigation.
The photos, according to the division, were doctored and sexually explicit in nature. An investigation was launched to “get a better understanding of the extent of what happened and what was involved.” The division added that the original photos appeared to have been gathered from publicly accessible social media posts. How they were altered is not known.
The division said it does not know whether it has been made aware of all the photos in circulation. The images it did receive will be uploaded to Cybertip, a national reporting tipline that specializes in the removal of such images from the internet. The tipline, operated by the Canadian Centre for Child Protection, will make use of the centre’s Project Arachnid, a tool that has helped remove millions of child sexual abuse images and videos online.
The school division noted that the parents and caregivers of affected children are being notified. Student support teams are available, it said, to help students directly or indirectly impacted.
Parents and caregivers are asked to speak with their children about any communication they may be having about the incident, particularly over Snapchat. The school division is also warning against vigilantism, threats of violence, and acts of retribution.
Speaking to 680 CJOB, Winnipeg police Const. Dani McKinnon said the police service is investigating the matter. She said that while this was a “thoughtless” act by someone, the sharing of explicit photos or the explicit altering of photos is not new, especially among adults.
“There have been issues before with… (things like) revenge pornography or the distribution of images without consent,” said McKinnon. “But those have been situations where actual images have been sent. We’re now talking about augmented, modified images using AI generation.”
McKinnon said it was brave of the students who came forward to report the incident. Cases like these, she said, are as much about prevention as they are about reporting.
She added that there is a gap between what the law says and how the technology works. The investigation will take some time, she said, but officers are working with prosecutors to understand how such cases can be prosecuted.