Special web tool finds 5.4M child porn images online: report

A special web tool "crawled" the internet to find child porn images still online. Pexels

The Canadian Centre for Child Protection (C3P) is calling on the tech industry to do more to stop the spread of child pornography online.

In a report released last week, the organization said it has detected more than 5.4 million images of child sexual abuse material using a special web tool called Project Arachnid.

The platform was designed to “crawl” websites reported to cybertip.ca and automatically notify electronic service providers (ESPs) to remove the offensive material. The organization says Project Arachnid currently has a backlog of nearly 33 million suspect images that have yet to be assessed.

“This type of report has never really been done before in the world,” said Signy Arnason, C3P associate executive director. “We wanted to look at the data and see what was happening with ESP removal times, how often images are reappearing after being removed, basically how well industry is doing in this space.”

The report comprises data collected over the last three years, and its findings are not good.

“We saw delays in removal time — it took up to 42 days in one case,” said Arnason. “A lot of images reappear on servers over and over again after they’ve been removed.”

Part of the problem, according to Arnason, is a lack of regulation from the federal government.

“We have some very strong views on what needs to be done from a regulatory standpoint in order for this to be effective,” she said.

The 60-page report outlines eight recommendations to address the issue. These include requiring providers to self-monitor more effectively and for the government to mandate human content moderation standards.

“This falls under the heritage minister,” said Arnason, “so we look forward to having those ongoing discussions (with him) because, at the end of the day, we are in the trenches seeing every single day what children are facing. And the picture is pretty abysmal.”
