Toronto Star

AI content complicates search for victims

Project Aquatic investigation involved 27 police services and resulted in 64 arrests provincewide

- JASON MILLER

A recent police-led crackdown on online child sexual exploitation and abuse shows that predators are becoming “more sophisticated and harder to trace,” with the use of AI-generated content making it harder for police to differentiate between real and synthetic victims, provincial police say.

Dubbed Project Aquatic, the investigation, which ran Feb. 19 to 29, spanned every corner of the province and identified and stopped the abuse of 34 child victims. The initiative also clamped down on dozens of individuals now accused of making and distributing child sexual abuse material, OPP Det. Staff Sgt. Tim Brown said Wednesday at a press conference announcing the results of the project.

Police said the victims range from young children to teenagers, while the accused span a broad spectrum, from teenagers to seniors.

“Together we can turn the tide against these heinous crimes,” Brown said.

Project Aquatic saw officers from 27 police services team up with the OPP-led Provincial Strategy to Protect Children from Sexual Abuse and Exploitation on the Internet (Provincial ICE [Internet Child Exploitation] Strategy) to conduct 129 investigations, which resulted in the arrests of 64 individuals across Ontario who now face a total of 348 charges. Investigators also seized 607 digital devices, police say.

“Predators go where children go,” Brown said.

Brown said in many cases the abuse of victims continues for months and years, as the pornographic content is released online, creating a vicious cycle in which it remains readily available for viewing and sharing by countless people.

Brown said police are seeing troubling trends in the “digital playgrounds where predators lurk,” especially as the tools predators use become “more sophisticated and harder to trace.” That includes the proliferation of artificial intelligence (AI) generated images, which he says adds yet another layer of complexity to police efforts to differentiate between real and synthetic victims and to apprehend those who make and distribute child sexual abuse material.

That sentiment was echoed by Signy Arnason, associate executive director of the Canadian Centre for Child Protection, who says the agency flagged about 2,600 images it deemed to be AI-generated in 2022. That number mushroomed to 3,700 last year, and with 500 images flagged in a single month already this year, Arnason said, “we’re probably on course for 6,000, at least, this year.”

“To say it’s a nightmare would be an understatement,” she said, adding that there have been cases of predators using existing child sexual abuse material to create an AI-generated image.

“You can only imagine the nightmare for those victims, who have a long-standing series, and now there’s new abusive imagery of them tied to that,” she said.

The problem has also sprung up among school-aged children, with Arnason saying there is increasing evidence of students using nude AI generators to create explicit AI-generated images of schoolmates.

“That issue has exploded and we’re dealing with schools right across the country where all of a sudden they’re having this problem,” she said. “We’ve handed kids very powerful tools, and, in some capacity, they might find it funny, but we know this is incredibly traumatizing and damaging.”

Arnason said there is a “growing network of adults with a problematic sexual interest in children,” who operate in online communities where they share child sexual abuse material, promote tactics that include manuals on how to evade detection and “normalize the sexual abuse and exploitation of children.”

Some predators show signs of being obsessed with their victims and go as far as trying to track them down and even stalk them well into adulthood, Arnason said.

“Environments like the dark web fester and facilitate this conduct and AI-generated images have tipped the scale on an already epidemic-sized issue,” she said Wednesday. “We know that police across Canada cannot keep up.”

She said technology companies must shoulder some of the blame, as she has seen little sign that these brands will voluntarily prioritize child safety. Both Arnason and police said stronger government action and regulation are needed to protect children online.

Brown said the bulk of Project Aquatic involved police investigating complaints from electronic service providers, such as social media platforms, which are obligated to report troubling online behaviour.

Of the almost 30,000 tips cybertip.ca received last year, roughly 23 per cent came from Ontario. Police say that with statistics showing 77 per cent of Canadian children ages nine to 17 have access to a smartphone, the door is open to increasing threats from online predators targeting young people.

Brown said there are measures to safeguard children, including being vigilant and reporting concerns or signs of abuse to cybertip.ca. It will take collective vigilance from parents, educators and internet providers about the realities of child luring, child grooming and the dangers posed online, Brown said.

R.J. JOHNSTON TORONTO STAR FILE PHOTO: OPP Det. Staff Sgt. Tim Brown said the tools being used by child sex predators are becoming “more sophisticated and harder to trace,” and that includes the proliferation of artificial-intelligence-generated images.