The Denver Post

Tech companies detect surge in child sexual abuse videos

By Gabriel J.X. Dance and Michael H. Keller

The number of reported photos, videos and other materials related to online child sexual abuse grew by more than 50% last year, an indication that many of the world’s biggest technology platforms remain infested with the illegal content.

Nearly 70 million images and videos were reported to the National Center for Missing and Exploited Children, a federally designated clearinghouse for the imagery that works with law enforcement agencies.

The record number was driven by a surge in illegal videos, which are popular among sexual predators but now are more readily detected by some companies. More than 41 million videos were reported; the number five years ago was fewer than 350,000. The companies flagged many of the same images and videos multiple times as they were shared among users.

The center identified to The New York Times the companies that had detected the imagery, the first time detailed company information had been released.

Facebook reported nearly 60 million photos and videos, more than 85% of the total. The number reflects its immense user base and its aggressive approach to rooting out the material. But it shows offenders continue to exploit the platform. Nearly half of the content was not necessarily illegal, according to the company, and was reported to help law enforcement with investigations. Instagram, which is owned by Facebook, was responsible for an additional 1.7 million photos and videos.

In a statement, Antigone Davis, Facebook’s global head of safety, said “the size and expertise of our team, together with our sophisticated technology, have made us industry leaders in detecting, removing and reporting these images, and thwarting people from sharing them.”

“We will continue to develop the best solutions to keep more children safe,” she added.

Snapchat, Twitter and other social media companies also submitted reports of imagery. So did companies whose services include search engines and cloud storage, including Google and Microsoft. Apple, Dropbox and chat platform Discord also detected the illegal content.

In all, 164 companies submitted reports.

“These numbers show that any service provider that allows individuals to host images and videos is susceptible to child sexual exploitation material being posted,” said John Shehan, a vice president at the national center.

He confirmed the numbers released Friday reflected all content reported to the center, including material that “may not meet the legal definition of child pornography.”

Still, the numbers do not paint a complete picture of the problem: The industry has been plagued by uneven and inconsistent detection practices. Some cloud storage services, including those owned by Amazon and Microsoft, do not scan for illegal content at all, while others scan for photos but not videos.
