Otago Daily Times

Facebook removes 8.7 million sexual photos of children


SAN FRANCISCO: Facebook Inc said yesterday that company moderators removed 8.7 million user images of child nudity during the last quarter, with the help of software that automatically flags such photos.

The software, introduced over the past year, identifies images that contain both nudity and a child, allowing increased enforcement of Facebook’s ban on photos that show minors in a sexualised context.

A similar system, also disclosed yesterday, catches users engaged in ‘‘grooming’’, or befriending minors for sexual exploitation.

Facebook global head of safety Antigone Davis said the software ‘‘helps us prioritise’’ and ‘‘more efficiently queue’’ problematic content for the company’s trained team of reviewers.

The company is exploring applying the same technology to its Instagram app.

Under pressure from regulators and politicians, Facebook has vowed to speed up the removal of extremist and illicit material.

Machine learning programs that sift through the billions of pieces of content users post each day are essential to its plan.

Machine learning is imperfect, and news agencies and advertisers are among those that have complained this year about Facebook’s automated systems wrongly blocking their posts.

Davis said the child safety systems would make mistakes but users could appeal. — Reuters
