Facebook removes 8.7 million sexual photos of children
SAN FRANCISCO: Facebook Inc said yesterday that company moderators removed 8.7 million user images of child nudity over the last quarter with the help of software that automatically flags such photos.
The software, introduced over the past year, identifies images that contain both nudity and a child, allowing increased enforcement of Facebook’s ban on photos that show minors in a sexualised context.
A similar system, also disclosed yesterday, catches users engaged in "grooming", or befriending minors for sexual exploitation.
Facebook's global head of safety, Antigone Davis, said the software "helps us prioritise" and "more efficiently queue" problematic content for the company's trained team of reviewers.
The company is exploring applying the same technology to its Instagram app.
Under pressure from regulators and politicians, Facebook has vowed to speed up the removal of extremist and illicit material.
Machine learning programs that sift through the billions of pieces of content users post each day are essential to its plan.
Machine learning is imperfect, and news agencies and advertisers are among those that have complained this year about Facebook’s automated systems wrongly blocking their posts.
Davis said the child safety systems would make mistakes but users could appeal. — Reuters