Fears over Apple plan to monitor photos on iPhones
PRIVACY campaigners told of their fears last night at news that Apple plans to scan phones for child abuse imagery.
There were concerns the move for US iPhones could open the back door to the surveillance of millions of devices.
The tech giant intends to install software that will sift through personal photo albums searching for illegal content, according to the Financial Times.
Human reviewers will contact the police if any material detected by the algorithm is verified.
The move is part of Apple’s attempt to find a balance between its promise to protect customers’ privacy and increasing demands to assist in criminal investigations. But security experts branded the proposals ‘absolutely appalling’. While the software is currently designed to spot child sex abuse, campaigners said it could be adapted to spot other images – such as anti-government signs at protests.
The algorithm – called neuralMatch – will scan photos stored on a user’s iPhone that have been uploaded to the iCloud backup system, it was claimed.
Apple bosses were said to have pitched the software to US academics this week.
Ross Anderson, professor of security engineering at Cambridge University, said: ‘It is an absolutely appalling idea, because it is going to lead to distributed bulk surveillance of our phones and laptops.’ Alec Muffett, a security researcher and privacy campaigner who formerly worked at Facebook and Deliveroo, described the plan as a ‘huge and regressive step for individual privacy’.
Social networks and cloud-based photo storage systems already scan for child abuse imagery. Accessing the data on a personal device is far more complex, however. Apple said last night the technology would be part of its iOS and iPadOS 15 software update due in the autumn.
It will initially be introduced in the US, with plans to expand further over time.