Last week, Apple announced its plans to introduce new scanning technology in a bid to tackle child sexual abuse material (CSAM).
The system, known as "NeuralHash," will scan images bound for iCloud and match them against a database of known child abuse imagery, according to a technical summary published by Apple.
Any matched images will be flagged to human operators, who can review the image and contact law enforcement where appropriate.

However, privacy advocates are worried that the technology, while made with good intentions, could introduce a backdoor "that threatens to undermine fundamental privacy protections for all users of Apple products".
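In rough terms, the matching step can be sketched as below. This sketch uses the open-source imagehash library as a stand-in for Apple's NeuralHash model, which has not been published; the hash database and distance threshold are illustrative placeholders, not Apple's actual parameters.

from PIL import Image
import imagehash

# Hypothetical database of perceptual hashes of known abuse imagery;
# in Apple's design, this list is supplied by child-safety organisations.
# The hex value below is a placeholder, not a real entry.
known_hashes = [imagehash.hex_to_hash("0f0f0f0f0f0f0f0f")]

def matches_known_material(path, max_distance=4):
    # Compute a perceptual hash of the image and flag it if the hash
    # is within max_distance (Hamming distance) of any database entry.
    h = imagehash.phash(Image.open(path))
    return any(h - known <= max_distance for known in known_hashes)

if matches_known_material("photo.jpg"):
    print("Flag for human review")

Because perceptual hashes tolerate small edits such as resizing or recompression, near matches are caught as well as exact copies, which is what distinguishes this approach from ordinary cryptographic hashing.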