Apple has announced it will delay the launch of a new scanning tool designed to detect child sexual abuse material (CSAM) so it can 'make improvements' after privacy concerns were raised.
The tech giant previously announced plans to introduce new systems which would detect child sexual abuse imagery if someone tried to upload it to iCloud, and report it to authorities.
Apple claimed the process would be done securely and would not regularly scan a user's camera roll. Privacy campaigners raised concerns over the plans, with some suggesting the technology could be hijacked by authoritarian governments to look for other types of imagery.
Read more on manchestereveningnews.co.uk