Apple is planning a system that uses new cryptographic techniques and improved artificial intelligence to recognize child sexual abuse material (CSAM) stored in iCloud, seeking to bridge the long-standing divide between the company’s pledge to protect customer privacy and law enforcement’s demand for visibility into illegal activity on its devices.
Apple has built much of its brand image on its promise to protect users’ privacy, and it says the new software strengthens those protections by removing any need to scan images on the company’s servers. As part of the same child safety plans, Apple will also scan users’ encrypted messages for sexually explicit content.
The new detection system will flag images that match a database of known child sexual abuse material maintained by the National Center for Missing &amp; Exploited Children (NCMEC).
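To make the matching idea concrete, here is a minimal sketch in Swift, with loud caveats: Apple’s announced design reportedly uses a perceptual hash (“NeuralHash”) combined with a private set intersection protocol, so that matching happens on the device without revealing non-matching photos. The sketch below substitutes a plain exact SHA-256 lookup purely to illustrate comparing an image’s fingerprint against a database of known hashes; the function names and hex-digest format are hypothetical, not Apple’s API.

```swift
import Foundation
import CryptoKit

// Compute a hex-encoded SHA-256 digest of raw image bytes.
// NOTE: a real CSAM-matching system uses a perceptual hash so that
// resized or re-encoded copies of an image still match; an exact
// cryptographic hash is used here only to keep the sketch simple.
func sha256Hex(of data: Data) -> String {
    SHA256.hash(data: data)
        .map { String(format: "%02x", $0) }
        .joined()
}

// `knownHashes` stands in for a database of digests of known illegal
// images supplied by a clearinghouse such as NCMEC (assumption: the
// database is distributed as a set of hex digests).
func shouldFlag(imageData: Data, knownHashes: Set<String>) -> Bool {
    knownHashes.contains(sha256Hex(of: imageData))
}
```

A real deployment would differ in both respects: a perceptual hash tolerates the cropping and re-compression that would defeat an exact digest, and, as announced, the cryptographic protocol is designed so that neither the device nor Apple learns which individual photos matched until a reporting threshold of matches is crossed.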
After Apple’s plans leaked on Wednesday, critics worried that by building software that can flag illegal content belonging to its users, Apple may be softening its stance on how it protects user data through encryption, a source of growing contention between the technology giant and law enforcement agencies over the past decade.
Bottom Line: Apple plans to develop software that runs both in iCloud and on iPhones to detect child sexual abuse material.