New Delhi: Apple has pulled all references to its controversial child sexual abuse material (CSAM) detection feature, originally announced in August, from its webpage. The CSAM detection feature, whose launch was delayed following backlash from privacy advocates, has now been removed. The move suggests the Cupertino, California-based tech giant’s plans to detect child sexual abuse photos on iPhones and iPads may hang in the balance.
The iPhone maker’s update to the webpage on child safety features reportedly took place between December 10 and December 13. Some reports suggest the company may not add image detection any time soon.
Two of the three child safety features that were rolled out earlier this week with the latest iOS 15.2 update are still present on the page, which is titled “Expanded Protections for Children”.
The iPhone maker has maintained its position since September, when it first announced it would delay the launch of CSAM detection.
“Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features,” Apple had said in a statement.
Apple drew the ire of security researchers, whistleblower Edward Snowden, Facebook’s former security chief, policy groups and several others after the CSAM detection feature was announced, as it involves taking hashes of iCloud Photos and comparing them to a database of hashes of known child sexual abuse imagery.
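The basic idea of hash matching can be illustrated with a minimal sketch. Note that this is a simplified illustration only: Apple’s actual system used a perceptual hash called NeuralHash combined with a private set intersection protocol, not the plain cryptographic hash lookup shown here, and the `KNOWN_HASHES` set below is a hypothetical placeholder.

```python
import hashlib

# Hypothetical database of known-image hashes (placeholder value).
# Apple's real system used perceptual "NeuralHash" digests and private
# set intersection; this sketch only shows the basic lookup idea.
KNOWN_HASHES = {
    "a" * 64,  # placeholder hex digest, not a real entry
}

def matches_known_hash(image_bytes: bytes) -> bool:
    """Hash the image bytes and check membership in the known-hash set."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_HASHES

# An arbitrary input will almost certainly not match the placeholder set.
print(matches_known_hash(b"example image data"))
```

A cryptographic hash like SHA-256 only matches byte-identical files; a perceptual hash such as NeuralHash is designed to also match visually similar images, which is what made the proposal both effective and controversial.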