Apple will check iCloud users' photos for child abuse imagery

CES 2020 marked Apple's unofficial return to the show, its first appearance since former CEO John Sculley unveiled the Newton personal digital assistant there in 1992. However, the technology giant came to Las Vegas not to demonstrate new products, but to draw public attention to the issue of personal data security.

At an Apple session under the eloquent title "Privacy is a human right", the company disclosed significant changes to its privacy policy, in effect since December 31, 2019. From now on, all photos uploaded by iCloud customers will be screened for scenes of child abuse, including sexual abuse.

The screening runs automatically and does not scan images in the traditional sense. Prohibited content is detected by matching special digital hash signatures, fingerprints that indicate a picture contains known abusive material. User privacy is thus preserved, since the pictures themselves are never viewed wholesale.
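The idea of matching fingerprints rather than inspecting images can be sketched as follows. This is a deliberate simplification: Apple has not published its implementation, and real systems such as PhotoDNA use robust perceptual hashes that survive resizing and re-encoding, whereas this illustration uses a plain SHA-256 digest. The hash database shown is hypothetical.

```python
import hashlib

# Hypothetical database of fingerprints of known prohibited images.
# (Real systems use perceptual hashes, not cryptographic ones;
# SHA-256 here only illustrates the matching principle.)
KNOWN_BAD_HASHES = {
    # SHA-256 of the byte string b"test", standing in for a real entry
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def fingerprint(image_bytes: bytes) -> str:
    """Compute a digital fingerprint of the raw image data."""
    return hashlib.sha256(image_bytes).hexdigest()

def is_prohibited(image_bytes: bytes) -> bool:
    """Flag an upload if its fingerprint matches a known hash.

    Only the fingerprint is compared; the image content itself
    is never viewed, which is what preserves user privacy.
    """
    return fingerprint(image_bytes) in KNOWN_BAD_HASHES
```

The key property is that matching happens on opaque hashes: the service can recognise a known illegal file without being able to see, or reconstruct, any other photo in the user's library.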

During the presentation it emerged that Apple's new methodology resembles Microsoft's PhotoDNA technology, which Facebook, Twitter and Google already use to prevent the spread of child pornography. Apple representatives, however, did not mention PhotoDNA.

Meanwhile, Microsoft continues to develop its own system for flagging prohibited content. Its new free tool, created under Project Artemis, monitors not images but user conversations, looking for violations related to the sexual exploitation of children.

The new privacy terms also state that any prohibited content detected will be automatically deleted from iCloud. Notably, back in 2014 Google informed law enforcement authorities that some Gmail users were storing illegal content.

Source: nakedsecurity