
Apple, one of the leading companies in consumer technology, has once again drawn widespread criticism from privacy advocates. This time, the controversy centers on a plan to automatically scan users’ personal photos stored in iCloud against a database of known child sexual abuse material (CSAM).

In August 2021, Apple announced a proposal to check images stored in iCloud Photos for known CSAM.

The proposal was designed to protect user privacy while still allowing the company to flag potentially harmful and abusive content without disclosing any other information.

However, it quickly drew strong criticism from privacy and security experts and digital rights groups, who warned that the scanning capability could create privacy and security risks for iCloud users worldwide.

In September 2021, Apple announced that it would pause the feature’s rollout to gather feedback and make improvements before releasing these “critically important child safety features,” indicating that a launch was still on the way.

Apple has since said that the CSAM-detection tool for iCloud Photos has been discontinued in light of the feedback and recommendations it received.

Last week, Apple revealed that it is concentrating its anti-CSAM efforts and investments on its “Communication Safety” features, which it first announced in August 2021 and rolled out in December 2021. Parents and other caregivers can choose to enable the protections through family iCloud accounts.

“After extensive consultation with experts to gather feedback on child protection initiatives we proposed last year, we are deepening our investment in the Communication Safety feature that we first made available in December 2021,” the company shared in a statement.

The company announced the change in its CSAM plans on December 7, alongside news that it is significantly expanding its end-to-end encryption options for iCloud, including protection for backups and photos stored on the service.

“Potential child exploitation can be interrupted before it happens by providing opt-in tools for parents to help protect their children from unsafe communications,” the company added.

“Apple is dedicated to developing innovative privacy-preserving solutions to combat Child Sexual Abuse Material and protect children while addressing the unique privacy needs of personal communications and data storage.”

In the end, Apple chose to focus on other, more privacy-preserving solutions rather than continue developing CSAM-scanning technology that could put users’ privacy at risk.
