Over a year ago, Apple announced plans to scan for child sexual abuse material (CSAM) with the iOS 15.2 release. The technology is inevitable despite its imperfections and Apple's silence about it. And then, the ...
Companies with an online presence must stay vigilant about current and proposed legislation aimed at protecting children online. With the growing use of artificial intelligence (AI), companies face an ...
The European Commission has again been urged to more fully disclose its dealings with private technology companies and other stakeholders, in relation to a controversial piece of tech policy that ...
After years of controversies over plans to scan iCloud to find more child sexual abuse materials (CSAM), Apple abandoned those plans last year. Now, child safety experts have accused the tech giant of ...
When Apple announced its plans to tackle child abuse material on its operating systems last week, it said the threshold it set for false-positive account disabling would be one in a trillion per year ...
A pair of Princeton researchers claim that Apple's CSAM detection system is dangerous, having explored and warned against similar technology themselves, but the two systems are far from identical. Jonathan ...
Earlier this year, Apple announced a new system designed to catch potential CSAM (Child Sexual Abuse Material) by scanning iPhone users’ photos. After an instant uproar, Apple delayed the system until ...
Respected university researchers are sounding the alarm bells over the technology behind Apple's plans to scan iPhone users' photo libraries for CSAM, or child sexual abuse material, calling the ...
A Wisconsin man was arrested in May 2024 on criminal charges related to his alleged production, distribution, and possession of AI-generated images of minors engaged in sexually explicit conduct and ...
Update: The EU has now announced the proposed new law. More details at the bottom. Apple’s CSAM troubles may be back, after controversy over the issue of scanning iPhones for child sexual abuse ...
Apple on Friday announced that the three features it revealed to stop the spread of Child Sexual Abuse Material (CSAM) will not be available at the fall release of iOS 15, iPadOS 15, watchOS 8, and ...