Apple to Detect Sexually Explicit Child Photos on iPhone

Aug. 5, 2021, 10:36 PM UTC

Apple Inc. said it will launch new software later this year that will analyze photos stored in a user’s iCloud Photos account for sexually explicit images of children and report instances to relevant authorities. The move quickly raised concerns among privacy advocates.

As part of new safeguards involving children, the company also announced a feature that will analyze photos sent and received in the Messages app to or from children to determine whether they are explicit. Apple is also adding features to its Siri digital voice assistant to intervene when users search for related abusive material. The Cupertino, California-based ...
