PGC – On its face, Apple’s decision to scan other people’s phones for sexual images of minors might seem like a great idea, but is there a risk in giving a company such power? What innocents might be caught up in bad AI interpretations that human reviewers then uphold for fear of undermining the system?
But Apple isn’t just scanning for sexual images of children; it is also looking for evidence of child abuse, an area even more grey and uncertain than the line between illicit and innocent when it comes to pictures of children.
Expect to read soon of real, innocent people caught in this centrally controlled web by a bad AI call.
Now, imagine the value, to almost anyone who could afford it, of a database that could tell you the names of the people attached to scans that maybe didn’t cross a legal line but might expose them to something else. Imagine if these people were influencers: thought leaders, politicians, religious leaders, and so on.
Apple does deals with China; it is clearly not opposed to states that murder their citizens simply for disagreeing with them. So how much do you trust Apple with the database it will inevitably create?
Apple to start scanning US iPhones for images of child sexual abuse
2021-08-06 03:32:30
NEWS WIRES
Excerpt:
Apple unveiled plans to scan U.S. iPhones for images of child sexual abuse, drawing applause from child protection groups but raising concern among some security researchers that the system could be misused, including by governments looking to surveil their citizens.
The tool designed to detect known images of child sexual…

