Apple running into a giant slippery slope?

Trunks0

Keeping an open mind
Staff member
Administrator
https://arstechnica.com/tech-policy...ll-scan-photos-for-child-sexual-abuse-images/

Snippet from ArsTechnica
Arstechnica.com said:
Apple explains how iPhones will scan photos for child-sexual-abuse images
Apple offers technical details, claims 1-in-1 trillion chance of false positives.

Shortly after reports today that Apple will start scanning iPhones for child-abuse images, the company confirmed its plan and provided details in a news release and technical summary.

"Apple's method of detecting known CSAM (child sexual abuse material) is designed with user privacy in mind," Apple's announcement said. "Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC (National Center for Missing and Exploited Children) and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users' devices."

Apple provided more detail on the CSAM detection system in a technical summary and said its system uses a threshold "set to provide an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account."

The changes will roll out "later this year in updates to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey," Apple said. Apple will also deploy software that can analyze images in the Messages application for a new system that will "warn children and their parents when receiving or sending sexually explicit photos."

Apple accused of building “infrastructure for surveillance”

Despite Apple's assurances, security experts and privacy advocates criticized the plan.

"Apple is replacing its industry-standard end-to-end encrypted messaging system with an infrastructure for surveillance and censorship, which will be vulnerable to abuse and scope-creep not only in the US, but around the world," said Greg Nojeim, co-director of the Center for Democracy & Technology's Security & Surveillance Project. "Apple should abandon these changes and restore its users' faith in the security and integrity of their data on Apple devices and services."

For years, Apple has resisted pressure from the US government to install a "backdoor" in its encryption systems, saying that doing so would undermine security for all users. Apple has been lauded by security experts for this stance. But with its plan to deploy software that performs on-device scanning and share selected results with authorities, Apple is coming dangerously close to acting as a tool for government surveillance, Johns Hopkins University cryptography Professor Matthew Green suggested on Twitter.

:eek: That's a hell of a slippery slope, imho.
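Since the snippet above glosses over what "on-device matching against a database of known hashes" actually looks like, here's a toy Python sketch. To be clear: this is not Apple's system. The real design uses a perceptual hash (NeuralHash) and a blinded hash database the device can't read, per the technical summary; every function and value below is made up purely for illustration.

import hashlib

# Stand-in for a perceptual hash like NeuralHash. A real perceptual
# hash maps visually similar images to the same value; SHA-256 is
# used here only so the example runs.
def perceptual_hash(image_bytes: bytes) -> str:
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical database of known-image hashes shipped to the device.
# In Apple's design this set is blinded so the device can't read it;
# here it's a plain set so the example stays self-contained.
known_hashes = {
    perceptual_hash(b"known-bad-image-1"),
    perceptual_hash(b"known-bad-image-2"),
}

# On-device check: the photo itself never leaves the device here,
# only the yes/no match result.
def scan_photo(image_bytes: bytes) -> bool:
    return perceptual_hash(image_bytes) in known_hashes

photos = [b"vacation.jpg bytes", b"known-bad-image-1", b"dog.png bytes"]
matches = sum(scan_photo(p) for p in photos)
print(f"{matches} of {len(photos)} photos matched the known-hash set")

The point everyone argues about below: the device only learns whether a photo matches a list someone else compiled, and nothing technical in that design stops the list from containing things other than CSAM.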
 
It seems like a good idea, but I think it might go a little too far. I don't want the FBI coming to my house because I sent a picture of my nephew to someone and the algorithm thought it was some missing kid.


It's just an Apple backdoor that they can later use to scan any picture, not just missing children. It will make Apple billions selling its use to governments. Hopefully there's enough outcry to stop them from doing so.
 
What could possibly go wrong? :bleh:

Apple is looking for incriminating evidence without probable cause. It's like letting your landlord search your property for contraband behind your back, without probable cause. You wouldn't want law enforcement doing it. Why allow a third party to? :bleh:

It's bad enough that social media snoops on your browsing history to feed you ads. I don't need apps combing through private files on my personal storage and sharing them with anyone without my permission. :bleh:
 
I'm a little more concerned about future abuse. What's to stop that system from deciding other content is verboten? Like, say, Xi and Winnie the Pooh? What's stopping Apple from expanding this to scan everything in iCloud? Or any file on an Apple OS? Who might pressure Apple to do that, etc., etc. It's a Pandora's box that Apple is insane to even suggest opening.

I can tell ya this, though: **** using Apple in any even remotely secure environment.
 
As someone who got filtered by an AI just this morning for trying to post a picture of my dog on Discord (the AI thought my picture was explicit), I can say this is absolutely going to go poorly.
 
I can see a day when the police pick you up and, instead of using your one phone call for an attorney, you use it to call tech support. Hopefully time on hold will count toward time served.
 
I can see it now.

1) You post a picture of your toddler in his/her swimsuit with their little butt crack showing. You think it's adorable. You never intend to post it on social media, but you save the image as you do all your other memories.
2) Apple thinks it's kiddie porn and reports it to law enforcement.
3) Your house is raided in broad daylight, with all the neighbors watching.
4) A news article says you were arrested on allegations of child p0rn.
5) Your life is ruined because social media has labeled you a kiddie diddler, and your story and picture go viral. Your real name, address, and phone number are posted publicly.
6) After investigation, the charges are dropped.
7) Social media still assumes you're guilty but got away with it. Your life is ruined. You have to move. It becomes almost impossible to get a job.

I wish these sounded far-fetched. But they don't.
 
Why would you take a picture of your kid’s butt crack?

:bleh:
 

Most parents think that everything their kid does is cute when they're little. Especially if they're half-naked. Or maybe that's just the way it was back in my day, I dunno. My mom has an old Polaroid of me naked with a pot on my head when I was 2. She thought it was adorable.

By today's standards... yeah, pretty weird.
 
I was joking, man. I was trying to pretend to be Apple in my post.

Now the frog is dissected.
 
munkus said:
Most parents think that everything their kid does is cute when they're little. Especially if they're half-naked. Or maybe that's just the way it was back in my day, I dunno. My mom has an old Polaroid of me naked with a pot on my head when I was 2. She thought it was adorable.

By today's standards... yeah, pretty weird.

Apple's algorithm has just identified the following caution words:

Kid - Little - half-naked - Naked with pot - 2 years old - adorable.

AI analysis:

<warning>

Siri has identified potential criminal activity involving Munkus and a half-naked 2-year-old child in possession of marijuana. Local authorities have been informed and dispatched immediately to the residence.
 
This is one of those "there is no good solution" subjects...
Doing it on device, only against *known* images, only for users who have iCloud backup enabled, and only after reaching a threshold before anything gets reviewed... yeah, this isn't going to catch much, methinks. But it does take a concerning step toward invading privacy and adds a tool for mass surveillance, even if it is "at a threshold".
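For a rough sense of how much heavy lifting that threshold does, here's some back-of-the-envelope math. Apple hasn't published its per-image false-match rate or the exact threshold, so both numbers below are pure assumptions for illustration; this is a sketch of the statistics, not Apple's actual analysis.

from math import comb

# Binomial upper tail: probability that at least `threshold` of
# `n_photos` photos false-match independently. Terms shrink so fast
# for small p that summing a short window past the threshold is
# plenty (and avoids float overflow from huge binomial coefficients).
def p_account_flagged(n_photos: int, p_false_match: float,
                      threshold: int, terms: int = 50) -> float:
    upper = min(threshold + terms, n_photos)
    return sum(
        comb(n_photos, k) * p_false_match**k
        * (1 - p_false_match)**(n_photos - k)
        for k in range(threshold, upper + 1)
    )

n, p = 10_000, 1e-6   # assumed library size and per-image error rate
for t in (1, 5, 10, 30):
    print(f"threshold {t:>2}: about {p_account_flagged(n, p, t):.2e} per account")

With these made-up numbers, even a single-digit threshold drives the per-account false-positive odds well below the one-in-a-trillion figure Apple quotes. Whether the real per-image error rate actually behaves like that is exactly what people in this thread are doubting.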
 
This is a horrible idea. My kids are 12 and older, but what munkus laid out is probably what would happen. This is just a really bad idea.

I hate pedos more than anything, but this is a VERY slippery slope.
 
The optimist in me hopes Apple is doing this knowing it will fail, and will remove it later so that in the future they can go, "We tried, it doesn't work."

But the realist in me thinks this is going to get abused and will never go away.
 

I don't think it will go away. It'll be tweaked and turned every which way, but after a while I'll bet people will just forget about it, kind of like how FB data-mines people's entire lives and they just don't care...
 
We just have to assume that everything we do online is monitored, recorded, analyzed, and sold to advertisers or given to law enforcement on demand. So long as we keep using those services or buying those products, we're agreeing to it.
 