Apple's Child Safety Features

Runs For Fun
Related to this thread.

I want to continue the discussion on this initiative by Apple, as I think it's an important discussion to have with regard to privacy. I have posted my thoughts on it in a few posts in that thread, as have many other members. Let's continue the discussion here.

 

SuperMatt

Pixel Envy has covered this pretty well, I think:

 
Deleted member 221
I'm a big fan of Ben Thompson's coverage of it:


I am not anti-encryption, and am in fact very much against mandated backdoors. Every user should have the capability to lock down their devices and their communications; bad actors surely will. At the same time, it’s fair to argue about defaults and the easiest path for users: I think the iPhone being fundamentally secure and iCloud backups being subject to the law is a reasonable compromise.

Apple’s choices in this case, though, go in the opposite direction: instead of adding CSAM-scanning to iCloud Photos in the cloud that they own and operate, Apple is compromising the phone that you and I own and operate, without any of us having a say in the matter. Yes, you can turn off iCloud Photos to disable Apple’s scanning, but that is a policy decision; the capability to reach into a user’s phone now exists, and there is nothing an iPhone user can do to get rid of it.

A far better solution to the “Flickr problem” I started with is to recognize that the proper point of comparison is not the iPhone and Facebook, but rather Facebook and iCloud. One’s device ought to be one’s property, with all of the expectations of ownership and privacy that entails; cloud services, meanwhile, are the property of their owners as well, with all of the expectations of societal responsibility and law-abiding which that entails. It’s truly disappointing that Apple got so hung up on its particular vision of privacy that it ended up betraying the fulcrum of user control: being able to trust that your device is truly yours.
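For readers who want the mechanics behind the argument: the scanning step being debated is essentially a fingerprint lookup against a database of known images, and the same check can run on Apple's servers or on the phone before upload. Here is a minimal toy sketch in Swift (this is not Apple's NeuralHash code; the SHA-256 fingerprint and the placeholder hash values are stand-ins purely for illustration):

```swift
import Foundation
import CryptoKit

// Toy stand-in for a perceptual hash. Real systems (PhotoDNA, Apple's NeuralHash)
// produce fingerprints that survive resizing and re-encoding; a plain SHA-256 of
// the raw bytes does not, but the matching logic around it looks the same.
func fingerprint(of imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

// Placeholder database of fingerprints of known images (in the real system this
// list is supplied by NCMEC and distributed in blinded form).
let knownFingerprints: Set<String> = ["<known hash 1>", "<known hash 2>"]

// The check is identical whether it runs in iCloud (the cloud Apple owns and
// operates) or on the phone before upload; the disagreement in the quote above
// is about where this function executes, not what it does.
func matchesKnownDatabase(_ imageData: Data) -> Bool {
    knownFingerprints.contains(fingerprint(of: imageData))
}
```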
 

Runs For Fun
"instead of adding CSAM-scanning to iCloud Photos in the cloud that they own and operate"

I’m a little confused by this statement. Apple already does scan iCloud photos for CSAM.

Very good write-up. I think this is a pretty key point:

"and its promises to use them solely for good have fallen flat because it so badly flopped its communication."

As I mentioned before, Apple have failed before in their communication.
 
User.191
There are two universal constants with Apple:

1) They'll always release new phone models each and every year.

2) They'll always fuck up the most pertinent of all PR needs at the worst possible time.
 
User.45
Very good find. Thanks @P_X.

I am going to let you guys in on a big secret. There is no AI, or neural networks, or whatever term the marketing department comes up with. It's really just a bunch of data in different digital forms, a bunch of tight or loose comparisons, as required for the solution, and a whole metric fuck ton of set theory to get something somewhat sensible out of multiple sets of data after the comparisons.
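To make the "comparisons plus set theory" point concrete, here is a minimal toy sketch (illustrative only, not Apple's implementation; the fingerprint values are invented, and the roughly-30-match threshold is the figure Apple later cited publicly):

```swift
// Illustrative sketch only: the "comparisons plus set theory" above, boiled down.
// Assume every photo has already been reduced to a fingerprint string.
let knownFingerprints: Set<String> = ["aa11", "bb22", "cc33"]           // placeholder hash list
let libraryFingerprints: Set<String> = ["cc33", "dd44", "ee55", "ff66"] // one account's photos

// The "set theory": matches are just the intersection of the two sets.
let matches = libraryFingerprints.intersection(knownFingerprints)

// Apple's published design only surfaces an account for human review once the
// match count passes a threshold (reportedly on the order of 30), so a single
// hash collision flags nothing.
let threshold = 30
let flagForReview = matches.count >= threshold

print("matches: \(matches.count), flag: \(flagForReview)")
```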

I have consulted for companies where I sat with the CIO, who was asking me how to use AI to predict regression issues when they add enhancements to existing systems, so that they can focus their QA effort and reduce bugs going into production. The guy had spent a few hours listening to the BS the week before, because he had been invited to the "AI lab" by one of the big 3 consulting companies and became a true believer. I did not know whether to laugh or cry.
 

Runs For Fun
I would definitely recommend listening to episode 365 of Upgrade. They spent most of the show discussing this and picking it apart from all kinds of angles. A very interesting listen.
 
Deleted member 221
Listen to security researchers and cryptography experts, not Apple pundits.

A good example: https://twitter.com/matthew_d_green

I beg you all: please look beyond John Gruber, Myke Hurley, Rene Ritchie, iJustine, Federico..

All seemingly good people who are mostly out of their depth here, but more importantly, almost all of them have their entire financial livelihood entangled with Apple and its success.

It's very difficult for folks in that position to see Apple and potential changes through a clear lens.
 
Deleted member 221
Matthew Green had a great thread today. I'll post it below so folks don't have to piece the Twitter thread together themselves.

Everyone keeps writing these doomed takes about how “the US government is going to force tech companies to comply with surveillance, so they might as well just give in preemptively.” Like it’s inevitable and we should just hope for what scraps of privacy we can.

Even I was pessimistic last week. What I’ve seen in the past week has renewed my faith in my fellow countrymen — or at least made me realize how tired and fed up of invasive tech surveillance they really are.

People are really mad. They know that they used to be able to have private family photo albums and letters, and they could use computers without thinking about who else had their information. And they’re looking for someone to blame for the fact that this has changed.

People are telling me that Apple are “shocked” that they’re getting so much pushback from this proposal. They thought they could dump it last Friday and everyone would have accepted it by the end of the weekend.

I think that reflects Apple accepting the prevailing wisdom that everyone is just fine having tech companies scan their files, as long as it’s helping police. But that’s not the country we actually live in anymore.

Anyway, I don’t revel in the fact that Apple stuck their heads up and got them run over by a lawn mower. I like a lot of the people on Apple’s security team (I turned down a job there a few years ago.) But people need to update their priors.

At the end of the day, tech companies do care a lot about what their users want. Apple has heartburn about this *not* because Congress passed a law and they have to do it. They’re panicked because they did it to themselves, and they can’t blame Congress.

A few folks in Congress, for their part, have been trying for years to pass new laws that force providers to include mandatory backdoors like this new Apple one. They failed repeatedly. In part they failed because these systems aren’t popular.

And so the shell game has been to play one against the other. Congress can’t quite pass laws requiring backdoors because there’s no popular support. But providers somehow have to do it voluntarily because otherwise Congress will pass laws.
 

fischersd
Apple will implement this country by country - they know they're going to have legal hurdles (and, yes, I'm also not a proponent of the device-side scanning).
It'll likely never be allowed in the EU for many reasons. Of course, most of us look at this and think countries like China will demand access. (Does Apple do business in Afghanistan? Guaranteed the Taliban would want to use this to persecute their people.)

I don't envy those of you in the US - you're first on the block. Apple's going to meet their first legal challenges on this in their own backyard.
 