
I wish HN were better at evaluating anything around this; the response to anything on this topic makes it hard to know what the actual truth is. Maybe this act is as bad as people are saying it is, but the Apple policy wasn't, and the response here was the same. I'll defer judgement until I can actually read it.


> the Apple policy wasn't

The Apple policy on your device ratting you out is abominable and anyone telling you otherwise is trying to sell you a brand.

What you deem illegal might not be the same thing Apple is told to treat as illegal. In this country or elsewhere.

Your phone should not be the secret police.


This kind of response illustrates my point.

The Apple policy was likely about coming up with a way to enable encrypted photos on iCloud while still having some privacy preserving form of CSAM detection. Since it was only enabled when iCloud photos was enabled it was better for privacy on net than the status quo (unencrypted iCloud photos that are accessible to Apple and scanned anyway).

Now we may end up with a worse outcome as a result.

The Bluetooth exposure notification design early in the pandemic was similarly privacy preserving and the average HN response was similarly stupid.

There are just some topics this forum is not a reliable source of accurate information about and this is unfortunately one of them.

This isn’t because I don’t think access to real encryption is incredibly important (I do) - I just think it’s important to get the details right. Otherwise we’ll just get dismissed on these issues for ignoring the specifics and crying wolf on everything.


> The Apple policy was likely about coming up with a way to enable encrypted photos on iCloud while still having some privacy preserving form of CSAM detection. Since it was only enabled when iCloud photos was enabled it was better for privacy on net than the status quo (unencrypted iCloud photos that are accessible to Apple and scanned anyway).

This is an unsupported hypothetical about a future change Apple may have made. The only announcement they made was the client-side scanning which is at best equivalent to the status quo.


Sure, but it’s not a huge leap to think it through. In hindsight they obviously should have waited until the cloud encryption was ready to ship at the same time.

They probably thought the announcement would be good PR with the general public even without that, and were surprised.

If you’re right that cloud encryption was never the plan then I agree what they did doesn’t make sense and they should have just scanned images on the servers instead of bothering with all the fuss.


You were wrong then and you're wrong now. It's not really possible to be cordial about this topic.

There should never be any process acting against the user's interests on a device that they own. Ever. Full stop. The only reasonable option is to do full encryption on the device without any system that allows inspection or identification of the material being encrypted. It didn't matter that vouchers enabled the decryption of the material after a threshold was hit. There would have been logic running on everyone's device acting as a snitch. At some point that functionality would be expanded and abused.
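
(For reference, the "threshold" mechanism being referred to works roughly like textbook threshold secret sharing: the server cannot decrypt anything until it has collected at least T matching vouchers. Here's a toy Python sketch of that general idea - illustrative only, not Apple's actual NeuralHash/PSI construction, and the names and numbers are made up:)

    # Toy illustration of the threshold idea behind the "safety vouchers":
    # the server only recovers the decryption key after it has collected at
    # least T matching vouchers. Textbook Shamir secret sharing over a prime
    # field -- NOT Apple's actual NeuralHash/PSI design.
    import random

    PRIME = 2**127 - 1  # a Mersenne prime, big enough for a toy key

    def make_shares(secret, threshold, count):
        # Split `secret` into `count` shares; any `threshold` of them recover it.
        coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
        f = lambda x: sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
        return [(x, f(x)) for x in range(1, count + 1)]

    def recover(shares):
        # Lagrange interpolation at x = 0; yields garbage below the threshold.
        total = 0
        for xi, yi in shares:
            num, den = 1, 1
            for xj, _ in shares:
                if xj != xi:
                    num = num * (-xj) % PRIME
                    den = den * (xi - xj) % PRIME
            total = (total + yi * num * pow(den, -1, PRIME)) % PRIME
        return total

    key = 123456789                                       # stand-in per-account key
    shares = make_shares(key, threshold=30, count=1000)   # one toy share per voucher

    print(recover(shares[:29]) == key)  # False: below threshold, nothing decryptable
    print(recover(shares[:30]) == key)  # True: threshold hit, server can decrypt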

Your optimistic point of view does not align with the reality of how this kind of technical capability becomes misused over time. The ones that create these things are not the ones that control them 20 years later.


It only ran when iCloud was enabled for photos - this makes it essentially a cloud feature. If this wasn't the case I'd agree with you.

I think it's a better outcome if it leads to iCloud encryption.

A reasonable person could think it's better to just have iCloud remain unencrypted and keep that separation strict in a purer sense (scanning unencrypted photos on the server vs. a hash threshold test on upload), but I think that person would have to acknowledge that the policy as described (enabled only when iCloud Photos is enabled) is not worse than the default in terms of what specifically happens - and, if it enabled iCloud encryption, is better on net for privacy. It's more of an ideological argument about the separation of server and device than about the specific implementation.


> this makes it essentially a cloud feature.

Wrong. Physically, the routines run on the device; it is not a cloud feature by definition. There are no wormholes here. The work happens on the device - the device's battery drains when it runs. Both technically and physically you are wrong here. I hate that you have this wrong and continue to say it. Stop saying it, because at this point you are actually lying.

It's not a better outcome, because other companies - with the follower mentality that most product managers and execs have - would attempt to copy and one-up Apple, only to create a worse and more easily abusable implementation, just like the notch. Just like any socially acceptable, easily marketable act that can be hashtagged and spread. That idea would have been an infection of the worst kind.


You're kind of arguing a straw man. Sure, it technically runs on the phone (I don't dispute that), but only when upload to iCloud is enabled and the photo is being uploaded to iCloud. The latter bit matters (and is what I meant by 'essentially'). Running the check on the phone on upload, with these constraints, is what would enable iCloud to be encrypted.

I don't think there's much point in discussing further, the main disagreement is already visible in the thread.


> only when upload to iCloud is enabled and the photo is being uploaded to iCloud

You have to trust Apple that this will always be the case.

Your model should be to trust no one. Don't take anyone at their word as it's subject to change at any time. Especially not a corporation which is easily manipulated by governments (look how they bend to Russia and China).

It's entirely obtuse that this can be turned on through product use - product use that might be triggered at a distance by simply following the "blessed path". Most people won't even be aware this is happening, and that's beyond shameful.

This puts one foot in the door. There will be more. They'll be ramming everything through that they can to spy on you.

Companies should not be trusted with liberty and privacy. Not even Apple.


> You have to trust Apple that this will always be the case.

You're right of course, but I'd argue this is true in either case. If you're using an iPhone you're trusting Apple is doing what they say they're doing and there's not much you can do about it.

> Your model should be to trust no one. Don't take anyone at their word as it's subject to change at any time. Especially not a corporation which is easily manipulated by governments (look how they bend to Russia and China).

I actually agree with this general idea (I work on urbit fwiw); I just think in this case you already have to trust them anyway. If they're going to lie and do something differently that's bad - I just think that's independent of this policy, and the policy specifics matter.

As it is, unencrypted iCloud is a worse state imo, but as I said elsewhere reasonable people can disagree with this. The specific policy as described, though, isn't worse - it's people's assumption that it increases the risk of an abusive policy that is. My take is that that risk is there in either case and really independent of this policy.


people are extremely up-in-arms about something that apple already does, and that every other cloud provider does, but when they made it public everyone went crazy. now we have no CSAM scanning, a totally unencrypted and accessible iCloud Photo Library, and for what? because people want privacy.

you don't own anyone else's cloud and you never will, and especially not with government intervention. while i support privacy, i also think it's like freedom of speech, in theory it sounds great but in reality you end up with nazis walking around if you don't have the ability to deter them.

data on my computer == private without a warrant and due cause. data on someone's server == as private as could be, but i don't own it so i can't demand that it be secure from all aspects, especially uploading CSAM.


>people are extremely up-in-arms about something that apple already does, and that every other cloud provider does

>data on my computer == private without a warrant and due cause.

Apple's technology would track you on your phone directly, via AI scanning text messages for nudity. The engine lived on your phone, not in their server farm. That makes the second statement I quoted false. That's why everyone (or at least the ones in the know) was pissed.


Only when syncing photos to iCloud is enabled and the photo is being uploaded. That’s the critical bit.

If it was operating without cloud access I’d agree with you, but it wasn’t so I don’t.

It only makes sense to do this in the context of encrypting the cloud - otherwise it’s a pointless move that hurts PR.


> Only when syncing photos to iCloud is enabled and the photo is being uploaded.

For now.

If a "feature" like this exists, it WILL be abused like many before it.


I don't really buy this. I don't think this feature makes abuse any more likely, and it's important to be loud against features that are actually abusive vs. ones that are merely adjacent to something abusive. The difference matters, particularly when the pragmatic approach could allow encryption of images on iCloud without any actual downside given the constraints. If people act like all actions are equivalent then we'll just end up ignored by legislators. These differences matter.


I don't think this difference is relevant here.

We've seen many technological features that were supposed to be used only in very narrow use cases get abused by governments. Contact tracing technologies (for COVID-19) are the most recent victim: they are now being used by governments to track political protesters, criminals, and seemingly anyone else who catches their fancy.

I've seen this cycle play out enough times to know better. Governments never relinquish power. If client-side scanning that can report users to authorities ever becomes widespread, governments _will_ abuse it. Democracies _will_ suffer and humanity _will_ be worse off for it.


Contact tracing is actually a perfect example.

Early in the pandemic the tech companies came together to create a privacy-preserving Bluetooth exposure notification system that relied on phones and a clever design to prevent the kind of tracking panopticon you describe. See: https://covid19-static.cdn-apple.com/applications/covid19/cu...
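
(A rough sketch of the decentralized idea, just to show why no central server learns who met whom - the real Google/Apple spec derives rolling IDs with AES/HKDF, so treat this Python as illustrative only:)

    # Simplified sketch of decentralized exposure notification: phones broadcast
    # short-lived IDs derived from a daily key, and matching happens entirely on
    # the device against keys voluntarily published by people who test positive.
    # Illustrative only -- the real spec does not use this SHA-256 construction.
    import os, hashlib

    def rolling_ids(daily_key, intervals=144):
        # The short-lived identifiers one phone would broadcast during a day.
        return {hashlib.sha256(daily_key + i.to_bytes(2, "big")).digest()[:16]
                for i in range(intervals)}

    alice_key = os.urandom(16)   # stays on Alice's phone unless she tests positive
    bob_heard = set()            # IDs Bob's phone overheard nearby, stored locally

    # Alice and Bob spend time near each other; Bob records some of Alice's IDs.
    bob_heard |= set(list(rolling_ids(alice_key))[:10])

    # Alice tests positive and publishes only her daily key.
    published_positive_keys = [alice_key]

    # Bob's phone re-derives IDs from the published keys and matches locally;
    # no server ever learns where Bob was or whom he met.
    exposed = any(rolling_ids(k) & bob_heard for k in published_positive_keys)
    print("exposure detected:", exposed)   # True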

The HN response was ignorant and loudly knee-jerk, without understanding how any of this worked. Stuff like that contributed to its failure to get adopted (see: https://news.ycombinator.com/item?id=25629304 and https://news.ycombinator.com/item?id=26957763). This is partly why I ignore HN on topics like this.

Instead, we ended up with governments just doing it via data brokers and other types of way more invasive tracking.

Our response to encryption policy like Apple's that doesn't decrease privacy could lead to a similarly bad outcome.


>Instead, we ended up with governments just doing it via data brokers and other types of way more invasive tracking.

It sounds like you are blaming people's concern for the first technology for the implementation of the second technology. I believe the implementation of the second technology is evidence that the first technology would have been abused eventually anyway.

How about: government, stop tracking/spying on us without a warrant. Apple, don't use my equipment to spy on me without allowing me to turn it off - and make sure the off switch actually works.


> i also think it's like freedom of speech, in theory it sounds great but in reality you end up with nazis walking around if you don't have the ability to deter them.

I mean this in the most gentle and sincere way: This is just fascism of a different color.

When you support the curtailing of freedom of speech, you step onto a slippery slope where new regimes can use that same power against you.

In all but the rarest cases where people are harassed and driven to commit suicide [1], hurtful words are just words. Sticks and stones can break your bones, but words can never hurt you.

If you eliminated all of racism, sexism, and anti-LGBT feelings from the world tomorrow, you would still run into people that dislike you. That's just how people are. We're evolved apes, and a lot of us make in-group/out-group decisions on the most trivial lines. Class, style, designer clothing, attractiveness, rivalry, sports teams, school, etc., etc., etc.

No matter how many layers of protection you put on, you will always run into hate and unpleasantness.

You can't remove one of the most liberating and important features of our democracy - one that anticipates all types of futures and stands against any of their possible tyrannies - just to not have hurt feelings. That would be the mother of all bad trades.

When things get physical, there are laws to protect you. Even hate crime laws that escalate punishments. These are adequate protections.

Your brain is going to have to learn how to dust off the hurt. That's what makes us rugged, fierce, and independent.

I've been called a "f*ggot", bastard (it's true), short, ugly, and many worse, entirely hurtful things throughout my life. I'd like to think I'm doing alright despite it.

[1] Restraining orders and other legal protections for people being abused and bullied, especially online, need to be further developed. That said, there isn't a fine line between what politicians such as Ted Cruz face on a daily basis from Twitter and what pushes individuals such as Near to commit suicide.


For what's it's worse (since we disagree elsewhere in this thread) - I'm with you 100% here.

Free speech is an important principle. The way I put it to people arguing for various forms of suppressing speech is that you can't know you'll be the one with the power to do the suppressing. In a liberal (as in classical liberalism) western society it's a core value. That's why we defend the speech of others even when they say terrible things.

I think there's nuance here with private companies since their ability to moderate is an exercise of their speech, but I think Zuckerberg was generally right in his georgetown address: https://zalberico.com/essay/2020/06/16/mark-zuckerberg-and-f...


* For what it's worth

Not sure how I managed to mangle that so badly.


> you end up with nazis walking around if you don't have the ability to deter them.

In general this is required if you want freedom of speech at all, because while nazis are seen as bad by $majority_general_population today, that same $majority_general_population in 1940's Germany had its own ideas about which people with certain traits were bad. In theory, even if Nazis took over the entire US Federal apparatus today, they couldn't infringe on the rights of the citizens and states as they did back then (at least not without a hostile takeover and civil war).

The entire push against CSAM today comes from a belief that reducing and limiting the proliferation of digital imagery depicting minors being sexually abused will hurt the profit margins of the organizations that run the abduction, human trafficking, and abuse + abuse-imagery production process, by reducing exposure and limiting the potential customer base (while also hopefully helping victims heal, knowing that fewer and fewer people are viewing said imagery). I'm no expert in whether or not this works - perhaps there are studies that do qualitative research on this - but either way there's no doubt that E2EE directly harms the work of the likes of PhotoDNA, which aims at actually solving the issue of file sharing / image hosting services being used for the proliferation of CSAM, whether that be for advertising to new buyers or simply as the means by which these groups share content.
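
(To make that tension concrete, here's a minimal sketch of what provider-side hash matching looks like and why E2EE breaks it. PhotoDNA itself is a proprietary perceptual hash; the SHA-256 and the placeholder hash list below are stand-ins, not the real algorithm or data:)

    # Minimal sketch of provider-side hash matching and why E2EE breaks it.
    # Plain SHA-256 is used only as a stand-in for a perceptual hash.
    import hashlib

    known_bad_hashes = {"0f3c..."}  # placeholder entries from an NCMEC-style list

    def provider_scan(uploaded_bytes):
        # What a provider can do when it sees plaintext uploads.
        return hashlib.sha256(uploaded_bytes).hexdigest() in known_bad_hashes

    # With end-to-end encryption the provider only ever receives ciphertext, so
    # the same check runs over random-looking bytes and effectively never matches
    # the database -- the capability the comment above says E2EE takes away.
    ciphertext = b"\x8f\x1a\x07..."  # opaque to the provider
    print(provider_scan(ciphertext))  # False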

So there really is a tradeoff people have to realize when they consider either side's stance. My point is that HN seems to be almost entirely one-sided in favor of privacy, even when many safeguards are put in place to keep it private while still enabling detection of abuse imagery, and I don't see anyone proposing more practical privacy-respecting solutions than Apple has. Perhaps people think CSAM is a fringe thing, but [0]

> (Interpol) Child Sexual Exploitation Image Database (ICSEDB) network of 53 countries holds over a half a million CEM images, which have helped identify around 11,988 victims and nearly 5,617 offenders over the eight years to December 2017

0: Broadhurst, Roderic via https://openresearch-repository.anu.edu.au/bitstream/1885/21...


> So there really is a tradeoff people have to realize when they consider either side's stance

There is no tradeoff to be made. The US government is explicitly prohibited from interfering with the liberty of its citizens. Every elected official swears an oath to uphold this principle.


The government interferes with the liberty of a person when they incarcerate them. The constitution doesn’t give you a right to liberty at the expense of anyone else’s liberties.



