> This does not prevent any of these threats, and it does not necessarily make them any more difficult. "Insiders" will still have access to the source code doing the encryption, and it is simply not possible to protect against government overreach that can literally force you to do anything and keep quiet about it, even in otherwise relatively sane countries. Search for NSA letter.
There you go again :)
You literally just said something that used to take a subpoena from any law enforcement now takes an NSA letter. And that an insider attack that used to mean retrieving a backup file now means inserting back doors in source code that go undetected.
And somehow those aren't even more difficult?
> Because I have literally used the same compiler I use for other platforms

https://www.awelm.com/posts/evil-compiler/
It is literally provable that Apple will never be able to satisfy you. For any mitigation they introduce, you can (rightfully) create a hole in that mitigation.
What you're missing is that the same flaws and attacks appear in all of your "it would be better if" solutions. Once you're invoking NSA letters and malicious source code, all bets are off... including for open source.
> It just can't academically work.
Yes, we agree on that. But it also doesn't work if you're protecting stuff from Alice by trusting Bob, who might be secretly an agent of Alice.
> You literally just said something that used to take a subpoena from any law enforcement now takes an NSA letter
I didn't say that. You said "overreaching government".
> It is literally provable that Apple will never be able to satisfy you
Nothing _technical_, that is, which has exactly been my point.
> Once you're invoking NSA letters and malicious source code, all bets are off... including for open source.
That's not true at all. There's an entire world of difference between "the software is hidden from my eyes, communicating constantly and opaquely with the mothership, changeable at any moment by that same mothership, and all of it running on hardware also made by the same mothership" and "I have these separate components that communicate only through these channels in these clearly specified ways". The first allows only useless technobabble fake solutions; the second actually allows a discussion about trust, and is usually the bare minimum expectation of any cryptosystem.
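To make the distinction concrete, here is a minimal sketch of the second model in Python. The class names and the XOR routine are illustrative inventions (XOR with a repeating key stands in for a real AEAD cipher and is not secure); the point is only the trust boundary: the key never leaves the client, and the provider's component sees nothing but opaque bytes over one specified channel.

```python
import os

def xor_cipher(key: bytes, data: bytes) -> bytes:
    # Toy placeholder for a real cipher (e.g. AES-GCM). A repeating-key
    # XOR is NOT secure; it only marks where encryption happens.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

class Client:
    """Holds the key; only ciphertext ever crosses the channel."""
    def __init__(self):
        self.key = os.urandom(32)  # generated locally, never transmitted

    def upload(self, server: "Server", plaintext: bytes) -> None:
        server.store(xor_cipher(self.key, plaintext))

    def download(self, server: "Server") -> bytes:
        return xor_cipher(self.key, server.fetch())

class Server:
    """The 'mothership': stores opaque bytes, learns nothing else."""
    def __init__(self):
        self.blob = b""

    def store(self, ciphertext: bytes) -> None:
        self.blob = ciphertext

    def fetch(self) -> bytes:
        return self.blob

client, server = Client(), Server()
client.upload(server, b"backup contents")
assert server.blob != b"backup contents"          # provider never sees plaintext
assert client.download(server) == b"backup contents"
```

Because the interface between the two components is this narrow, you can actually reason about what the server operator can and cannot do; in the opaque single-vendor model there is no such boundary to reason about.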
> But it also doesn't work if you're protecting stuff from Alice by trusting Bob, who might be secretly an agent of Alice.
I don't see that as necessarily true either. But in any case, I can now choose between multiple providers for encryption, which _finally_ goes towards measurably increasing trust. Remember, despite the accusations, I have never claimed it had to be 100% trusting-trust perfect; I am claiming that this one proposal is 100% useless. If you didn't trust Apple backups before but would now, I'd question your judgement.