|
CmdrRiker posted:I love it when people who know nothing about software security think they know everything about software security. https://www.npr.org/2020/02/21/805032627/trump-administration-targets-your-warrant-proof-encrypted-messages surely we can put a back door on this physical combination lock, like a 2nd secret key but make it illegal for criminals to use it!
|
# ? Feb 21, 2020 15:29 |
|
|
# ? Apr 25, 2024 08:57 |
|
Krankenstyle posted:what a bizarre cadence It's kinda his thing. I recommend "rules for rulers", as an aside.
|
# ? Feb 21, 2020 16:06 |
|
Pile Of Garbage posted:i'd bet that the FBI are aware of the security implications of backdoors but they don't care because they want to lock up minorities real bad oh yeah totally, I guarantee they fully know what they're doing, or at least their technical team that tells the spokespeople what to say does
|
# ? Feb 21, 2020 16:14 |
|
exmachina posted:CGP grey has a great video to explain this concept to non technical people: this is not quite the full story though, because it is only about encryption. specifically wrt unlocking a phone there's no actual need to break encryption. if you can provide an unlock code the os accepts, it will decrypt the data for you. this means you could give the government a way to unlock the phone without giving them a way to break the actual encryption.

the phone vendor could theoretically provide a signed OS update that makes the unlock easier, like say hardcoding it in the verification logic or maybe adding some external means to reset it. this update could be provided in a way where it would only work with the specific device, so that it couldn't be used without a warrant on other devices. this would be no different from an accidentally introduced security vulnerability, aside from it being locked to a specific device.

things like apple's security chip may make this impossible if there is no way to update the firmware to introduce an intended vulnerability, but it would certainly be doable on older phones and probably androids. giving government the ability to break everyone's encryption is of course a terrible idea, but encryption is not the only mechanism where a bypass could be introduced
|
# ? Feb 22, 2020 02:22 |
|
Shaggar posted:this is not quite the full story though, because it is only about encryption. specifically wrt unlocking a phone there's no actual need to break encryption. if you can provide an unlock code the os accepts, it will decrypt the data for you. this means you could give the government a way to unlock the phone without giving them a way to break the actual encryption. The thing you are describing is 100% equivalent to "there's a backdoor in the encryption and the backdoor key is the same as the update signing key".
|
# ? Feb 22, 2020 02:38 |
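to make Jabor's equivalence concrete, here's a minimal sketch of a toy update-verification model (all names are hypothetical, not any real vendor's scheme): the device only checks the vendor signature, so nothing cryptographic stops a weakened image signed "for one phone" from being accepted by every phone.

```python
# Toy model: a device installs any firmware image signed by the vendor's
# update key. Illustrative only -- not any real vendor's API.
import hashlib
import hmac

VENDOR_KEY = b"vendor-signing-key"  # held by the manufacturer

def sign(image: bytes) -> bytes:
    """Vendor-side: sign a firmware image with the update key."""
    return hmac.new(VENDOR_KEY, image, hashlib.sha256).digest()

def device_accepts(image: bytes, sig: bytes) -> bool:
    # The device only verifies the signature. It has no way to check
    # whether a court order exists or which phone the image was "meant" for.
    return hmac.compare_digest(sign(image), sig)

weakened = b"firmware-with-retry-limits-removed"
sig = sign(weakened)  # produced once, under a warrant, for one phone...
print(device_accepts(weakened, sig))  # ...but every phone would accept it
```

the per-device restriction lives entirely in vendor policy, not in anything the device itself can enforce, which is the sense in which the signing key *is* the backdoor key.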
|
ok so nobody should ever update their software ever because software updates are a backdoor?
|
# ? Feb 22, 2020 02:45 |
|
this is why iOS updates require the PIN, btw
|
# ? Feb 22, 2020 02:46 |
|
Shaggar posted:specifically wrt unlocking a phone there's no actual need to break encryption In practice this whole debate only concerns the case where the device uses FDE with a key derived from the PIN. Otherwise the govt can just yank the disk out and read the data, easy peasy. The theoretical firmware update thing then refers to removing artificial delays and the "data gets nuked after 10 tries" logic, allowing the PIN to be brute-forced
|
# ? Feb 22, 2020 02:54 |
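a quick sketch of the "remove the delays, then brute-force" point, assuming a key derived from a 6-digit PIN via PBKDF2 (the salt, PIN, and iteration count are made up for illustration):

```python
# Sketch: once retry delays and the wipe-after-10 logic are gone, a key
# derived from a 6-digit PIN falls to plain enumeration.
# PBKDF2 parameters here are illustrative, not a real device's settings.
import hashlib

SALT = b"per-device-salt"

def derive_key(pin: str) -> bytes:
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), SALT, 1000)

target = derive_key("004825")  # the user's real PIN, unknown to the attacker

def brute_force() -> str:
    # only 10**6 candidates exist -- trivial once nothing throttles attempts
    for n in range(1_000_000):
        pin = f"{n:06d}"
        if derive_key(pin) == target:
            return pin
    return ""

print(brute_force())  # prints 004825
```

the firmware never touches the crypto; it just stops defending the tiny PIN keyspace, which is why "unlock assistance" and "breaking the encryption" collapse into the same thing here.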
|
it would seem that some phones decrypt and persist some data after boot, given the lockscreen bypasses we've seen in the past. A locked device needs access to data to provide things like notifications and contact display when you get a text or call. it probably varies pretty widely, but to exclaim that it's impossible for phone manufacturers to provide some form of access without totally breaking all devices is not entirely correct. also fwiw i don't think ios automatic updates require a pin, or at least i don't recall entering one for the last few updates that it did totally on its own.
|
# ? Feb 22, 2020 03:10 |
|
shaggar, take a break from this thread, you're in over your head
|
# ? Feb 22, 2020 03:13 |
|
if you can actually explain how, i'll listen.
|
# ? Feb 22, 2020 03:33 |
|
Everything he's said makes sense though
|
# ? Feb 22, 2020 03:34 |
|
if the software manufacturer can, given a device in hand, decrypt the device without the user's credentials, then the software manufacturer has a backdoor to break the device encryption. like, that's the definition of "having a backdoor".
|
# ? Feb 22, 2020 03:35 |
|
there's a difference between providing a backdoor to a single device given a court order and adding a global backdoor to every device. the idea that it would be impossible for the phone maker to do this without granting access to every device is absurd and outright wrong.
|
# ? Feb 22, 2020 03:40 |
|
If the phone maker has the technical ability to do it for one device, they have the technical ability to use it for every device. There's no way for them to have the technical ability to do it when a court order exists and not otherwise.
|
# ? Feb 22, 2020 03:46 |
|
that's a result of the current state of software and hardware development, not an active intent by the device maker. in the future maybe they can make devices that won't install updates with security flaws, but until then they can legitimately comply with a warrant without introducing any risk to other users of those devices.
|
# ? Feb 22, 2020 03:52 |
|
The fact that the device doesn't know whether or not there's a legitimate court order to decrypt isn't a result of the "current state" of development, it's a fundamentally impossible thing to achieve. Either the device maker can decrypt all devices, or they can decrypt no devices. There's no world in which they are only technically capable of decrypting the ones that have a valid court order.
|
# ? Feb 22, 2020 03:55 |
|
oh so you're saying security is impossible, so we should just backdoor all devices instead of trying to limit the impact. I got you. I think that's a bad idea, but I get it.
|
# ? Feb 22, 2020 04:01 |
|
i'm saying that it should be impossible to decrypt without the user's credential even if you're the device manufacturer and a valid court order exists
|
# ? Feb 22, 2020 04:03 |
|
that's not possible if you allow the device maker to update the software.
|
# ? Feb 22, 2020 04:03 |
|
or until we get much better software and hardware as I previously mentioned
|
# ? Feb 22, 2020 04:04 |
|
you can store the device encryption key in a non-updatable hardware module. you can require software updates to purge the key from ram and require re-unlocking from the hardware module. this is all stuff that's easily achievable today.
|
# ? Feb 22, 2020 04:06 |
|
Yes, the basic problem is that numbers (keys and certs) are just numbers and can't be restricted to a certain person or organization. even TPMs are just numbers that are supposed to be possessed but not known to a user, but you can't guarantee that. The best you could do is maybe use *BLOCKCHAIN* technology to at least have an immutable public record of phones that have been unlocked.
|
# ? Feb 22, 2020 04:11 |
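for what the "immutable public record" half-joke would actually amount to: a hash-chained append-only log of unlock events, so a vendor can't quietly rewrite history. a minimal sketch (entry format and contents are invented):

```python
# Sketch of a tamper-evident log of unlock events: each entry's hash
# commits to the previous entry, so altering a past record breaks the
# chain. Purely illustrative -- not a real transparency-log format.
import hashlib

def chain_append(log: list[dict], entry: str) -> None:
    prev = log[-1]["hash"] if log else "0" * 64
    h = hashlib.sha256((prev + entry).encode()).hexdigest()
    log.append({"entry": entry, "hash": h})

def chain_verify(log: list[dict]) -> bool:
    prev, ok = "0" * 64, True
    for rec in log:
        h = hashlib.sha256((prev + rec["entry"]).encode()).hexdigest()
        ok = ok and (h == rec["hash"])
        prev = rec["hash"]
    return ok

log: list[dict] = []
chain_append(log, "unlocked device serial 12345, warrant A")
chain_append(log, "unlocked device serial 67890, warrant B")

log[0]["entry"] = "nothing to see here"  # silently edit the past...
print(chain_verify(log))  # False: ...and the tamper is detectable
```

this doesn't stop anyone from unlocking a phone, of course; it only makes the record of unlocks auditable after the fact, which is the most the "just numbers" constraint allows.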
|
Jabor posted:you can store the device encryption key in a non-updatable hardware module. maybe it's possible, but nobody has done it yet with any of these consumer devices. until then, providing a per-device backdoor is certainly doable and reasonable.
|
# ? Feb 22, 2020 04:12 |
|
Shaggar posted:until then providing a per device backdoor is certainly doable and reasonable. That's not reasonable at all?
|
# ? Feb 22, 2020 04:41 |
|
mobile device can have a little back door, as a treat
|
# ? Feb 22, 2020 04:42 |
|
Shaggar I get that you're trying to distinguish "broken encryption" from "send to Apple, Apple flashes the firmware to bypass authentication," but it ends up being a distinction without a difference when you think through the realistic ways in which each would be implemented
|
# ? Feb 22, 2020 04:50 |
|
can't flash compromised firmware without rebooting, so the only thing the firmware could do to help break fde is disable retry timers
|
# ? Feb 22, 2020 04:59 |
|
ya that's one reason it's a distinction without a difference
|
# ? Feb 22, 2020 05:12 |
|
Achmed Jones posted:Shaggar I get that you're trying to distinguish "broken encryption" from "send to Apple, Apple flashes the firmware to bypass authentication," but it ends up being a distinction without a difference when you think through the realistic ways in which each would be implemented it's totally different. in this scenario apple is not breaking encryption at all, they're granting access to data that they don't encrypt: things like contacts and texts and other things that sit in memory because they need to be accessible via the lockscreen. lockscreen bypasses get access to that data without having to decrypt it via a PIN, and there's no reason apple couldn't grant access to that at a minimum via an intentional bypass.

such a bypass would not compromise ios any more than it is already compromised by apple's decision to promote convenience over security (the ability to access data while the screen is locked). creating a lock screen bypass for that data for a single phone does not reduce the security of any other phone, since the basis for the exploit already exists on every phone. if we assume there are no intentional flaws in the normal ios delivered to all devices, then that data is still as secure as it was before apple released the custom flawed firmware for a single phone.

if apple wants to fix this by always encrypting all the data when the phone is locked then they should do that, and they should not be legally prevented from doing so. they won't ever do this though, since it would make a locked phone far less convenient to use, even if more secure. as long as they do not change that, this data exists unencrypted on the phone, and this:

Jabor posted:i'm saying that it should be impossible to decrypt without the user's credential even if you're the device manufacturer and a valid court order exists

will be impossible.
|
# ? Feb 22, 2020 05:19 |
|
Shaggar posted:its totally different. in this scenario apple is not breaking encryption at all, they're granting access to data that they don't encrypt. things like contacts and texts and other things that sit in memory because they need to be accessible via the lockscreen. lockscreen bypasses get access to that data without having to decrypt it via a PIN and theres no reason apple couldn't grant access to that at a minimum via an intentional bypass. In a good design, that data only exists in transient storage, and would be purged (and need to be re-unlocked from encrypted storage) if the device is restarted for an update. Note that this is exactly how Android devices work - if you restart the device (for an update or for any other reason), you get nothing on the lock screen until the device has actually been unlocked that boot.
|
# ? Feb 22, 2020 05:24 |
|
I wish your posts were encrypted
|
# ? Feb 22, 2020 05:27 |
|
You're trying to argue "well the data already exists in memory so can't you just pull the RAM???" and if it does exist in memory, yes, you can. That's not what we're discussing.

If the manufacturer decrypts your contacts on boot before you enter your passcode, that's a pretty massive design flaw, and I am skeptical that any modern design does this. Until the device is unlocked, the system should only display "missed call" or "message received" or some such. Non-sensitive data such as "when are the upcoming alarms set for" can be stored unencrypted without issue, which is why yes, alarms will work.

Look, just pretend that "key that only the good guys have" is Linux and "device encryption is all or nothing by design" is Windows.
|
# ? Feb 22, 2020 05:48 |
|
Volmarias posted:You're trying to argue "well the data already exists in memory so can't you just pull the RAM???" and if it does exist in memory, yes, you can. That's not what we're discussing. If the manufacturer decrypts your contacts on boot before you enter your passcode, that's a pretty massive design flaw, and I am skeptical that any modern design does this. Until the device is unlocked, the system should only display "missed call" or "message received" or some such. Non sensitive data such as "when are the upcoming alarms set for" can be stored unencrypted without issue, which is why yes, they'll work. yeah I just tested this and until you enter your passcode iOS just doesn't know anything about contacts
|
# ? Feb 22, 2020 05:56 |
|
lol at autodecrypting contacts on boot
|
# ? Feb 22, 2020 06:15 |
the secfuck is coming from inside the thread
|
|
# ? Feb 22, 2020 06:30 |
|
Skim Milk posted:the secfuck is coming from inside the thread shaggared again
|
# ? Feb 22, 2020 06:47 |
|
Buff Hardback posted:yeah I just tested this and until you enter your passcode iOS just doesn't know anything about contacts cool with that we can end this dumb derail
|
# ? Feb 22, 2020 06:48 |
Is it worth mentioning that medical ID and the emergency contacts are available without supplying a PIN?
|
|
# ? Feb 22, 2020 09:39 |
|
|
No. Those are things the user intentionally stores in a place that is accessible without unlocking, and there is, very intentionally, no need to break into anything to read them.
|
# ? Feb 22, 2020 11:51 |