|
ryanrs posted:The SX1276 can ID in morse code if you switch it to FSK modulation.

Do techie vikings use Norse Code?
|
# ? Dec 4, 2024 00:52 |
|
ryanrs posted:The unlicensed jungle of 915 MHz ISM!

are you allowed to use encryption on that band
|
# ? Dec 4, 2024 13:30 |
|
I did 04 silver also without regex, by iterating over the indexes. With optimizations I could barely get it to run as fast as the regexp version. Seems regex FTW.
|
# ? Dec 4, 2024 13:58 |
|
i wanna solve 04 gold with just regex
|
# ? Dec 4, 2024 14:19 |
|
ryanrs posted:Yeah, that's right: I'm flipping the AES primitives around, using AES ECB decrypt to produce cipher text, and AES ECB encrypt to decrypt it. I think this is ok for AES, and maybe for all secure block ciphers? What's the name of this property so I can google it?

The one notable example I can think of is 3DES, which is defined as three rounds of DES: encrypt (K1), decrypt (K2), and encrypt again (K3). Apparently it was found that using the decrypt function in round 2 provided greater security with 112-bit keys (where K1 = K3). The only issue with AES is that decrypt can be a slower operation than encrypt since the key schedule has to be used in reverse, and on a memory-constrained microcontroller or in a hardware implementation, that may require recomputing the entire key schedule each round. This may be the reason why your micro doesn't support hardware decryption (in addition to not being necessary for commonly-used modes these days).

ryanrs posted:Running AES backwards suits my application because it is very decrypt-heavy (since it doesn't know who sent the packet before decrypting it, so many trial decryptions that fail).

Personally I'd just use CTR mode. If you don't like the idea of sending an obviously-sequential nonce with each packet you could use a second AES CTR instance to generate CSPRNG nonces from a node-internal counter. It would double the number of AES operations you have to do for encryption (in hardware!), but uses the same AES CTR decryption mode.

ExcessBLarg! fucked around with this message at 15:23 on Dec 4, 2024 |
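A minimal sketch of the CTR construction being suggested, with a hash-based stand-in for the AES-ECB block primitive (this is not real AES and not secure; it only illustrates the structure). The point is that CTR only ever runs the block cipher in the forward direction, so encryption and decryption are the same operation — which is also why hardware that only implements AES *encrypt* is enough:

```python
import hashlib

def block_encrypt(key: bytes, block: bytes) -> bytes:
    # Stand-in for a 16-byte block cipher (NOT real AES; a real build
    # would invoke the hardware AES-ECB encrypt primitive here).
    return hashlib.sha256(key + block).digest()[:16]

def ctr_keystream(key: bytes, nonce: bytes, nblocks: int):
    # CTR mode: encrypt nonce||counter for successive counter values.
    for ctr in range(nblocks):
        yield block_encrypt(key, nonce + ctr.to_bytes(8, "big"))

def ctr_xcrypt(key: bytes, nonce: bytes, data: bytes) -> bytes:
    # Encryption and decryption are the same XOR-with-keystream step,
    # so only the block cipher's forward direction is ever needed.
    nblocks = (len(data) + 15) // 16
    stream = b"".join(ctr_keystream(key, nonce, nblocks))
    return bytes(a ^ b for a, b in zip(data, stream))

key, nonce = b"k" * 16, b"n" * 8
msg = b"sensor update: 23.4 C"
ct = ctr_xcrypt(key, nonce, msg)
assert ctr_xcrypt(key, nonce, ct) == msg  # same call decrypts
```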
# ? Dec 4, 2024 14:59 |
|
vanity slug posted:i wanna solve 04 gold with just regex

How do you do 2D regex? I considered creating substrings of 9 from each index and joining them with a space, but that would take 9x more memory. Oh no...
|
# ? Dec 4, 2024 15:08 |
|
ExcessBLarg! posted:So this seems, fine, but why do you really want to use ECB anyways? It has the really obvious flaw that you're going to keep sending the same ciphertext whenever your sensor updates don't have updated payloads--unless, I guess, you simply don't send them at all.

My code isn't straight ECB, but uses the ECB primitive. But I think I am using ECB in a broadly similar way that CTR uses ECB. Specifically, my 16-byte plaintext blocks never repeat. Each block is guaranteed unique (for the duration of the session key).

I'm trying to keep these update packets as small as possible, and independently decryptable (since there is ~10% packet loss). I think 16 bytes is as small as I can go? If I go smaller, besides issues with block size, there are big problems distinguishing failed decryptions or even just noise. I don't need the certainty of a 16-byte HMAC, but a 4 byte magic number is probably not enough. I can live with 1 byte of sensor data, although my current code sends 4 bytes. The other 12 bytes have other header fields, including a 32-bit sequence number. I create a new AES session key every boot (and occasionally during runtime), so the sequence numbers will not repeat with the same key.

The benefits of this weirdo mode disappear once you are sending more than 1 or 2 blocks per packet. At that point just use CTR or some other standard mode.
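For concreteness, here is a hypothetical 16-byte block layout along the lines described above. The field names, widths, and magic value are my assumptions, not the actual format: 12 bytes of header (including the 32-bit sequence number that makes each block unique under a given session key) plus 4 bytes of sensor data.

```python
import struct

# Hypothetical layout (assumed, not the real packet format):
# magic (u32), node_id (u16), flags (u16), seq (u32), sensor (u32).
FMT = ">IHHII"  # big-endian, totals 16 bytes
assert struct.calcsize(FMT) == 16

def pack_update(node_id, flags, seq, sensor, magic=0xA5A55A5A):
    # One independently decryptable 16-byte plaintext block.
    return struct.pack(FMT, magic, node_id, flags, seq, sensor)

blk = pack_update(node_id=7, flags=0, seq=1234, sensor=2340)
assert len(blk) == 16
```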
|
# ? Dec 4, 2024 18:33 |
|
Captain Foo posted:are you allowed to use encryption on that band It would be rude to poo poo up 70cm, so I'd probably do my business on 460 MHz.
|
# ? Dec 5, 2024 00:14 |
|
Captain Foo posted:are you allowed to use encryption on that band encryption? why no, officer, this is just powerful narrowband noise
|
# ? Dec 5, 2024 00:53 |
|
All this unlicensed radio band chat is making me wanna watch pump up the volume
|
# ? Dec 5, 2024 01:12 |
|
ryanrs posted:Example update packet, which should be short:

To elaborate on the 'verification bits' idea, below is my packet acceptance code. It rejects any packet with implausible values. code:
The question is, can an adversary manipulate a ciphertext to preferentially affect certain bits in the plaintext? For example, given a valid encrypted packet copied off the airwaves, can an attacker modify the ciphertext in such a way as to affect only the last 4 bytes in a block, and not the first 12? Since my algorithm only verifies certain bits, not a full hash of the message, such an attack could target my information bits and avoid the verification bits. Mess up the sensor value without spoiling the sequence number.

I know good block ciphers try to maximize the avalanche effect. And I think if the above approach was practical, it would also break AES CTR, since CTR usually only flips a bit or two of the AES input when the counter increments.

Note that CTR mode does not create this error avalanche, at all, because it's an XOR. You absolutely need a separate HMAC with CTR mode, since an adversary can trivially flip a plaintext bit just by flipping the corresponding ciphertext bit. It's a 1-to-1 mapping, with no scrambling at all. My code really, really needs that scrambling.
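The bit-flipping claim about XOR stream ciphers is easy to demonstrate. A toy sketch (a random keystream standing in for the AES-CTR keystream): flipping ciphertext bit i flips exactly plaintext bit i, with zero avalanche, which is why CTR mode needs a MAC on top.

```python
import os

keystream = os.urandom(16)
plaintext = b"seq=0042 temp=23"
ciphertext = bytes(p ^ k for p, k in zip(plaintext, keystream))

# Attacker flips one bit in the last ciphertext byte, no key needed.
tampered = bytearray(ciphertext)
tampered[-1] ^= 0x01

recovered = bytes(c ^ k for c, k in zip(tampered, keystream))
assert recovered[:-1] == plaintext[:-1]       # first 15 bytes untouched
assert recovered[-1] == plaintext[-1] ^ 0x01  # exactly one bit changed
```

With a block cipher used directly (as in the scheme above), the same single-bit change to a ciphertext block garbles the entire 16-byte plaintext block instead, so the verification bits get spoiled along with the information bits.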
|
# ? Dec 5, 2024 07:32 |
|
What's your threat model again?
|
# ? Dec 5, 2024 09:10 |
|
ryanrs posted:The question is, can an adversary manipulate a ciphertext to preferentially affect certain bits in the plaintext?

the answer to this question in isolation is “no”. there is no way to do this unless you have the key. your system as described is adequately secure in that no attacker, however sophisticated and however much they love math, is going to start by breaking the encryption.
|
# ? Dec 5, 2024 09:34 |
|
ryanrs posted:The question is, can an adversary manipulate a ciphertext to preferentially affect certain bits in the plaintext?

But really, the conclusion over the past two decades is that if your threat model includes potential adversarial manipulation of the ciphertext, you need AE/a MAC, and (as you mentioned), having such in place is what makes CTR mode/XOR stream ciphers acceptable. So why can't you do a MAC again? It doubles the size of your update packets? Lack of compute?

ryanrs posted:I know good block ciphers try to maximize the avalanche effect. And I think if the above approach was practical, it would also break AES CTR, since CTR usually only flips a bit or two of the AES input when the counter increments.
|
# ? Dec 5, 2024 12:52 |
|
Quackles posted:What's your threat model again? Joe in accounting who's 60 and Matt the 19 year old intern in marketing
|
# ? Dec 5, 2024 13:26 |
|
What ciphers are used commercially for very short packets, like 10 or 20 bytes including overhead? I knew about KeeLoq, the (bad) cipher used in automotive keyless entry systems. Now that I'm reading through the details, I see many eerie similarities to my scheme. Microchip: Introduction to Ultimate KEELOQ Technology What other publicly-described systems can I read about?
|
# ? Dec 5, 2024 18:23 |
|
General-purpose RPC server running on the sensor nodes, lol. code:
I'll probably have the nodes check for activity every few minutes, and if they see a paging signal, go into interactive mode, with a multi-hour inactivity timeout. Power usage is 2 mA @ 3.7V, running a 64 MHz Cortex-M4.
|
# ? Dec 12, 2024 04:54 |
|
TDMA is the usual MAC for these sorts of simple star topology sensor applications, isn't it?
|
# ? Dec 12, 2024 05:19 |
|
Because of my power budget and small number of nodes, there's not much point in thinking about contention. Just pretend the channel is always free.
|
# ? Dec 12, 2024 05:31 |
|
Yeah, that's the point. Sync to the base station beacon packet and you can crank down your node radio's duty cycle as low as you want. Wake up just long enough to receive the scheduled beacon and then wake up again for your TDMA slot if the beacon says you have a message waiting (or you have a message to transmit). It's easier if your radio has a dedicated enable pin that you can hook up to a timer on the MCU but it's still possible without.
|
# ? Dec 12, 2024 05:37 |
|
I'm not sure what the beacon gets me. Instead of sending a beacon packet, just send the packet.
|
# ? Dec 12, 2024 06:03 |
|
you've weighed the beapros and beacons
|
# ? Dec 12, 2024 06:15 |
|
There is a state machine that controls the radio (written in Lua). It can tune the radio, check for a carrier, receive packets, and transmit packets. If you want, you can monitor several frequencies, and if you're fast about it, catch a packet being sent on any one. You just need to configure the transmitter to send a giant preamble that is longer than the time it takes for your receiver to go through its entire channel list. Kinda gross if you get carried away with it, though.
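A rough way to size that "giant preamble" (all numbers here are illustrative assumptions, not from the project): the preamble has to outlast one full pass of the receiver's channel scan, plus some margin, so the scan is guaranteed to land on the active channel while the preamble is still in the air.

```python
def min_preamble_ms(n_channels: int, dwell_ms: float, margin: float = 1.5) -> float:
    # Transmit preamble must cover one full scan pass (n_channels
    # dwells), padded by a safety margin for retune/settling time.
    return n_channels * dwell_ms * margin

# E.g. scanning 4 channels at 10 ms per dwell needs a >= 60 ms preamble.
assert min_preamble_ms(4, 10.0) == 60.0
```

This also shows why it gets gross as the channel list grows: required preamble (and hence airtime and power) scales linearly with the number of monitored frequencies.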
|
# ? Dec 12, 2024 06:16 |
|
Quackles posted:you've weighed the beapros and beacons
|
# ? Dec 12, 2024 15:09 |
|
Maybe I'm misunderstanding your use case then. I'm assuming your base station has "sufficient" power available and can keep its radio on constantly, your nodes are battery limited, and every node can directly talk to every other node (or at the very least every node can talk directly to the base station).

Back when I did wireless sensor projects, the radios on the nodes would consume significant power if they were constantly receiving, so the idea is to run the receiver on a low duty cycle. Transmit a beacon packet at regular intervals, say 10ms, then once a node receives its first beacon it synchronizes a timer to that beacon and switches on its radio just long enough to receive the beacon at the expected time. Then each node has its own TX and RX time slots allocated in the time interval (frame) between beacons, so you don't get collisions. If you have something to report to the base station then you schedule the radio to transmit during your slot. If the base station has something to say to you then it will announce that in the beacon's contents, so you'll switch on your receiver for the duration of your receive slot as well. That way your radio spends most of its time asleep and therefore reduces its power consumption, while still exhibiting reasonably low latency if your frames are short enough.

Now, this was just a masters project but it's not a particularly complicated MAC, I'm sure real products aren't that much more complicated.
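A sketch of that beacon-synchronized TDMA frame (the slot layout and all timing numbers are illustrative assumptions, not from the project described above): each node's radio is on only for the beacon every frame, plus its own slots when the beacon says there is traffic.

```python
FRAME_MS = 10.0   # beacon interval (assumed)
BEACON_MS = 1.0   # radio-on time to receive the beacon (assumed)
SLOT_MS = 1.0     # per-node slot length (assumed)

def node_slots(node_index: int):
    # Each node owns a TX slot then an RX slot, laid out after the beacon.
    tx = BEACON_MS + node_index * 2 * SLOT_MS
    rx = tx + SLOT_MS
    return tx, rx

def duty_cycle(has_traffic: bool) -> float:
    # Radio-on fraction per frame: always the beacon, plus both slots
    # only when the beacon announces (or the node has) a message.
    on = BEACON_MS + (2 * SLOT_MS if has_traffic else 0.0)
    return on / FRAME_MS

assert duty_cycle(False) == 0.1  # idle node: beacon only, 10% duty cycle
assert duty_cycle(True) == 0.3   # active frame: beacon + TX + RX slots
```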
|
# ? Dec 12, 2024 15:27 |
|
I'm doing a star network, where the nodes only talk to the base. Rough numbers:

- 10ms to check for channel activity (DSP math to detect below the noise floor)
- 100ms to receive a minimal 16-byte packet
- 1,000ms for full size 200 byte payload + 40 byte crypto
- 244 bytes/sec raw radio rate

...and I just realized I didn't mention that the nodes each listen on a different frequency. Beacons would make a lot of sense if they were all on the same channel. For the uplink, all nodes transmit on the same freq. The base runs its radio at high duty cycle, so nodes transmit with minimum preamble. If I had high channel utilization, collisions would be a problem. But my channel utilization should be <<1%.
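As a sanity check on those rough numbers: at 244 bytes/sec raw, the payload airtimes come out close to the quoted figures once preamble and framing overhead are added on top.

```python
RATE_BPS = 244  # raw radio rate in bytes per second (from the post above)

def airtime_ms(nbytes: int) -> float:
    # Payload-only airtime; preamble/sync overhead comes on top.
    return nbytes / RATE_BPS * 1000.0

print(round(airtime_ms(16)))   # minimal 16-byte packet: ~66 ms payload (~100 ms with overhead)
print(round(airtime_ms(240)))  # 200-byte payload + 40 bytes crypto: ~984 ms (~1,000 ms quoted)
```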
|
# ? Dec 12, 2024 16:36 |
|
Hmm, yeah if I had 100+ nodes, the current design would suck, esp with thundering herd issues on the uplink. And more broadly, I am not getting good performance in terms of samples/sec/mW across the network. OTOH, 100 devices would require a hardware re-spin and contract manufacturing, at minimum. And a bigger microcontroller for the base, because Lua couldn't track a hundred nodes with the current RAM. So it's not as if a smarter protocol would actually let me deploy 100 nodes. (I don't want to manually recharge a hundred nodes, either.)
|
# ? Dec 12, 2024 16:48 |
|
sheesh, I'm really leaning on "it's ok if it's bad and dumb because it's a hobby project"
|
# ? Dec 12, 2024 16:53 |
|
ryanrs posted:sheesh, I'm really leaning on "it's ok if it's bad and dumb because it's a hobby project"

well its true we are gonna get some contractors to write us a new frontend instead of the 9yo angularjs hunk of coal, so ive been documenting the API & lol, its basically already a hack of an earlier system so im making real sure to write "do not use any fields in these json objects unless they are documented in this document" so it doesnt calcify further
|
# ? Dec 12, 2024 18:47 |
|
I'm also seeing the limits of Lua. You can't shove nearly as much functionality in 256k of RAM with Lua as you can with C/C++. Maybe 1/10th? That's obviously a very hand-wavey number, but a 256k microcontroller is a big chip, yet it feels pretty cramped.

But hacking together interactive UI stuff and state machines with Lua coroutines is so much easier and more fun than writing a huge C event loop. It's a good tradeoff for my hobby projects, but for a commercial product, that performance/RAM/functionality/$ tradeoff is bad. It is better to have your devs spend more time beating themselves up in C so you can ship a cheaper product with a smaller micro.

The thing that would swing the economics in favor of Lua, even for some commercial projects, is if microcontroller RAM was dominated by something else, something so big that Lua's memory inefficiency didn't matter. For example, graphics. Smartwatches that are too small for Linux, but have graphics, will need more RAM. Edge AI (if it's not bullshit) might also drive larger microcontroller RAM.
|
# ? Dec 12, 2024 19:03 |
|
What I am getting out of this Lua framework is moving all my dumb little projects onto a properly managed, shared codebase.

1. Nordic toy project (USB I/O device)
2. RP2040 toy project (4-key macropad)
3. Nordic complex project (this sensor network)
4. Nordic complex project (bear radar device)
...
7. LED sign project, not Lua.

The LED sign needs to be integrated into the MQTT network somehow, though the Teensy 3.2 microcontroller is too small to run Lua in addition to LED Shader Language (created for this one project as a joke, but turned out to be fun).

So that's the goal: all my dumb projects, with up-to-date code, alive and running, even if it's just displaying the time or announcing my mail.
|
# ? Dec 12, 2024 19:30 |
|
kinda just realizing now how many of my projects are ever more elaborate ways of blinking leds
|
# ? Dec 12, 2024 20:01 |
|
ryanrs posted:But hacking together interactive UI stuff and state machines with Lua coroutines is so much easier and more fun than writing a huge C event loop

least unhinged rust fanboy rn:
|
# ? Dec 12, 2024 20:13 |
|
that's literally lua's whole deal, is when C people were like "gui in C sucks so loving bad im making a new language about it"
|
# ? Dec 12, 2024 20:32 |
|
I don't know enough about Rust to get it, sorry. But do tell me about Rust. As a compiled language, it probably fits onto a microcontroller better than Lua, once you start writing a lot of code? Lua is amazing for interfacing with terrible Arduino C++ code. How is Rust in this regard? I've heard Rust is designed for systems programming, but also that it's pedantic and bitchy about...borrowing things? I dunno!
|
# ? Dec 12, 2024 20:32 |
|
gonadic io posted:that's literally lua's whole deal, is when C people were like "gui in C sucks so loving bad im making a new language about it" I also considered python, tcl, and javascript.
|
# ? Dec 12, 2024 20:35 |
|
ryanrs posted:I don't know enough about Rust to get it, sorry. it’s not my job to educate you about rust
|
# ? Dec 12, 2024 21:15 |
|
missed scrivener opportunity imo
|
# ? Dec 12, 2024 21:16 |
|
Another nice thing about Lua is that it's super easy to embed in a C++ program, even if you don't know any Lua at all. I know that sounds like an exaggeration, but it's not. In fact, I still barely know Lua. I have to look up how to write a for loop when I need one. I just learned about the obj:func() syntax yesterday. Somehow it doesn't matter, and I find writing Lua is faster for me than C++.
|
# ? Dec 12, 2024 22:54 |
|
|
it's literally the whole point of the language. it was created, designed, and implemented specifically to be embedded into c++ programs.
|
# ? Dec 12, 2024 23:04 |