EssOEss
Oct 23, 2006
128-bit approved
Is there a standard data format for encrypting a blob of binary data? I know of XML encryption for encrypting XML and of JWE but they both require binary data to be base64-encoded for representation (or some custom enveloping mechanism to stick the data "off to the side"). All I want is to encrypt some bytes and stick a binary header in front and to do it in a standard way that does not require any reinvention of wheels.
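To illustrate, this is roughly the wheel I would rather not reinvent - a minimal sketch in Python (using the cryptography package), where the magic bytes and header layout are entirely made up for the example rather than taken from any standard:

```python
import os
import struct
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

MAGIC = b"ENC1"  # made-up format identifier, not any standard

def encrypt_blob(key: bytes, plaintext: bytes) -> bytes:
    # Binary header (magic + nonce length + nonce) followed by AES-GCM ciphertext.
    nonce = os.urandom(12)
    ciphertext = AESGCM(key).encrypt(nonce, plaintext, MAGIC)
    return MAGIC + struct.pack(">B", len(nonce)) + nonce + ciphertext

def decrypt_blob(key: bytes, blob: bytes) -> bytes:
    # Mirror of the ad-hoc header above; any change to the layout breaks compatibility.
    assert blob[:4] == MAGIC
    nonce_len = blob[4]
    nonce = blob[5:5 + nonce_len]
    return AESGCM(key).decrypt(nonce, blob[5 + nonce_len:], MAGIC)
```

It works, but every project that does this invents its own header, which is exactly the problem.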

EssOEss
Oct 23, 2006
128-bit approved
quote:

Mozilla’s CA team has lost confidence in the ability of WoSign/StartCom to faithfully and competently discharge the functions of a CA. Therefore we propose that, starting on a date to be determined in the near future, Mozilla products will no longer trust newly-issued certificates issued by either of these two CA brands.

I hope this is a sign of things to come! CAs have gotten away with ridiculous poo poo for far too long.

EssOEss
Oct 23, 2006
128-bit approved

Cugel the Clever posted:

If not, what are the go-to solutions for sensitive, routine business communications?

What do you want to protect against? A state actor intercepting it? Probably not. A malicious employee stealing confidential data? That can be tricky - just ensure that sensitive data is behind lock and key and ACL (though not so much that it actually interferes with regular business). Some random 3rd party who uses a wide-spectrum attack? Ensure that your employees use strong passwords if accessing external services storing sensitive data, ideally multi-factor auth, or don't even keep that sensitive data externally. Idiot employees running trojans? Turn on Windows Defender and hope you are not targeted with a custom attack that's not in the signatures yet. Microsoft ATP or whatever it was called offers additional protection against that. (A...dvanced Threat Protection maybe?)

The threat model is a very important part of any security design, so there really is no right answer.

EssOEss
Oct 23, 2006
128-bit approved
We did some PCI relevant stuff with USD. They were not retards, though I cannot vouch for them any deeper than that as I was not too closely involved.

EssOEss
Oct 23, 2006
128-bit approved
LessPass is reinventing password managers! Leaking the password via displaying magic images is a cool innovation.

EssOEss
Oct 23, 2006
128-bit approved
Last I tried changing my PayPal password, I could not even paste a password into the textbox because they disabled paste...

EssOEss
Oct 23, 2006
128-bit approved
Reminds me of when I used to work on a government contract and the tax department happily furnished us with access to their testing database... which was a copy of the production database. Yep, everyone's tax details right there to touch, names and all included. I didn't even have to uniquely identify myself to access them.

The border patrol did the exact same thing. I wonder if this is common practice all over...

EssOEss
Oct 23, 2006
128-bit approved
Speaking of UAC, today I ran into the fact that under some circumstances starting a WinRM remote session rejects you if you are a local admin other than the built-in Administrator account. The internet says to set something called LocalAccountTokenFilterPolicy and then it works. Yep, sure does.

What is this policy, though? Even Microsoft's own documentation tends to leave it implicit on the pages where it is mentioned. I get the feeling it is somehow related to UAC but what exactly is this setting that I am disabling?
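For reference, this is the registry change in question - a hedged sketch using Python's winreg on the target machine, reflecting my understanding that setting the value to 1 turns off the remote token filtering rather than UAC itself:

```python
import winreg

# Hedged sketch: create/set LocalAccountTokenFilterPolicy = 1 so that local admin
# accounts other than the built-in Administrator receive a full (unfiltered) token
# over remote connections such as WinRM. As far as I can tell this only disables
# the remote "token filtering", not UAC prompts for interactive logons.
key = winreg.CreateKeyEx(
    winreg.HKEY_LOCAL_MACHINE,
    r"SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System",
    0,
    winreg.KEY_SET_VALUE,
)
winreg.SetValueEx(key, "LocalAccountTokenFilterPolicy", 0, winreg.REG_DWORD, 1)
winreg.CloseKey(key)
```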

EssOEss
Oct 23, 2006
128-bit approved

Furism posted:

Is this just me imagining things or is it ok to store (non-confidential) files outside of my user's home?

As long as the ACLs on these directories are configured according to your needs and any software that you have running does not go looking for these files elsewhere, sure go right ahead.

EssOEss
Oct 23, 2006
128-bit approved
There is a "use PIM" option if you want to customize the number of key derivation iterations it does (and, by lowering it, make yourself less resistant to brute-force attacks).
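Assuming this is about VeraCrypt's PIM, its documentation gives a linear relationship between PIM and iteration count, so the trade-off is easy to eyeball - a quick sketch of the documented formulas (treat the exact numbers as my reading of the docs, not gospel):

```python
# Hedged sketch of VeraCrypt's documented PIM-to-iterations mapping.
def veracrypt_iterations(pim: int, system_encryption: bool) -> int:
    if system_encryption:
        return pim * 2048
    return 15000 + pim * 1000

# A non-system volume with PIM 485 lands on the documented default of 500,000 iterations.
print(veracrypt_iterations(485, system_encryption=False))
```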

EssOEss
Oct 23, 2006
128-bit approved

Furism posted:

Can't you load the private key in Wireshark and still decrypt it on the fly? Genuine question, as I've only done it with recorded HTTPS myself.

This is the attack that forward secrecy prevents, right?

EssOEss
Oct 23, 2006
128-bit approved
Is your username something that they might confuse with their own? Maybe there is another Andrew Smash out there who does not realize he is trying to log into your account.

I would not bother doing anything about it, unless you mind the emails that much.

EssOEss
Oct 23, 2006
128-bit approved

ohgodwhat posted:

no idea how to improve my security.

Use a password manager, use unique randomly generated passwords for everything except the password database itself, use multifactor authentication via a physical token or Google Authenticator or equivalent. Having done all that, you might begin to feel safe from password theft (you can still get social engineered but not much protects against that).
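To illustrate what the "Google Authenticator or equivalent" part boils down to, here is a hedged sketch of a standard RFC 6238 TOTP code computed from a shared secret (the base32 secret below is a made-up example, not anything real):

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, digits: int = 6, period: int = 30) -> str:
    # Standard RFC 6238: HMAC-SHA1 over the current 30-second counter,
    # dynamically truncated to a short decimal code.
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // period
    mac = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F
    code = (struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)

print(totp("JBSWY3DPEHPK3PXP"))  # example secret, prints a 6-digit code
```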

Of course, the above has no relation to the Instagram event - I do not know what happened there but if you suspect your passwords are compromised, change them.

EssOEss
Oct 23, 2006
128-bit approved
I recommend KeePass with Google Drive cloud sync of the password database. FolderSync works great on Android for this (the Drive app sync was pretty broken last time I tried it). No browser integration, just auto-type and clipboard on PC and the KeePass keyboard on Android.

Turn off "Safe file writes" or whatever it is in KeePass options or sometimes Drive will think you deleted the password database instead of saving it (because it does a SaveAs->DeleteOld->RenameNew sequence).

Also disable the "press enter after typing password" default option, to stop you from publicly tweeting your password in case you accidentally activate auto-type somewhere you should not.

EssOEss
Oct 23, 2006
128-bit approved

Thermopyle posted:

KeePass2Android syncs to Drive or Dropbox automatically, no need for another program to do it.

I remember trying it, but there was some reason I did not use the built-in stuff, though I have totally forgotten what it was. Did it perhaps require network connectivity (it did not sync, just downloaded from Drive)?

EssOEss
Oct 23, 2006
128-bit approved
The crux of the matter is really that SSL rolls off the tongue far more easily than TLS. The latter is just uncomfortable to voice. Therefore, TLS shall be known as SSL until the end of days.

EssOEss
Oct 23, 2006
128-bit approved
That's a joke repo. The Git equivalent of a fork bomb.

EssOEss
Oct 23, 2006
128-bit approved
Estonian and Hungarian ID cards use Infineon RNG and are now compromised. So, uh, pay 50000€ to be able to brute force a legally binding signature of anyone whose public key you have. Nice.

EssOEss
Oct 23, 2006
128-bit approved
You can paste a key here to check it: https://keychest.net/roca
Another site is https://keytester.cryptosense.com/

Ars Technica says it is Estonia and Slovakia that are vulnerable (I misremembered the second one earlier). I did find the Portuguese cards listed on Gemalto's website. As Gemalto was the provider of the Infineon-manufactured cards to Estonia, there is some cause to suspect a link here, indeed.

Double Punctuation posted:

Infineon are the guys who just got their TPM chips hacked. Pretty nice.

Yeah, the key generation vulnerability that affects the TPMs is exactly the same one as for the ID cards. In both cases, the flawed Infineon library generates RSA keys whose primes have a special structure, which makes the keys far easier to factor than they should be.
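As an illustration of what the checker sites above look for, here is a rough sketch of the published fingerprint idea - a simplified version, not the official detector, and the prime list is only an illustrative subset: vulnerable moduli are always a power of 65537 modulo each of a fixed set of small primes.

```python
# Illustrative subset of the small primes used by the published ROCA fingerprint.
SMALL_PRIMES = [11, 13, 17, 19, 37, 53, 61, 71, 73, 79, 97, 103, 107, 109, 127]

def in_subgroup_generated_by_65537(n: int, p: int) -> bool:
    # Membership test: is n mod p among {65537^k mod p}?
    residues = set()
    x = 1
    while True:
        residues.add(x)
        x = (x * 65537) % p
        if x == 1:
            break
    return n % p in residues

def looks_like_roca(modulus: int) -> bool:
    # A legitimate random modulus fails at least one of these checks with high probability;
    # the vulnerable Infineon-generated moduli pass all of them.
    return all(in_subgroup_generated_by_65537(modulus, p) for p in SMALL_PRIMES)
```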

EssOEss
Oct 23, 2006
128-bit approved
I might agree that Gibson is a bit opinionated but he provides tools and services that can be very useful. I will always respect him for providing a free port scan service in TYOOL 1999 when I was a young idiot kid who knew nothing but could at least scan his own ports thanks to Steve. The "Gibson is a fool" bandwagon is rather a short bus - don't get on it for no reason.

That being said, is there merit to the claim that DNS performance has a meaningful impact on real-world internet usage? I would assume DNS queries are cached, which makes them irrelevant for the vast majority of requests. Am I mistaken in this?

EssOEss
Oct 23, 2006
128-bit approved
The kingdoms invented police to deal with crime 800 years ago, let's not poo poo up infosec with the daily troubles of lovely people and their victims.

EssOEss
Oct 23, 2006
128-bit approved
Azure has started their mass updates and quite a lot of machines are failing to boot up after the restarts. Yikes!

EssOEss
Oct 23, 2006
128-bit approved
Edit: nevermind, I remembered wrong.

EssOEss
Oct 23, 2006
128-bit approved
What I do is disable the enter key at the end of the auto-type key sequence, so I can review exactly what box it stuck my password into.

EssOEss
Oct 23, 2006
128-bit approved

22 Eargesplitten posted:

I know the answer is wipe and reinstall if you think there might be a virus, but if I can’t convince someone to do that, what’s the next best thing? I’m taking a look at my neighbor’s dad’s computer in exchange for a tow my neighbor gave me. He thinks there’s a virus.

He might have adware or just a PC full of random poo poo, but the probability of a random user being able to actually detect a virus on their PC is near nonexistent. Why does he think he has a virus? There is probably something else wrong; you should focus on determining why he is concerned and tracking down whatever its root cause is, instead of focusing too much on his speculation about evil hacker viruses.

EssOEss
Oct 23, 2006
128-bit approved
My general tidy-up flow for relatives etc is very simple. If there is something undesirable-but-not-outright-virus running on the PC, the crucial bit is that it has to run to do anything. This means that it either exists as its own process or as a plugin in some other process.

So I just go through all browser plugins, removing any that seem suspect, and then go through all running and autostarting processes, removing any that are suspect from disk and autostart. That's it - the last time I encountered something on a personal device that needed anything more in-depth than this was over 15 years ago.

Sysinternals procexp and autoruns are good tools for this. It does rather require you to have a good feel for what is expected and what is not, though.
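As a trivial illustration of the "go through autostarting processes" step, this sketch only lists the classic Run registry keys - Autoruns covers far more locations (services, scheduled tasks, shell extensions and so on), so this is a small slice, not a replacement:

```python
import winreg

# The two classic per-machine and per-user Run keys; one small slice of what Autoruns shows.
RUN_KEYS = [
    (winreg.HKEY_LOCAL_MACHINE, r"Software\Microsoft\Windows\CurrentVersion\Run"),
    (winreg.HKEY_CURRENT_USER, r"Software\Microsoft\Windows\CurrentVersion\Run"),
]

for hive, path in RUN_KEYS:
    try:
        key = winreg.OpenKey(hive, path)
    except OSError:
        continue
    index = 0
    while True:
        try:
            name, command, _ = winreg.EnumValue(key, index)
        except OSError:
            break
        print(f"{path}\\{name}: {command}")
        index += 1
```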

EssOEss
Oct 23, 2006
128-bit approved
When you target the lowest common denominator, you've got to take it reeeeeal low. Do you really think Facebook would allow its users to get locked out by being too security-conscious? No way.

EssOEss
Oct 23, 2006
128-bit approved

anthonypants posted:

It's hosted on Sourceforge though?

Sourceforge is good again, or at least not intentionally evil.

EssOEss
Oct 23, 2006
128-bit approved
Authenticode code signing certificate providers these days all want to sell me dongles that require me to mash a button or provide a password every 24 hours for code signing. I want to sign code in my build servers in a data center, with no human presence.

Where should I turn for such capabilities? Do the certificate providers offer it? All I see in web searches are dongles. I would be totally happy with, for example, TPM-locked certificates that are as secure against theft as physical tokens. I just do not want to have a human sitting in my server rack.
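For the record, this is the kind of fully non-interactive signing step I want on the build agent - a hedged sketch that assumes the certificate sits in the machine certificate store (or behind a TPM/HSM-backed key provider) and is selected by thumbprint; the thumbprint, timestamp URL and file name below are placeholders, not real values:

```python
import subprocess

# Hedged sketch: non-interactive Authenticode signing on a build agent.
# The cert is picked from the machine store by thumbprint, so no dongle button
# or password prompt is involved.
subprocess.run([
    "signtool", "sign",
    "/sha1", "0123456789ABCDEF0123456789ABCDEF01234567",  # hypothetical thumbprint
    "/fd", "SHA256",
    "/tr", "http://timestamp.example.com",                 # hypothetical timestamp server
    "/td", "SHA256",
    "MyApp.exe",                                           # placeholder artifact
], check=True)
```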

EssOEss
Oct 23, 2006
128-bit approved
Welcome to the General Data Protection Regulation, enforced from 25 May 2018. It turns data protection up to eleven across the EU.

GDPR is far more wide-reaching than cookies but here is a decent overview of what it means for cookies:

quote:

Implied consent is no longer sufficient. Consent must be given through a clear affirmative action, such as clicking an opt-in box or choosing settings or preferences on a settings menu. Simply visiting a site doesn’t count as consent.

‘By using this site, you accept cookies’ messages are also not sufficient for the same reasons. If there is no genuine and free choice, then there is no valid consent. You must make it possible to both accept or reject cookies. This means:

It must be as easy to withdraw consent as it is to give it. If organisations want to tell people to block cookies if they don’t give their consent, they must make them accept cookies first.

Sites will need to provide an opt-out option. Even after getting valid consent, sites must give people the option to change their mind. If you ask for consent through opt-in boxes in a settings menu, users must always be able to return to that menu to adjust their preferences.

I do not think it affects individual cookies but it does mean that users need to be explicitly informed about what (exactly!) their data is used for. Just "we use cookies to improve our service" does not cut it anymore.

EssOEss
Oct 23, 2006
128-bit approved

BangersInMyKnickers posted:

Yeah, any of the CAs can do this with a standard signing cert.

I am not entirely sure how this matches with the rest of your reply where you indicate that some elaborate workflow should be set up.

For sure, I appreciate and approve of the need to lock down the private keys because devs are dumb. But I want my automated builds to produce a new signed copy of my app on every commit, even if that happens every 5 minutes, without any user interaction.

As far as I can tell, this is not possible with the mainstream code signing certificates, which require a dongle with a password that needs manual entering or a physical button that needs pressing. Can you link me to any code signing certificate service that can just install a certificate onto a server (I am fine with it being in a hardware dongle or TPM) that does not need a human to take action to sign code?

EssOEss
Oct 23, 2006
128-bit approved

ElCondemn posted:

You can definitely just get a signing cert without any dongles or passwords.

Can you link to a specific provider? Because Section 16.3 of some relatively on-topic industry specifications says that's not kosher these days.

Granted, I do not particularly care about the level of security - I can deal with plain certs with no protection or I can deal with well-protected certs (TPM/HSM style), but what I do not want at all is certs that require a human to sit in my server rack, so to speak.

So far, searching the web has given me either "I sell you a dongle with manual authentication" providers or others who have websites last updated for Windows 98 (so possibly suitable - I need to call a few of these up).

ElCondemn posted:

But I’m a bit wary of this method, how often are you releasing builds to the public? Only your GA public releases should be signed, you shouldn’t automatically sign every build that comes from your build pipeline.

Why not? Signing proves that the builds come from me. My builds all come from me, even those I choose not to publish to a wide audience. Therefore it makes perfect sense to sign them. What makes you say I should not?

EssOEss
Oct 23, 2006
128-bit approved

ElCondemn posted:

As far as I know there is no signing technology that requires anyone to sit anywhere physically, they just need access to the token and the code you're signing.

My problem is more with the "person" side, not specifically about where they sit. I want my workflow to be automated.

Thanks for the link. Comodo sounds like it might potentially offer what I need, indeed!

EssOEss
Oct 23, 2006
128-bit approved

Wiggly Wayne DDS posted:

do you plan on any auditing process for this auto-signing process at all or is it too much hassle for you? were a malicious executable made from that process how long would it take you to notice

No auditing - that would indeed be too much of a hassle. If someone infiltrated the system and got my build process to sign their malicious code, I doubt I would ever notice (maybe if Windows Defender catches it by coincidence during the signing process). I accept this risk.

apseudonym posted:

Signing with your release keys is more than just this came from you, it also implies you're OK with it being installed anywhere and everywhere.

I do not accept this definition. A signature says who the code came from, that is all. What is the logic here? If you draw other implications from this signature, your security model is a bit dubious (though I can accept drawing negative implications from a *lack* of any accepted signature).

EssOEss fucked around with this message at 09:26 on Feb 28, 2018

EssOEss
Oct 23, 2006
128-bit approved
You have a private key that corresponds to this certificate.

EssOEss
Oct 23, 2006
128-bit approved

Space Gopher posted:

A code signing cert is supposed to say that the signing organization has tested and validated a given release. You might not "accept this definition," but the rest of the industry does.

Oh, I see what you mean - it is the equivalent of the lock icon on the address bar that tells you the website is trustworthy, right? That makes a lot of sense. Code signing says the exe is known good, just like seeing the lock icon means it is safe to enter my passwords onto that website.

EssOEss
Oct 23, 2006
128-bit approved

Daman posted:

ya you're being sarcastic but actually yes, the green lock happening when you go on google.com means they absolutely trust everything that's getting sent to your browser.

just like when you do code signing, you have to absolutely put your company's name behind that signed binary being your product.

consumers don't care about fuckups, sure, but you're a lovely company if you don't try to avoid fuckups.

I agree absolutely - digital signatures are there to prove who something came from. That's not the claim that was made in the above discussion, though, which was that having a digital signature means that software is "tested" or "validated" and implies something positive security-wise about it.

This is, however, absolute fantasy - just as the lock icon does not mean you can safely shove your password into the website, signed code does not make it secure. What is trusted is the identity, not the fact that something is signed. You should not install drivers signed by Beanie Babies LLC, just like you should not put any passwords into https://facebook.notascammer.ipromise.ru even if it has a pretty green lock.

EssOEss
Oct 23, 2006
128-bit approved

Space Gopher posted:

The point of a signed binary release is that it says, "this is a legitimate piece of software put out by EssOEss; if you trust that person/company/OU, then you can trust this software."

I agree.

Space Gopher posted:

What you want to do is turn that statement into, "this came from my automated build pipeline, gently caress if I know what's in there, but good luck."

This seems to be an exaggeration. Of course I know and trust what is in there - why would I not? The mere fact that I do not want to implement some bothersome "user has to manually unlock signing key every 24 hours" process or some bureaucratic auditing scheme (that would become a pointless formality once the person doing it gets fed up) does not mean that my build pipeline is suddenly filled with malware.

I accept the risk that if that happens, it would be hard to notice fast enough, but that does not mean it is going to happen. Indeed, a large part of the reason I accept the risk is that the probability is almost infinitesimally low.

Space Gopher posted:

The equivalent in a web context is Facebook allowing people to deploy random poo poo straight from source control to a public-facing server with a *.facebook.com cert and key.

Yes, that seems to be more or less a fair analogy. Facebook's threat model is obviously very different, so it is perhaps not a very useful parallel, but I can see the similarity in principle.

Jabor posted:

Honest question, why do you actually want to have these builds signed?

Most importantly, app store rules that require code signing.

Second, some of the signed code consists of PowerShell scripts, and PowerShell in certain configurations requires scripts to be signed. Mostly people seem to just disable that requirement, but I try to do what I can to help people avoid disabling security features to get on with their job.

I also consider it general good practice to identify cryptographically who binaries originate from, so I would do it even without the above requirements if it were simple enough (which it mostly was, until GlobalSign announced they require some human action related to a token to actually do signing).

Frivolous Sam posted:

Having proper processes so you know only good code gets signed by you makes you trustworthy. Otherwise everyone should be telling your users not to trust anything signed by you and what's the point of the certificate?

Sure. However, proper processes can be "keep the system updated and do not allow random people Git commit access" and similar. They do not need to include burdensome auditing or "have to literally press a button on a USB token plugged into the back of a server" steps.

EssOEss fucked around with this message at 09:08 on Mar 1, 2018

EssOEss
Oct 23, 2006
128-bit approved
* Something you know
* Something you own
* Something you are

Those are the three standard factors for authenticating yourself. The keyfile is used as the "something you own" component, with your password as "something you know". You need to actually control its ownership for it to be of any benefit - stick it on a USB stick that you carry along (plus perhaps a backup in the cloud, encrypted with a password written down in your safe at your summer home).
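As a rough illustration of how the keyfile acts as the "something you own" factor - modeled loosely on how password managers such as KeePass combine components, not a claim about any specific product - the password and keyfile are hashed and mixed into one composite key, so losing either one makes the database key unrecoverable:

```python
import hashlib

def composite_key(password: bytes, keyfile_bytes: bytes) -> bytes:
    # Both factors are hashed individually, concatenated, and hashed again;
    # the database key derivation then runs on this composite value.
    parts = hashlib.sha256(password).digest() + hashlib.sha256(keyfile_bytes).digest()
    return hashlib.sha256(parts).digest()
```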


EssOEss
Oct 23, 2006
128-bit approved
The logic feels sound - I would also expect that to be the case. It has been some time since I experimented, though, so perhaps I fail to recall something obvious. Are you on a corporate network? There are also network unlock capabilities in BitLocker, which might restore the key automatically.

Note that using the TPM is one of many possibilities for configuring BitLocker and is not the default if I remember it correctly. You can explore the situation using the Get-BitLockerVolume PowerShell cmdlet and its siblings. A key protector of type "Tpm" is the configuration where it uses the TPM to protect your keys.
