|
No. 1 Juicy Boi posted:Or would they still need to connect to the VPN to sync that part? This. There is no good solution for this in hybrid environments; you either need an always-on VPN setup or your devices need to be 100% Azure AD joined.
|
# ? Mar 16, 2021 16:02 |
|
|
Yeah you either want to join your devices to Azure AD and then do the work to get SSO to on-prem resources working, or set something like DirectAccess up to replace your VPN. I don't know how hybrid Azure AD devices act regarding password changes off-network, but the impression I've always got about hybrid is to try and avoid it.
|
# ? Mar 16, 2021 16:36 |
|
If you change a password on a device off the network, you will use the new password on that device until it syncs up. When it talks to a domain controller next, you'll get a little pop-up notification asking you to lock your machine so passwords sync. It's not great and should be avoided.
|
# ? Mar 16, 2021 16:58 |
|
We're currently using Bitlocker to encrypt the OS drives of about 100 laptops and desktops. The current setup is to use the TPM and also to require the user to input a PIN at boot. The PIN is the same for each computer (lol) and is stored in AD. BIOS is password protected on all the machines, and USB booting is disabled. Network boot is password protected as well. SecureBoot is also enabled. I'm trying to convince my boss that using just the TPM alone should be adequate protection and would improve the end user experience, but he's insistent that "2 passwords is more security than 1! A hacker would have to guess both of them!". He did relent and say that if I could come up with a compelling reason or data that shows the TPM alone is adequate, he could be swayed to change his mind. I'm thinking that having a good password policy and setting the # of incorrect login attempts before locking the account to a reasonable number would stop the overwhelming majority of brute force attempts and be just about as useful as a Bitlocker PIN that is widely known amongst users (and is very often written down and kept with the computer). I know that there's no 'perfect' solution to this and we can sit around and envision various attacks that could theoretically unlock the drive. Anyone else have any information that I can use in support of my case? Or am I wrong here?
|
# ? Mar 19, 2021 00:59 |
|
Does anyone here have experience with Dell Wyse terminals?
|
# ? Mar 25, 2021 11:14 |
|
My boss wants me to make a script that will rebuild a Linux system on failure. One part of that is recreating an SSH account to connect to other systems. That would involve an SSH private and public key being stored somewhere. I'm trying to avoid checking a private key into Git, of course. Is Conjur any good as a secrets storage system? I've been looking at it, but it seems like yet another under-documented piece of open source software, and I don't want to invest the time in learning it unless it's worth it.
|
# ? Mar 25, 2021 14:57 |
|
bolind posted:Does anyone here have experience with Dell Wyse terminals? Some, and my experience is about 3-4 years old. They weren't bad, probably better than the HP thin clients; we didn't have a ton of hardware problems from what I recall. The configuration can be a bit of a pain in the rear end unless all your devices are going to be configured the same. Brain is a little fuzzy on the details now, but I was using their configuration utility to drop a basic config on them, and then we'd manually configure the connection based on the location the device was shipped to. I think when the device rebooted, the connection we created would disappear, because it would reload the configuration you initially dropped on it and wipe out any changes you made afterwards. I think that's what the issue was; there might have been a FW update that fixed this or something. We used only them for a bit, then I think got a better deal with someone else and switched again. Do you have specific questions?
|
# ? Mar 25, 2021 15:01 |
|
bolind posted:Does anyone here have experience with Dell Wyse terminals? hail satan. Actually, I like them compared to a few others, if only their management/patching plane wasn't obviously stapled together over 10 years. What do you need?
|
# ? Mar 25, 2021 15:07 |
|
Hey here's a weird "small shop" problem I happened upon yesterday. All of a sudden several users were complaining of weird slow down issues with Chrome (that was not evident in any other browsers). It was everything from very slow responses when ticking boxes in Quickbooks online, to taking 5-10 seconds for a contact to autofill in the gmail interface. Turns out it was the updated Rakuten extension. Turned it off on all the machines and they jumped back up to normal. I was also able to replicate this on my home machine on a completely different network. Anyway. Fun weird issue to keep an eye out for.
|
# ? Mar 25, 2021 15:30 |
|
IUG posted:My boss wants me to make a script that will rebuild a Linux system on failure. One part of that is recreating a SSH account to connect to other systems. That would involve a SSH private and public key being stored somewhere. I'm trying to avoid checking in a private key into Git of course. I wouldn't reuse the private key. Regenerate a new keypair on build, and have the new public key added where it needs to be. That has some assumptions about the other systems you are connecting to though.
|
# ? Mar 25, 2021 16:13 |
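A minimal sketch of the regenerate-on-build approach, assuming an ed25519 key and a temp directory standing in for the service account's real ~/.ssh (all paths and names here are illustrative):

```shell
#!/bin/sh
set -eu

# Generate a throwaway build key in a temp dir; on the real system this would
# be the service account's .ssh directory instead.
KEYDIR="$(mktemp -d)"
KEY="$KEYDIR/id_ed25519"

# Fresh keypair on every rebuild; -N "" means no passphrase, for unattended use.
ssh-keygen -q -t ed25519 -N "" -C "rebuilt-$(date +%F)" -f "$KEY"

# The public key is what gets pushed out to the other systems' authorized_keys
# (via config management, or hypothetically ssh-copy-id); the private key
# never touches Git.
cat "$KEY.pub"
```

The point being that since the key is disposable, there is nothing secret to store in the repo at all; only the distribution of the new public key needs automating.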
|
BonoMan posted:Hey here's a weird "small shop" problem I happened upon yesterday. All of a sudden several users were complaining of weird slow down issues with Chrome (that was not evident in any other browsers). Extensions are evil.
|
# ? Mar 25, 2021 16:20 |
|
MF_James posted:Some, and my experience is about 3-4 years old. Potato Salad posted:hail satan actually I like them compared to a few others, if only their management/patching plane wasn't obviously stapled together over 10 years. Cool! So first let me say that I'm not married to Wyse, but we are a Dell shop, and so far it looks like they could work. The problem I'm trying to solve is that, currently, we have about a dozen "workstations" which are Dell SFF PCs of various vintages running Linux, that basically act as a glorified X canvas. User starts a terminal, immediately SSHes to a more powerful server. Same with most other programs. In fact, I don't think my users are smart enough to distinguish between, say, a browser window running locally and one running on the server. This is, obviously, a medium pain in the rear end, so I got the idea of scrapping them all and getting some thin clients to hook up to a VNC server. It's very local: the thin clients would literally have a gigabit connection and sub-half-millisecond latency to the VNC server. I realize most people hook them up to Windows or something, but I do see in the docs that they support VNC.
|
# ? Mar 25, 2021 16:34 |
|
Guy Axlerod posted:I wouldn't reuse the private key. Regenerate a new keypair on build, and have the new public key added where it needs to be. That has some assumptions about the other systems you are connecting to though. That's a good idea, thanks. I'm going to do that instead for this case. But are there any recommendations for secrets storage? There are other text files on servers I'm supposed to check into Git that have some passwords that I don't want there. Conjur seems like the answer to this, but the installation instructions are just "lol use our Docker walkthrough demo setup".
|
# ? Mar 25, 2021 16:52 |
|
bolind posted:Cool! Haven't used them for anything other than RDP to a Windows Server, so I can't attest to the quality of VNC connections. The devices themselves were fine enough, better than HP (we had a lot of hardware issues with HP TCs) and at least on par with most other TCs we tried. I would assume as long as the VNC integration isn't poo poo they should be fine for what you're doing but again, no experience in that space so hopefully potato has some.
|
# ? Mar 25, 2021 16:54 |
|
VNC is a little painful. Out of curiosity, what kind of budget are you looking at?
|
# ? Mar 25, 2021 17:11 |
|
Potato Salad posted:VNC is a little painful. out of curiosity, what kind of budget are you looking at Hahahahahaha...... No, seriously, the thin clients themselves, obviously, and then I don’t think we’re above throwing down some coin for NoMachine, RealVNC or similar. We’ve been getting by on whatever the stock, free VNC server in CentOS is, so far.
|
# ? Mar 25, 2021 17:17 |
|
Is this the kind of environment where there is zero willingness to pay licensing fees, so you require as good a free solution as possible? Do you have any centralized authentication infrastructure set up?
|
# ? Mar 25, 2021 17:23 |
|
Potato Salad posted:is this is the kind of environment where there is zero willingness to pay licensing fees, so you require as good a free solution as possible? Not zero willingness, we’ve specifically discussed buying a commercial vnc server if it benefits performance. That being said, we’re pretty strong on FOSS (sometimes to a fault.) Yes, we have a pretty solid FreeIPA setup.
|
# ? Mar 25, 2021 17:31 |
|
bolind posted:Hahahahahaha...... RealVNC is an excellent VNC implementation. Their RealVNC Viewer app is the best non-Apple VNC client available, and you'll need it to connect to a RealVNC Server with the encryption turned up to max. The Real licensing is in two models. In one, you pay per host running Server. In the other, you put server on every machine you have and pay per helpdesk tech who can then use it for remote support.
|
# ? Mar 25, 2021 18:00 |
|
IUG posted:My boss wants me to make a script that will rebuild a Linux system on failure. One part of that is recreating a SSH account to connect to other systems. That would involve a SSH private and public key being stored somewhere. I'm trying to avoid checking in a private key into Git of course. Look in to SSH certificates. Then you can issue certs from an internal CA that can allow new systems to properly authenticate to old ones and vice versa without having to pregenerate anything.
|
# ? Mar 25, 2021 20:17 |
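The certificate route looks roughly like this, using only stock ssh-keygen; the CA location, certificate identity, principal name, and validity window below are made up for illustration:

```shell
#!/bin/sh
set -eu
WORK="$(mktemp -d)"

# One-time setup: the internal CA keypair. In practice this would live on a
# secured signing host, not on the machines being rebuilt.
ssh-keygen -q -t ed25519 -N "" -f "$WORK/ca" -C "internal-ssh-ca"

# At rebuild time: generate a fresh key for the new machine, then have the CA
# sign it (-I is the cert identity, -n the allowed principal, -V the validity).
ssh-keygen -q -t ed25519 -N "" -f "$WORK/id_ed25519"
ssh-keygen -q -s "$WORK/ca" -I "rebuild-host" -n svcuser -V +52w \
    "$WORK/id_ed25519.pub"

# Target servers only need to trust the CA once, via sshd_config:
#   TrustedUserCAKeys /etc/ssh/ca.pub
# after which any cert the CA signs is accepted without touching
# authorized_keys, which is what makes rebuilds pregeneration-free.
ssh-keygen -L -f "$WORK/id_ed25519-cert.pub"
```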
|
Okay. This is a very weird question, and almost inconceivable unless there's a commonly used library for it that recently got hosed up, but... ...Has it been anybody else's experience lately that "reset/I forgot my Password" functions aren't working? Across multiple unaffiliated services? In both personal and professional affairs? On different recipient email servers?
|
# ? Apr 21, 2021 16:38 |
|
I haven't been able to login to MiLB.com, and their reset password email never delivers. It's been that way for months, and I want to do some baseball this summer now that we'll be vaxxed!
|
# ? Apr 21, 2021 16:54 |
|
Anyone using M365 MFA with Azure AD / Windows Virtual Desktop? Vendor wants to set us up with a third-party MFA app, but I suspect the built-in M365 offering would be good enough.
|
# ? Apr 21, 2021 17:11 |
|
mewse posted:Anyone using M365 MFA with azure ad/windows virtual desktop? Vendor wants to set us up with a third party mfa / app but I suspect the built in M365 offering would be good enough Yes, but only the first time you log in via the client app, or the first time you log in on that device via the browser. Azure Conditional Access, but yeah, similar thing. What's the use case that the vendor is saying it won't satisfy?
|
# ? Apr 21, 2021 17:18 |
|
Internet Explorer posted:Yes, but only the first time you log in via the client app, or the first time you log in on that device via the browser. Azure Conditional Access, but yeah, similar thing. I haven't gotten into it with them yet; we're just starting to look at MFA because of an insurance questionnaire for cyber coverage. The same vendor supports my former employer, and the solution they're using is Duo MFA. It looks like it supports hardware tokens, which would appease* my staff that don't have employer-provided smartphones. *Not really appease, just counter "I'm not putting that app on my personal device."
|
# ? Apr 21, 2021 17:34 |
|
Azure AD can use Yubikeys as well if a phone app isn't an option.
|
# ? Apr 21, 2021 17:35 |
|
Yeah, Azure AD supports all sorts of alternatives. Hardware tokens, phone app, SMS, phone call. Granted, you should try to steer clear of the last two if you can, rather than having to rip them away later when you decide to address how insecure they are, but yeah. I'd want a real strong use case that isn't being met before going with a third-party MFA solution.
|
# ? Apr 21, 2021 17:36 |
|
Duo honestly isn't too bad and is fairly decently priced (the $3 a month version will handle 99% of use cases). Our big reason for using it at a few clients is that they want on-prem admin access to servers protected, which is not done well by Azure MFA yet; unless I'm mistaken, you need to host your own MFA VM to handle any on-prem servers.
|
# ? Apr 21, 2021 19:09 |
|
I've tried to explain to the CEO about 500 ways that as long as we only have a 2 mbps connection at the site where our file server is, we probably aren't going to be able to do automatic offsite backups. Kind of hard to convince him of this because the guy who was my supervisor when I started (not a tech guy, gone for about a year) got sold around $4,000 in NASes and who knows how much else in labor to do this; it is a beautiful one-site backup, but for anything else, pretty useless. I've tried to break it down as a math equation where there are not enough minutes in the week for it to complete even the backup of our servers, let alone the files. I've tried using lots of different analogies (who says a lit degree is useless in IT), and I've enlisted the ISP and a third-party tech company to back me up. Any ideas? It really isn't an option to improve the speed without switching ISPs, which is a tough sell because the ISP is a donor.
|
# ? Apr 21, 2021 19:32 |
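For what it's worth, the arithmetic here is brutal even under ideal assumptions. A back-of-envelope sketch using the roughly 1.5 TB figure mentioned later in the thread, and assuming a perfectly saturated 2 Mbps link with zero protocol overhead (so this is the optimistic case):

```shell
#!/bin/sh
# How long does one full 1.5 TB pass take over a saturated 2 Mbps uplink?
# 2 Mbps = 2,000,000 bits/s = 250,000 bytes/s, ignoring all overhead.
BYTES=1500000000000          # 1.5 TB of backup data
RATE=250000                  # bytes per second at 2 Mbps
TOTAL_SECONDS=$((BYTES / RATE))
DAYS=$((TOTAL_SECONDS / 86400))
echo "$DAYS days"            # about 69 days for a single full backup pass
```

That is roughly ten weeks per full copy before deduplication or compression, which is the whole case in one number: incrementals of a small critical subset might fit, but a full offsite backup over that link cannot.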
|
There's only really one way to improve the bandwidth at a site, and it seems like your employer is resistant to that
|
# ? Apr 21, 2021 19:37 |
|
2 mbps!?! Yeah, that's kind of a problem when doing anything modern.
|
# ? Apr 21, 2021 19:44 |
|
Internet Explorer posted:Yeah, Azure AD supports all sorts of alternatives. Hardware tokens, phone app, SMS, phone call. Thanks Ants posted:Azure AD can use Yubikeys as well if a phone app isn't an option. Modern hardware token auth is freaking awesome; I hope they bring it to consumer services. If there are any ITSec goons here, is this ever a real possibility? I'd love to log in to all of my social media with some kind of keyfob + MFA.
|
# ? Apr 21, 2021 19:52 |
|
Internet Explorer posted:2 mbps!?! Yeah it sucks. We actually have a site where we get 20 up from the same provider but the main office is reliant on fixed wireless.
|
# ? Apr 21, 2021 22:01 |
|
Rick posted:Any ideas? Really isn't an option to improve the speed without switching ISPs, which is a tough sell because the ISP is a donor. When was the last time you updated your resume?
|
# ? Apr 21, 2021 22:08 |
|
Rick posted:Yeah it sucks. We actually have a site where we get 20 up from the same provider but the main office is reliant on fixed wireless. Radio wireless is capable of way more bandwidth than that, maybe they've got bad distance to the tower or something
|
# ? Apr 21, 2021 22:10 |
|
Rick posted:I've tried to explain to the CEO about 500 ways that as long as we only have a 2 mbps connection at the site where our file server is, we probably aren't going to be able to do automatic offsite backups. Kind of hard to convince him of this because the guy who was my supervisor when I started (not a tech guy, gone for about a year) got sold around $4,000 in NASes and who knows how much else in labor to do this; it is a beautiful one site backup, but for anything else, pretty useless. I've tried to break it down as a math equation where there are not enough minutes in the week for it to complete even the backup of our servers, let alone the files, I've tried using lots of different analogies (who says a lit degree is usless in IT), I've enlisted the ISP and a third party tech company to back me up. When you ask for ideas, are you looking for a technical solution or a political one? Because if you've already done the math and had two other parties check your sanity and dug in your heels, you probably know there isn't a technical option. If what you're saying is true, then the fact of the matter is that you yourself cannot make him understand or serve as the interlocutor for any other authorities to make him understand. Now, if he's periodically coming down to your office in an apparent fugue state, obliging you to explain the same loving thing again, and walking off with a frown, then maybe fifteen minutes of weekly vexation is just part of your job description. If he's periodically giving you an order to get the backups going, you try to make him understand why it's a bad call, and he relents in the face of your resistance, then your explanations are counterproductive; he's just walking away with the impression that there was something unsatisfying about how he understood the issue, and that's keeping the argument alive in his head - but if you can successfully remand him each time, then you've got at least some kind of authority with him. 
Could be time to just start being as concise as possible with "The bottom line is that I need a new ISP to do that. We've been through it before, but this is my expertise and I'm asking you to just trust me." But frankly, you know what I would do? I'd stop arguing and just spin up a loving backup system. I'd apply my own unilateral judgement about which assets are the most important, with a budget of, say, ten gigabytes per day, and I'd configure the system to back those files up in reliable increments. The very first thing getting this treatment would be my rear end-covering email records documenting my objections to the system. I'd come up with an expedient off-the-shelf solution to stage and transfer all the other poo poo with whatever bandwidth I had leftover, completely catch-as-catch-can. I would periodically bring up the situation as an ongoing issue and say things like "It has been two months of running backups at full speed, 24 hours a day, and we have only transferred 1.3% of the 100 terabytes on our file servers." I would feel no guilt about the inadequacy of the system or any other projects that were left undone in the meantime, and leave work at 5pm sharp every day. In the event of an emergency, I would have more material backed up than if I did nothing, I would have the warrant to indulge in a fight if somebody pegged me as the scapegoat, and I would have absolutely no angst because I did everything I could. Just out of curiosity, how much data do you have on hand at a given time? Ballpark, here.
|
# ? Apr 21, 2021 22:22 |
|
Just do the off-site backups. When your bandwidth is saturated 24/7 maybe you get to have a more fruitful conversation about it.
|
# ? Apr 21, 2021 23:17 |
|
I mean, if you can't do offsite backups, then do onsite backups with those NASes and/or tapes. It will cover a lot of failure modes. Just make sure the decision makers know that it isn't a best practice and won't cover all failure modes.
|
# ? Apr 21, 2021 23:20 |
|
Kazinsal posted:When was the last time you updated your resume? Fairly recently, although that is less tech related and more related to company culture that kind of sucked but the people I really disliked have all quit or are quitting by the end of the month so I'm curious to see if it's better without them. Gerdalti posted:Just do the off-site backups. When your bandwidth is saturated 24/7 maybe you get to have a more fruitful conversation about it. Did this! Very quickly got a "okay turn it off we'll evaluate our ISP or come up with another solution" but after a lot of work ultimately didn't make a change. It's not just the donation factor, we are not near fiber and we have had some really bad experiences with Cox cable that make them a difficult choice. mewse posted:Radio wireless is capable of way more bandwidth than that, maybe they've got bad distance to the tower or something Yeah, we are basically in the worst spot possible, ISP says if we were a block closer or a block farther it would be much better. Eikre posted:When you ask for ideas, are you looking for a technical solution or a political one? Because if you've already done the math and had two other parties check your sanity and dug in your heels, you probably know there isn't a technical option. It's definitely the periodic fugue state thing, and if anything he thinks I'm way better at this than I am (I'm the moron predecessor to whoever next takes this job, and the guy I replaced was pretty good he just was stuck in 2005). So I guess technical technical because I actually understand the politics of it. Me unilaterally doing something like that would make the point, but the poo poo that the company does is important (literal life and death), and if there is guilt, it is in impacting that. 
And while the ISP loving sucks at delivering internet to us, they are really good at delivering stable phone service and real 24/7, in-town support, and have bent heaven and earth at 3:00 AM to get our phones going again when they've hit disasters. And unfortunately my run at my last job (where I had the confidence and truly felt supported enough to just decide for everyone to switch the ISP [and the stakes were lower]) taught me that neither Cox nor CenturyLink can provide that, at least at the price point we pay (even if their service agreements say they provide this; we were left dead in the water several times). So I think you are getting at the truth of the matter: my own indecisiveness on this issue probably is why it's actually an issue. It's only about a 1.5 TB backup now; I successfully sold them Office 365, which lowered what we needed to back up massively. Internet Explorer posted:I mean, if you can't do offsite backups, then do onsite backups with those NASes and/or tapes. It will cover a lot of failure modes. Just make sure the decision makers know that it isn't a best practice and won't cover all failure modes. Eventually this is what I started doing until COVID hit: I was picking NAS #2 up on Tuesday from location #2 and driving it to location #1, cloning the backups from NAS #1 overnight, then driving NAS #2 back to location #2 on Wednesday. An auditor dinged us for this anyway, though, and wants an offsite backup, and I know they aren't exactly wrong. Post COVID I am only physically at work about two hours a week.
|
# ? Apr 22, 2021 05:45 |
|
|
Rick posted:Eventually this is what I started doing until COVID hit, I was picking NAS #2 up on Tuesday from location #2 and driving it to location #1, cloning the backups from NAS #1 over night, then driving NAS #2 back to location #2 on Wednesday. An auditor dinged us for this though anyway and wants an offsite backup and I know they aren't exactly wrong. Post COVID I am only physically at work about two hours a week. I mean, look. You were doing the "right" thing for someone who has no bandwidth. If the auditors were still dinging you, as an engineer, sometimes you gotta let things go and be like "well, that's the best we can do with what we got. Anything else is a business decision" and let the cards fall as they may. I'd say that if you were driving NASes around, you've already gone further as an engineer than you probably should have. Maybe the answer is to back up to tape and have Iron Mountain come pick it up every day. Maybe the answer is to throw money at an ISP to solve what sounds like a difficult ISP situation. But you've done what you can, and it's time for you to stick to your guns and force management to tackle the problem. At least that's my take on my understanding of what you've described.
|
# ? Apr 22, 2021 05:53 |