Zaepho
Oct 31, 2013

Hadlock posted:

Ok, this is an acceptable answer that works inside our existing enterprise ecosystem, thank you sir, I will take this into consideration. This is the closest thing I've seen to a "Microsoft approved" design so far... but surely there's something baked into PowerShell for this?

It's being worked on, but honestly it's not there yet in my opinion. OneGet in PowerShell v5 is supposed to provide these capabilities, but it's not yet released or complete. It looks pretty cool (think CPAN or something along those lines) and it's based on NuGet, so building your own repository might be able to serve dual purposes.
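Rough idea of what the cmdlets look like in the preview bits so far; the names may still change before release, and the repository location below is a made-up example of an internal NuGet feed:

code:
  # OneGet / PackageManagement preview (PowerShell 5 CTP) - cmdlet names may still change.
  # The repository location is a hypothetical internal NuGet feed, not a real URL.
  Register-PackageSource -Name InternalRepo -Location 'http://nuget.corp.example/api/v2' -ProviderName NuGet

  # Search the internal feed and install from it
  Find-Package -Name SomeTool -Source InternalRepo
  Install-Package -Name SomeTool -Source InternalRepo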


PUBLIC TOILET
Jun 13, 2009

PUBLIC TOILET posted:

I wanted to run a GPO issue by you folks and see what you think might be the cause. We have AGPM 4.1 on our GPO server and I'm working on a couple of policies through that. I'm seeing an issue affecting numerous policies where if I generate an HTML report on any of them via AGPM and I look at the links section of the report, the links section will be blank. If I drill down to the actual policy under Group Policy Objects, I can see the OU links there. Our environment has replication across four DCs. Has anyone encountered this issue before? Is it a replication issue or a hosed up AGPM? Or perhaps policies are broken? I'm doing everything correctly (check out, modify, check in, deploy) in AGPM but the reports aren't displaying the proper links information. As an example, I have one policy that has four ADUC accounts under Security Filtering when looking at the actual policy. If I look at the report for that policy in AGPM, it only displays two ADUC accounts.

I saw this hotfix but the symptoms don't sound similar and that is for AGPM 4.0.

I figured this out in case anyone else experiences the issue. I'm not too familiar with AGPM so I did some research. Turns out I wasn't importing the actual production Group Policy into AGPM so they weren't synchronized and that was causing the missing items in the reports.

capitalcomma
Sep 9, 2001

A grim bloody fable, with an unhappy bloody end.

Hadlock posted:

Ok, this is an acceptable answer that works inside our existing enterprise ecosystem, thank you sir, I will take this into consideration. This is the closest thing I've seen to a "Microsoft approved" design so far... but surely there's something baked into PowerShell for this?

Look into PowerShell profiles. They'll be very familiar to a Linux admin with a customized bash profile. I've got a half-dozen functions and some global variables in a profile file that gets pushed to all IT workstations. They load automatically whenever PowerShell or the PowerShell ISE is opened on our computers. Group Policy File preferences seem to work just fine for pushing the profile file out.
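A stripped-down sketch of the kind of thing that goes in ours; the function names, share path, and server URL below are made up for illustration, not our actual ones:

code:
  # Machine-wide profile pushed out as a file by the GPP File preference.
  # Everything named here is a made-up example.

  $Global:ITToolsShare = '\\fileserver\it$\tools'   # hypothetical share

  function Get-LoggedOnUser {
      param([string]$ComputerName = $env:COMPUTERNAME)
      # Quick lookup of who is logged on to a box
      Get-WmiObject -Class Win32_ComputerSystem -ComputerName $ComputerName |
          Select-Object Name, UserName
  }

  function Connect-ExchangeShell {
      # Hypothetical helper - the sort of shortcut we push to every IT workstation
      $session = New-PSSession -ConfigurationName Microsoft.Exchange `
          -ConnectionUri 'http://exchange.corp.example/PowerShell/'
      Import-PSSession $session
  }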

capitalcomma fucked around with this message at 19:29 on Sep 25, 2014

Zaepho
Oct 31, 2013

capitalpunctuation posted:

Look into PowerShell profiles. They'll be very familiar to a Linux admin with a customized .bashrc file. I've got a half-dozen functions in a profile file that gets pushed to all IT workstations. They load automatically whenever PowerShell or the PowerShell ISE is opened on our computers. Group Policy File preferences seem to work just fine for pushing the profile file out.

That's quite a clever solution to the issue. My only question is whether users that log on without creating a profile (e.g. as a scheduled task) will trigger a Group Policy update.

capitalcomma
Sep 9, 2001

A grim bloody fable, with an unhappy bloody end.

Zaepho posted:

That's quite a clever solution to the issue. My only question is whether users that log on without creating a profile (e.g. as a scheduled task) will trigger a Group Policy update.

Are you asking whether a user will need to manually run gpupdate to get the file? No; just define the Group Policy Preferences setting under Computer Configuration in the GPO rather than User Configuration, and all of the computers will get the file without user intervention.

If you're asking whether the functions can be invoked from a scheduled task, I'm not sure; I've never tested it.

capitalcomma fucked around with this message at 05:31 on Sep 25, 2014

Zaepho
Oct 31, 2013

capitalpunctuation posted:

Are you asking whether a user will need to manually run gpupdate to get the file? No; just define the Group Policy Preferences setting under Computer Configuration in the GPO rather than User Configuration, and all of the computers will get the file without user intervention.

If you're asking whether the functions can be invoked from a scheduled task, I'm not sure; I've never tested it.

I didn't realize when I replied that it was a computer GPO and that you place the file in the default global profile location; the scheduled task should pick it up just fine.

capitalcomma
Sep 9, 2001

A grim bloody fable, with an unhappy bloody end.

Zaepho posted:

I didn't realize when I replied that it was a computer GPO and that you place the file in the default global profile location; the scheduled task should pick it up just fine.


My fault, I should have mentioned that you can push it to an all-users, all-hosts location under system32, comparable to /etc/profile. My edited post should be a bit clearer.
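For reference, these are the profile paths PowerShell checks; the one we drop with the GPP File preference is the first:

code:
  # The built-in $PROFILE properties, roughly most-global to most-specific:
  $PROFILE.AllUsersAllHosts        # C:\Windows\System32\WindowsPowerShell\v1.0\profile.ps1  <- this one
  $PROFILE.AllUsersCurrentHost     # C:\Windows\System32\WindowsPowerShell\v1.0\Microsoft.PowerShell_profile.ps1
  $PROFILE.CurrentUserAllHosts     # %UserProfile%\Documents\WindowsPowerShell\profile.ps1
  $PROFILE.CurrentUserCurrentHost  # %UserProfile%\Documents\WindowsPowerShell\Microsoft.PowerShell_profile.ps1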

capitalcomma fucked around with this message at 17:04 on Sep 25, 2014

BaseballPCHiker
Jan 16, 2006

Has anyone run into the SCCM Software Center displaying old apps/packages even after you've removed them? I can't seem to clear them out for users. Right now I have an Office 2013 install package that I use as an option during a lite touch deployment, and for some reason it shows up as available to users in Software Center. I don't want to have to delete the package and then remake it for my deployment, but I'm afraid that's what it may come to.

I never know if the fix in SCCM is to just wait until tomorrow and see if it's fixed itself, or to make some other move. Things move so drat slow with it.

BaseballPCHiker
Jan 16, 2006

Holy poo poo, I shouldn't have mentioned enabling TPM on all of our machines. What in the holy hell did I get myself into? This looks like it's going to be hard as hell and super complicated.

Thanks Ants
May 21, 2004

#essereFerrari


BaseballPCHiker posted:

Has anyone run into the SCCM Software Center displaying old apps/packages even after you've removed them? I can't seem to clear them out for users. Right now I have an Office 2013 install package that I use as an option during a lite touch deployment, and for some reason it shows up as available to users in Software Center. I don't want to have to delete the package and then remake it for my deployment, but I'm afraid that's what it may come to.

I never know if the fix in SCCM is to just wait until tomorrow and see if it's fixed itself, or to make some other move. Things move so drat slow with it.

Pretty sure this is one of those "leave it a few hours" occasions.

Gyshall
Feb 24, 2009

Had a couple of drinks.
Saw a couple of things.

BaseballPCHiker posted:

Holy poo poo, I shouldn't have mentioned enabling TPM on all of our machines. What in the holy hell did I get myself into? This looks like it's going to be hard as hell and super complicated.

IMO this is something you do at image/deployment time for machines. Kind of scary to just turn on, too.

Fruit Smoothies
Mar 28, 2004

The bat with a ZING
I'm a bit out of my depth with planning this. I need a truly redundant setup, and I have an almost unlimited budget. I basically need constant uptime of a domain and ~2TB of files housed in SMB shares.
In all my previous challenges, I've used 2+ DCs and DFS for the file sharing. In this case, however, I have an ancient application that uses a flat-file (CSV) "database" and basic file locking to handle its operation, and an accounting package that uses FoxPro databases. Both of these rule out DFS, as it doesn't handle low-level file operations well enough.
My basic understanding of what I need is:

File Storage (SANs)
File Server Cluster
Failover Hyper-V pointing to VHDX files on the cluster.

Forgive my ignorance, and PLEASE help me understand the basic segments of this operation! Many thanks.

incoherent
Apr 24, 2004

01010100011010000111001
00110100101101100011011
000110010101110010
Give up the Hyper-V unless you're pairing it with SCCM. If you really have an unlimited budget, get a VM HA specialist in to do an inventory and spec out what you really need.

Being a good sysadmin is knowing where your skill limit is.

goobernoodles
May 28, 2011

Wayne Leonard Kirby.

Orioles Magician.
What's a decent, preferably cheap option for backing up a physical Server 2012 domain controller/file server without an existing storage target for backups? Either a cheap storage target + software, or a cloud solution. It's a lovely DSL connection, so even if there were a decent ~*~cloud~*~ option it might not be the best choice unless it does incrementals. It's a pretty lightly used server, so changes would be minimal. This is for my company president's side business half a state away.

e: I'm using BackupAssist to a Drobo NAS (I didn't pick it) at a similar site. Should I just get another NAS and go with BackupAssist again?

goobernoodles fucked around with this message at 22:14 on Sep 25, 2014

Thanks Ants
May 21, 2004

#essereFerrari


BackupAssist is good, and you can pair it with cloud providers if you want as well.

thebigcow
Jan 3, 2001

Bully!

Fruit Smoothies posted:

I'm a bit out of my depth with planning this. I need a truly redundant setup, and I have an almost unlimited budget. I basically need constant uptime of a domain and ~2TB of files housed in SMB shares.
In all my previous challenges, I've used 2+ DCs and DFS for the file sharing. In this case, however, I have an ancient application that uses a flat-file (CSV) "database" and basic file locking to handle its operation, and an accounting package that uses FoxPro databases. Both of these rule out DFS, as it doesn't handle low-level file operations well enough.
My basic understanding of what I need is:

File Storage (SANs)
File Server Cluster
Failover Hyper-V pointing to VHDX files on the cluster.

Forgive my ignorance, and PLEASE help me understand the basic segments of this operation! Many thanks.

FYI the FoxPro thing and probably the CSV thing will choke and mysteriously corrupt themselves on SMB 2.

Fruit Smoothies
Mar 28, 2004

The bat with a ZING

thebigcow posted:

FYI the FoxPro thing and probably the CSV thing will choke and mysteriously corrupt themselves on SMB 2.

The FoxPro app is Opera II, which is a massive accounting package and seldom has problems, and the CSV file has been working for 6 years without too many issues. It's terrible, but surprisingly resilient!

BaseballPCHiker
Jan 16, 2006

Gyshall posted:

IMO this is something you do on image/deployment of machines. Kind of scary to just turn on too.

Yeah, so far I've had to fight to get TPM turned on and working in a way that I can push out company-wide. Now comes the fun part of trying to add this into an inherited task sequence held together by bubblegum. I think I may just try to create a brand new task sequence from the ground up. Our current lite touch deployment leaves a lot to be desired, and there are a ton of old scripts and garbage that run as part of it.

Do you have any guides or links you can send my way? I've looked and there doesn't seem to be any good O'Reilly-type book on this subject.

KillHour
Oct 28, 2007


Tequila25 posted:

I just started a new job and I have a chance to redesign our whole network infrastructure from scratch. The current plan we have goes like this: we have all the hardware already, and we have two internet connections, one cable modem and one fiber. We use the cable for office internet and the fiber for the web servers.


I personally was thinking of adding a Cisco router and setting it up like this so we can have fault tolerance in case one of our providers goes down.


We are running our website as a storefront, so we are very concerned about security and keeping customer data safe. Would it be worth adding the expense of the Cisco router? Any other suggestions?

Forgive the lovely MS Paint, but this is what a proper HA setup looks like.
[image: MS Paint HA network diagram]
Edit: Joke answer:

:yayclod: :yayclod:

KillHour fucked around with this message at 23:30 on Sep 25, 2014

skipdogg
Nov 29, 2004
Resident SRT-4 Expert

Fruit Smoothies posted:

I'm a bit out of my depth with planning this. I need a truly redundant setup, and I have an almost unlimited budget. I basically need constant uptime of a domain and ~2TB of files housed in SMB shares.
In all my previous challenges, I've used 2+ DCs and DFS for the file sharing. In this case, however, I have an ancient application that uses a flat-file (CSV) "database" and basic file locking to handle its operation, and an accounting package that uses FoxPro databases. Both of these rule out DFS, as it doesn't handle low-level file operations well enough.
My basic understanding of what I need is:

File Storage (SANs)
File Server Cluster
Failover Hyper-V pointing to VHDX files on the cluster.

Forgive my ignorance, and PLEASE help me understand the basic segments of this operation! Many thanks.

We handle our really important replication at the block level on our EMC SANs with RecoverPoint. It's not cheap, but it works for what we need it to do.

What's your environment like? Multiple sites? Single large site? What are your recovery objectives?

Fruit Smoothies
Mar 28, 2004

The bat with a ZING

skipdogg posted:

We handle our really important replication at the block level on our EMC SANs with RecoverPoint. It's not cheap, but it works for what we need it to do.

What's your environment like? Multiple sites? Single large site? What are your recovery objectives?

The main concern I have with SANs is the sensitivity of the CSV files. In order for the program to work, we have to disable write-behind caching on the servers AND the client PCs! The program has thrown locking errors before, and I wonder if a SAN fabric will even work. The software doesn't even function on a Samba 4 share on Ubuntu.

The environment does have multiple sites, but it's departmental so the satellite offices have their own servers (on domain). They access the CSV app via Remote Desktop to avoid VPN latency in the CSV file writes.

Recovery isn't too bad, as the majority of the 2TB of data is images of various items and could probably be ignored for a week without too much complaining. The CSV app runs the business, however, and needs almost 24/7 uptime.

thebigcow
Jan 3, 2001

Bully!
I'm sure you know what you're doing and you've thought about this already, but just in case: have you looked into moving to something that wasn't designed to run on a single Win9x machine?

skipdogg
Nov 29, 2004
Resident SRT-4 Expert

I can't imagine something that runs off a CSV file couldn't be ported to a SQL database. I know how critical some of these legacy apps can be though so I feel your pain.

What are you doing right now for backup?

Fruit Smoothies
Mar 28, 2004

The bat with a ZING

skipdogg posted:

I can't imagine something that runs off a CSV file couldn't be ported to a SQL database. I know how critical some of these legacy apps can be though so I feel your pain.

What are you doing right now for backup?

There are actually times when I am paid to basically sit on site and do nothing (because uptime is so critical), and I've written a JS/RESTful alternative, which I am trying to sell.

Backup at the moment is PowerShell scripts running wbadmin to various folders on a NAS for disaster recovery. The CSV is all zipped up and stuck on memory sticks periodically too, and AD is protected with DCs at the satellite sites.
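The backup script is nothing fancy, roughly this shape; the NAS path, volume list, and folders are placeholders rather than the real ones:

code:
  # Rough shape of the nightly backup script - all paths are placeholders.
  $target = '\\nas01\backups\server1'
  $log    = 'C:\Scripts\Logs\wbadmin_{0:yyyyMMdd}.log' -f (Get-Date)

  # wbadmin does the actual work; -quiet suppresses the confirmation prompt
  & wbadmin start backup -backupTarget:$target -include:C:,D: -allCritical -quiet |
      Out-File -FilePath $log

  # Separately, zip the CSV "database" so it can be dropped onto a memory stick
  Add-Type -AssemblyName System.IO.Compression.FileSystem
  [IO.Compression.ZipFile]::CreateFromDirectory(
      'D:\AppData\CSV',
      "D:\Export\csv_$(Get-Date -Format yyyyMMdd).zip")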

thebigcow posted:

I'm sure you know what you're doing and you've thought about this already, but just in case: have you looked into moving to something that wasn't designed to run on a single Win9x machine?

They are definitely looking at alternatives, including mine. I am a contractor, so have very little say in the product itself.

Dans Macabre
Apr 24, 2004



Nice! thanks

kaynorr
Dec 31, 2003

Fruit Smoothies posted:

They are definitely looking at alternatives, including mine. I am a contractor, so have very little say in the product itself.

In all honesty, the best thing they can spend their money on is getting off of this antiquated product. It's fragile as hell and sooner or later it WILL break, and it sounds like an immense amount of company revenue would be on the line. When that happens, there is no degree of "well, I warned you" that will stop the knives from coming out.

SSH IT ZOMBIE
Apr 19, 2003
No more blinkies! Yay!
College Slice
Hm, what are some good ways to deploy software to PCs via SCCM as part of an MDT imaging process? Our MDT and SCCM servers are not integrated.
Since they're not integrated, I'm wondering if we should push the SCCM client as part of the MDT imaging process. Once the client is installed, a device record will show up in SCCM. Maybe then some sort of script could run on the client, asking whoever is imaging the PC which collection they want to put it in, and then kick off a machine policy update? But there'd be no good way for whoever is imaging the PC to check on the status of software deployment without going into SCCM directly...
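Something like this is the kind of client-side kick I was picturing, completely untested; the collection move itself would still have to happen in the console or against the site server:

code:
  # Untested sketch - force the SCCM client to pull machine policy right after install.
  # {...021} is the well-known Machine Policy Retrieval & Evaluation schedule ID.
  Invoke-WmiMethod -Namespace 'root\ccm' -Class 'SMS_Client' -Name TriggerSchedule `
      -ArgumentList '{00000000-0000-0000-0000-000000000021}'

  # Then an Application Deployment Evaluation cycle once policy has landed ({...121}).
  Invoke-WmiMethod -Namespace 'root\ccm' -Class 'SMS_Client' -Name TriggerSchedule `
      -ArgumentList '{00000000-0000-0000-0000-000000000121}'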

I don't really work on desktops much, but I've been in SCCM a lot lately due to mergers and lack of staffing. I've got it set up so collections can be built off AD groups, and software comes down fine. It would just be good to somehow make it an easy part of the imaging process. I've been pushing client software for the servers I manage out to desktops through SCCM since it's so drat easy and straightforward.

We have a new desktop guy and it seems like he's doing silly things in MDT, having a ton of base images and pre-installing software in the image. We pay for SCCM but barely leverage it.

Zaepho
Oct 31, 2013

SSH IT ZOMBIE posted:

Our MDT and SCCM servers are not integrated.

This is the first problem to rectify. Just do it properly in SCCM integrated with MDT and you'll get wonderful consistency between image builds and pushed software, the ability to use all of your SCCM-packaged apps in your task sequences, complete patching of your machines including all of the relevant software, and, and, and... etc.

Give your MDT guy a Gibbs slap and get him doing it right.

Sacred Cow
Aug 13, 2007
efb ^^^

SSH IT ZOMBIE posted:

Hm, what are some good ways to deploy software to PCs via SCCM as part of an MDT imaging process? Our MDT and SCCM servers are not integrated.
Since they're not integrated, I'm wondering if we should push the SCCM client as part of the MDT imaging process. Once the client is installed, a device record will show up in SCCM. Maybe then some sort of script could run on the client, asking whoever is imaging the PC which collection they want to put it in, and then kick off a machine policy update? But there'd be no good way for whoever is imaging the PC to check on the status of software deployment without going into SCCM directly...

I don't really work on desktops much, but I've been in SCCM a lot lately due to mergers and lack of staffing. I've got it set up so collections can be built off AD groups, and software comes down fine. It would just be good to somehow make it an easy part of the imaging process. I've been pushing client software for the servers I manage out to desktops through SCCM since it's so drat easy and straightforward.

We have a new desktop guy and it seems like he's doing silly things in MDT, having a ton of base images and pre-installing software in the image. We pay for SCCM but barely leverage it.

I haven't messed around too much with SCCM image deployments yet, but it's pretty trivial to get MDT integrated and import existing task sequences into SCCM. Is there any reason why you would want to keep them separate? You can assign some pretty specific permissions if you don't want the desktop guy futzing around in the other modules. If I remember correctly, the best practice is to use MDT to make a gold image, then use SCCM to deploy assigned application packages automatically as part of the TS, as opposed to trying to shoehorn it in with additional scripts. Your idea just sounds way more complicated than it needs to be, unless there's some sort of compliance issue getting in the way.

Sacred Cow fucked around with this message at 17:05 on Sep 27, 2014

SSH IT ZOMBIE
Apr 19, 2003
No more blinkies! Yay!
College Slice
Nope, no compliance issues. Guess we should look at this! Thanks.

FISHMANPET
Mar 3, 2007

Sweet 'N Sour
Can't
Melt
Steel Beams
So what can MDT do for OS installs that SCCM can't do on its own? I've found SCCM to be generally satisfactory for installs.

dox
Mar 4, 2006

FISHMANPET posted:

So what can MDT do for OS installs that SCCM can't do on its own? I've found SCCM to be generally satisfactory for installs.

The general rule of thumb is to use MDT to build your reference images and then use SCCM for deployment.

FISHMANPET
Mar 3, 2007

Sweet 'N Sour
Can't
Melt
Steel Beams
I guess I'm not even sure what the point of a reference image is. Software and updates move so quickly that you'd either be rebuilding the image every day, or by the time you actually deploy it you'd have to reinstall all the software in the image because it's out of date.

We use SCCM to integrate updates into the base Win 7 image, so I don't know where we can get much more time savings. If I were deploying a huge pile of machines at once I might want an image, but if I want something where I can drop a random machine on my desk and deploy an OS to it, I don't know what a "reference image" would get me.

Number19
May 14, 2003

HOCKEY OWNS
FUCK YEAH


I rebuild my reference images when there's a major feature I want to add that can't be offline serviced. A good example is when we moved to the .NET Framework 4.5.1: Windows 7 can't install that update offline, and installing it during deployment adds a lot of time and extra update rounds. It's better to just do a build and capture with it included.

Upgrading Internet Explorer versions is another good example of why you'd want to do this.
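For the stuff that can be serviced offline (regular updates, mostly), it's just DISM against the mounted WIM, something along these lines; the paths and KB number are examples, not real ones:

code:
  # Offline servicing sketch - fine for most MSU/CAB updates, but not for things
  # like .NET 4.5.1 or an IE upgrade, which is why those force a rebuild.
  Dism /Mount-Wim /WimFile:D:\Images\Win7-Reference.wim /Index:1 /MountDir:C:\Mount
  Dism /Image:C:\Mount /Add-Package /PackagePath:D:\Updates\Windows6.1-KB1234567-x64.msu
  Dism /Unmount-Wim /MountDir:C:\Mount /Commit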

thebigcow
Jan 3, 2001

Bully!
NIC drivers that aren't part of Windows for some reason. It doesn't have to be anything fancy.

FISHMANPET
Mar 3, 2007

Sweet 'N Sour
Can't
Melt
Steel Beams
We have driver packages for each model of hardware we support, and SCCM applies the correct one based on the computer model.

I don't see why I'd want to inject every NIC driver under the sun into my driver store. And also it's a lot faster and easier for me to build a new driver package (which I'd have to do anyway, I can't put every driver for anything ever into the reference image) than it is for me to build a new image. Also isn't a reference image supposed to be hardware agnostic and not have any drivers?
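The model match is just WMI under the hood; each Apply Driver Package step carries a WMI condition keyed on the model string (the model below is an example, not one of ours):

code:
  # The string the task sequence condition keys off comes straight from WMI:
  (Get-WmiObject -Class Win32_ComputerSystem).Model

  # The condition on each Apply Driver Package step is a WQL query along the lines of:
  #   SELECT * FROM Win32_ComputerSystem WHERE Model LIKE "%Latitude E7440%"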

GreenNight
Feb 19, 2006
Turning the light on the darkest places, you and I know we got to face this now. We got to face this now.

FISHMANPET posted:

Also isn't a reference image supposed to be hardware agnostic and not have any drivers?

Yes, that's why the best way is to create a virtual machine and then capture that. Then when you want to update your reference image, you boot up the VM, make your changes and updates and then re-capture and shut it down.

thebigcow
Jan 3, 2001

Bully!
NIC drivers are helpful for letting your new machine communicate with whatever is installing everything else. I don't know how much of a problem this still is but Broadcom and XP :(

FISHMANPET
Mar 3, 2007

Sweet 'N Sour
Can't
Melt
Steel Beams
Yeah, but SCCM takes care of that.

As far as I'm aware, a reference image should be completely hardware-agnostic. When running the task sequence in SCCM, I apply drivers based on the computer model. So, booted into the SCCM boot image, it stages the image on the drive, then injects the appropriate drivers into that image, then reboots into the OS to install, and it has all the drivers it needs. I'd be doing that anyway, so even if it weren't best practice to keep NIC drivers out of the image, I don't understand what I'd gain by putting them in the reference image, since by running it in a task sequence I'm injecting the right drivers (and only the right drivers).


BaseballPCHiker
Jan 16, 2006

One thing MDT is nice for is lite touch deployments: you can add all of your software options, which OU to put the PC in, and some other settings, and leave it to the tech to configure as they image the machine. That's nice for us since we have machines that get wildly different software setups depending on their department.

  • Reply