Zaepho
Oct 31, 2013


Wow... as horrible as that file name is...

Bitlocker Drive Encryption HardDrive ConFiGuration.exe


Zaepho
Oct 31, 2013

orange sky posted:

Yep I know it by heart because it (kind of) makes sense.

Does anyone here know how to setup a DirectAccess lab using NAT (No public IP)? My head is hurting just thinking about how I'm going to set up this poo poo and test it properly.

You can do it no problem; just tell it that it's behind a NAT and it's OK with it. You'll still want to do a dual-NIC setup to get the full effect. I've seen a pretty good blog entry on doing just this; I'll have to see if I can dig it up. Also, do it on 2012 R2 and plan to stick a hardware load balancer in front of a couple of servers. This will make your life much better, since it's not a Windows server connected directly to both the internet and your internal network.

This is probably the best article on building a lab for DA: http://blogs.technet.com/b/meamcs/archive/2012/05/14/windows-server-2012-direct-access-part-2-how-to-build-a-test-lab.aspx It's not amazing, but it should give you enough to get started.
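
If you'd rather script the role setup in the lab than click through the wizard, something like this is a rough starting point. This is a hedged sketch: the public FQDN and GPO names are placeholders, and the cmdlet/parameter names are as I remember them from the 2012 R2 RemoteAccess module, so verify them with Get-Command and Get-Help before relying on it.

# Minimal single-server DA install for a lab box sitting behind NAT (Server 2012 R2).
# "da.contoso.com" and the GPO names below are placeholders for your lab values.
Install-WindowsFeature DirectAccess-VPN -IncludeManagementTools

Install-RemoteAccess -DAInstallType FullInstall `
    -ConnectToAddress "da.contoso.com" `
    -ClientGPOName "contoso.com\DirectAccess Client Settings" `
    -ServerGPOName "contoso.com\DirectAccess Server Settings"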

Zaepho fucked around with this message at 18:24 on Nov 20, 2014

Zaepho
Oct 31, 2013

orange sky posted:

I'd love to take a look at that blog entry, thanks. This will go into production with NAT though, so why would I use a dual nic setup, load balancing? What concerns me the most is setting up the NAT, since I'm using a lovely 3G pen. I should probably try this at home with my decent router and NAT port 443 to the internal switch IP used on the VM right?

If you can do the NAT bit in your lab, definitely do it. The dual-NIC setup would be for if you can't make the NAT parts work in your lab. We run DA in a single-NIC setup behind NAT in our environment and it works great.

Aside from that, do at least a 2-node cluster for the DA servers and a highly available Network Location Server setup (NLS is just any SSL website accessible internally but not externally or over DA). Putting the DA boxes behind a hardware load balancer with "sticky" sessions is the way to go from the outside. Just don't try to terminate the SSL session on the load balancer.

DA is awesome but it's certainly different from traditional VPN solutions (in a good way I would suggest). If you can manage to get IPv6 running internal to your network, it's even more awesome.

Zaepho
Oct 31, 2013

orange sky posted:

Do any of you guys have some info on the advantages/disadvantages of using DirectAccess with Public IP's / Behind Edge / Using only one interface behind edge? As in, what exactly do we win or lose by choosing one option and not the other. You'd really do me a huge favour if you had something.

E: I've found it. Teredo and 6to4 won't work. I don't know why the gently caress Microsoft decided not to document stuff properly anymore, I have to go through technet and hope the loving search engine turns up what I want in that hellhole of a place, loving hell it pisses me off.

Ended up reading it in a book called "Directaccess best practices and troubleshooting".

Behind Edge. Don't place Windows servers directly on public IPs unless you absolutely have to; just push port 443 back. Like you found, you lose Teredo and 6to4, both of which are unnecessary.

Zaepho
Oct 31, 2013

lol internet. posted:

Does Windows Storage Pools/ReFS do anything magical in performance compared to hardware raid? (Home solution.)

Performance? No. ReFS is great because it's allocate-on-write and checksums its metadata (and file data too, if you enable integrity streams), so corruption gets detected instead of silently returned, and on a mirrored Storage Space it can be repaired from the good copy.

Storage Pools mean the drives don't all have to be the same size or model, which is pretty awesome, and you can swap drives around pretty easily.
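
If you want to skip the GUI, the whole thing is a handful of cmdlets. A minimal sketch, assuming you have at least two poolable drives for a mirror; the pool, disk, and volume names are placeholders:

# Pool every eligible drive, then carve a mirrored ReFS volume out of the pool.
$disks  = Get-PhysicalDisk -CanPool $true
$subsys = (Get-StorageSubSystem | Select-Object -First 1).FriendlyName

New-StoragePool -FriendlyName "HomePool" -StorageSubSystemFriendlyName $subsys -PhysicalDisks $disks

New-VirtualDisk -StoragePoolFriendlyName "HomePool" -FriendlyName "Data" `
    -ResiliencySettingName Mirror -UseMaximumSize

Get-VirtualDisk -FriendlyName "Data" | Get-Disk |
    Initialize-Disk -PassThru |
    New-Partition -UseMaximumSize -AssignDriveLetter |
    Format-Volume -FileSystem ReFS -NewFileSystemLabel "Data"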

Zaepho
Oct 31, 2013

GreenNight posted:

I meant the SCCM client on a server, not the server software suite.

As I understand it (and as far as I'm concerned, MS licensing is a black art designed so that Microsoft can charge anyone anything they want), the System Center tools are no longer licensed individually, only as the entire suite. So you get SCOM, SCCM, DPM, etc. all rolled into the one license.

It makes a great case for me as a consultant to be able to say "Hey, since we just rolled SCOM out to all of your servers, let's roll out SCCM, get rid of X Other Patching Product, and save you some money! *cough*so you can pay me some more*cough*"

This works really well when you license the physical servers with DataCenter and run a shitload of VMs on them. The DataCenter physical license passes down to all of the virtual guests.

Zaepho
Oct 31, 2013

BaseballPCHiker posted:

Is there a way in SCCM 2012 to make a package and deployment use a specific distribution point? I setup a secondary dp and uploaded the content and when I deployed the software update package it seemed to still be going over our VPN to the primary site. I thought it would just automatically take the content from the closest dp but I guess not.

I get so frustrated working with SCCM. I'm trying to educate myself on it as much as I can but without work scheduling class time due to how busy we are I'm left to books and blogs which have only taken me so far.

Content location is dependent on the boundary group that the DP is assigned to and which Boundary Group the client falls into at the time. Plus, add in the ability to fall back to another DP if the content is not available within the current boundary group.

What we do for boundary groups on our engagements is create "Site Assignment" boundary groups that are used only for assigning clients to the correct site. Second, we set up "Content" boundary groups that are used expressly for directing clients to the appropriate DP for their location.
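
If you'd rather script that split than click through the console, the ConfigMgr 2012 R2 PowerShell module has cmdlets for it. A hedged sketch only: the subnet and names are placeholders, the cmdlet/parameter names are from memory of that module (check Get-Help), you run it from the console's PS drive (e.g. your site code drive), and the DP still gets referenced on the boundary group in its properties:

# Create a boundary for a branch subnet and drop it into a content-only boundary group.
New-CMBoundary -DisplayName "Branch Office Subnet" -BoundaryType IPSubnet -Value "10.20.30.0/24"
New-CMBoundaryGroup -Name "Content - Branch Office"
Add-CMBoundaryToGroup -BoundaryName "Branch Office Subnet" -BoundaryGroupName "Content - Branch Office"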

Zaepho
Oct 31, 2013

BaseballPCHiker posted:

Thanks for the tip. Gives me something to look into. Do you have any general books or sites to recommend? I've been reading the windows-noob forum guides, the deploy-happiness blog, and bought the System Center mastering the fundamentals book as well.

I would echo what the others have said and add that the MyITForum community is a great resource for SCCM. It's your best link to pretty much every SCCM MVP out there. Try to make it out to Ignite and see if you can network your way into chatting with some of the SCCM community big names, and you'll get a LOT of information that you'll never get from any book or training class.

SCCM is a tool you just have to work with and eventually it will click and things will start to make more sense. Unfortunately there's a lot of stuff to understand at the foundation level to be able to make the most of SCCM so it'll take some time.

Zaepho
Oct 31, 2013

BaseballPCHiker posted:

Anyone have any recommendations on MSI building software? I've been using Orca for a bit but feel like there's got to be some paid software out there that will work better.

WiX is great since you can check the MSI build XMLs in with the rest of your code. The devs at a software company I worked for absolutely loved it. NAnt would build all the code, run all of the tests, build the installers, and drop them for final testing and acceptance, and bam, out the door they would go. Once it was set up and running, it was easy to distribute stuff as well as easy to update installer processes.

http://wixtoolset.org/
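
For reference, the build step itself is just the two WiX command-line tools, which is why it slots into an NAnt or MSBuild target so easily. Assumes the WiX bin directory is on your PATH and Product.wxs is a placeholder file name:

candle.exe Product.wxs
light.exe Product.wixobj -out Product.msi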

Zaepho
Oct 31, 2013

Tab8715 posted:

The gently caress? Was this an official Microsoft step?

As a troubleshooting step this is fine. It tells you IF the firewall is a factor. You can then choose to fix the issue with the firewall, since you now know it's a firewall issue. Pretty typical process-of-elimination troubleshooting in my book.

Zaepho
Oct 31, 2013

5er posted:

I'm way ahead of you, and the wisdom you speak is something I have to dispense almost daily to others. I just wanted to know if the problem as I described it is fixable, because some dumb gently caress is going to break things along the lines I described and I might get looked at to un-gently caress it up.

Ideally the system drive should be mirrored (at the hardware level), THEN build your Storage Space with all of the other drives using at least a parity space, and replace drives as needed there. Don't use anything from the OS disk in the storage space.

Zaepho
Oct 31, 2013

alanthecat posted:

Interesting. What does it do that Group Policy shouldn't do? (i.e. why aren't Microsoft baking those features into GP?)

Application/system component installations. For instance, you could have a DSC policy that says ServerB is a MyApp web frontend. Apply the policy and PowerShell can install all of the prerequisites, install and configure IIS and MyApp, and even add it to the load balancer if you code it up that way.
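
As a rough idea of what that looks like (a minimal sketch: "ServerB" and the feature list are placeholders, and the MyApp/load-balancer bits would be custom or Script resources you'd write yourself):

Configuration MyAppFrontend {
    Import-DscResource -ModuleName PSDesiredStateConfiguration

    Node "ServerB" {
        # Declare the end state; the Local Configuration Manager makes it so.
        WindowsFeature IIS {
            Name   = "Web-Server"
            Ensure = "Present"
        }
        WindowsFeature AspNet45 {
            Name   = "Web-Asp-Net45"
            Ensure = "Present"
        }
    }
}

MyAppFrontend -OutputPath C:\DSC          # compile the configuration to a MOF
Start-DscConfiguration -Path C:\DSC -Wait -Verbose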

Zaepho
Oct 31, 2013

Hadlock posted:

If I want to do two domains on the same LAN I need to setup each device manually for each device that is domain joined right? And then DHCP for IoT that don't care about domain and just want to talk to the internet?

I have a home router running DD-WRT as the gateway that also serves up a local domain (DOMAIN_A), DHCP for the .100 subnet and DNS on 192.168.100.1

I want to setup a Primary DC for DOMAIN_B on 192.168.200.1, do I set it up with it's own DHCP, NetBIOS, but use 192.168.100.1 as the gateway? And then each domain-joined workstation/server gets hard-coded to use 192.168.200.1 for DNS + NetBIOS and just leave DHCP off for my domain-joined devices?

Trying to think of the best logical way to do this. I would prefer to keep DNS and DHCP enabled on the router as I have about 15 devices and I cycle through various linux/windows VMs etc without needing all of them to be domain-joined.

If you configure DD-WRT to forward DNS requests for specific domains to the DCs for those domains, you'll be fine. This basically involves setting up conditional forwarding (stub-zone style entries) in the DD-WRT DNS config. No need for manual config on the domain-joined devices; DNS will handle it just like it's supposed to.
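
Assuming the stock dnsmasq that DD-WRT uses for DNS, that's a one-liner in the Additional DNSMasq Options box; the domain name and DC IP here are placeholders for your DOMAIN_B values:

server=/domainb.local/192.168.200.1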

Zaepho
Oct 31, 2013

Hadlock posted:

sounds like I need to research DNS further.

This is never a bad idea. Between everything today being heavily reliant on DNS and IPv6 "Coming Soon™," when you will never be able to memorize all of the important IP addresses anymore, DNS is a foundational technology that you should have a really solid understanding of.

Zaepho
Oct 31, 2013

Cosmic D posted:

Also they still think it's fine to ask for a user's password so they don't have to go deskside and setup everything. Being new to the industry, is this acceptable business practice?
NO!!! gently caress NO! loving FUCKITY gently caress NO!

This teaches users it's OK to give up all of their usernames, passwords, social security numbers, small children, etc. to anyone who purports to be "The Helpdesk". Like those guys who cold call from "Microsoft" because "Your computer has a virus! We're here to help fix it! Because we're Microsoft!" Don't train idiot users to be bigger idiots!

Zaepho
Oct 31, 2013

ghostinmyshell posted:

Does anyone know if this live event was recorded and put up somewhere to view? I had to miss it :smith:

http://www.microsoftvirtualacademy.com/liveevents/getting-started-with-powershell-desired-state-configuration-dsc

I too had to miss it and was hoping to see a cool email in my inbox with a recording of the session. It hasn't shown up yet, so I'm guessing it's probably not going to happen. I would really love one, though.

Zaepho
Oct 31, 2013

BaseballPCHiker posted:

Anyone ever have any experience trying to deploy a vbscript through sccm? I've done plenty of batch files when needed but this is my first vbs I've gotten stuck with. The company uses this ancient script to pull info from AD and create an email signature that saves locally in users Outlook profiles. I tried in vain to explain how a hub transport rule would be much more simple, elegant and thorough with no luck. If I run cscript.exe //nologo \\server\share\sig.vbs and deploy it as mandatory it says that it runs successfully but I dont actually see the signature applied. I know the script works if I run it manually. Really want to re-do this thing in powershell or use the transport rule but cant convince the higher ups to let me spend the time to do it.

Is the package running interactively with the logged-on user? If not, System has an awesome sig everywhere on your network!

Remember that the SCCM agent runs as System and by default runs all packages within its own context, NOT the user's session.
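
A quick way to see what the script actually does in that context is to run it from a SYSTEM shell yourself on a test box. Assumes you've downloaded Sysinternals PsExec; the path is a placeholder:

# From an elevated prompt (PsExec downloaded locally):
.\PsExec.exe -s -i cmd.exe
# ...then, inside that SYSTEM cmd window, run the same command SCCM would:
# cscript.exe //nologo \\server\share\sig.vbs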

Zaepho
Oct 31, 2013

Hadlock posted:

Am I completely crazy here to want to run these 70% of machines headless? What can I do or say to explain the benefits here? Lots of resistance since this is not "the way we've always done it".

The goal is 100% headless. Start with the 70% and then work towards eliminating apps that need to be logged in on the console to function.

Zaepho
Oct 31, 2013

mayodreams posted:

The big sticking point for us is that 90% of our user population is still on XP and a lot of these 'apps' are only 16/32 bit so we expect a lot of pain with the 8.1 migration this year.

Any pain on XP can easily be mitigated by $10,000/incident support calls! Rack up a few of those and the finance guys will untwist a lot of panties for you in a drat hurry.

Zaepho
Oct 31, 2013

FISHMANPET posted:

I'm gonna post this in the Storage thread too, but has anyone seen problems with slow storage performance on Server 2012 R2? I've got an open case with Microsoft but we're a month in and still seem to just be flailing randomly at even identifying a problem. I've heard mumblings of others having problems, but wondering if anyone has noticed anything.

What kind of storage and what kind of slowness are we looking at?

Zaepho
Oct 31, 2013

Tony Montana posted:

This is Server 2012. So the fsmo seize worked but AD is broken and it's acting as if it's not a domain controller. Now Server 2012 through the Generation ID attrib which is exposed in VMWare (5.5 is our prod) knows that it's been cloned and it's not the VM it was. This kicks off the 'Directory Security' features which were recently introduced, but that shouldn't prevent it operating as a DC. However I'm getting some errors in the DNS logs about AD being broken so DNS can't start properly and AD logs about DNS being broken so it can't start properly.

Before I mess around with it again and try and tease out what I'm missing, anyone know a step I might have missed? I updated the DNS server config pulling out the forwarders it can't access (any of the them, because it's an 'air-gaped' bubble). Running a dcdiag and it has a poo poo about all the DCs it can't now see but it says it's ok and it passes the DNS tests.

Keep calm and ignore it for a half hour or so. AD does this crazy thing where it tries to do an initial sync from another DC after it boots up, before it starts acting as a DC. Add in the fact that DNS won't start until AD does and you're in for a good time! After about half an hour AD gives up trying the initial sync and starts, allowing DNS to start up as well. Clean out the rest of the domain controllers in your "new" lab domain and you should be able to avoid the start-up delay in the future.
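
If you don't want to wait it out every boot, the commonly cited workaround for an isolated lab DC is to tell AD to skip the initial synchronization requirement. A hedged sketch, lab/bubble DCs only; don't leave this set on production domain controllers:

# Skip the "initial synchronization" requirement at NTDS startup (lab use only).
New-ItemProperty -Path "HKLM:\SYSTEM\CurrentControlSet\Services\NTDS\Parameters" `
    -Name "Repl Perform Initial Synchronizations" -Value 0 -PropertyType DWord -Force
Restart-Computer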

Zaepho
Oct 31, 2013

BaseballPCHiker posted:

Yeah going from SP1 to R2 looks relatively harmless. In any event what I'm ending up doing is setting up a new VM with more resources thrown at it then the old server and installing SCCM 2012 R2 CU4 on it. Once I get MDT and WAIK, WSUS and eveything else setup then I'll start removing roles from the current primary site and adding them to the new one. If I'm reading all of the technet articles correctly I can then create a boundary group and add the second new server. Once all of the roles are reassigned to the new server and the clients are pointing there I can decommission the old one. I hope it works like that at least.

You're making things a little harder on yourself than needed. Stand up a brand new site (NEW SITE CODE!!!) with all the roles that it needs, all configured and happy. Nuke the boundary groups on the old site, add them to the new site, then do a client push forcing repair/reinstall of the client.

If you want to slow-roll it, you can "move" only some of the boundary groups. Most important is to never allow the boundaries between sites to overlap. The agents get all pissy when that happens.

I definitely recommend "green field" type upgrades for the most part on SCCM. The exception is that you may want to use the migration tool to bring over any application packages, OSD task sequences, images, etc. you have created.

Zaepho
Oct 31, 2013

Sheep posted:

I'm still trying to figure out what to do for our company since we don't have Azure and would love some ideas.

I still really like DirectAccess on Server 2012 R2 with Windows 8.1 clients. For the overwhelming majority of the time, it just works. Being able to sit down with any internet connection that allows HTTPS outbound and just be on the internal network as if I were in the office is priceless. We still have AnyConnect through our ASA as a backup traditional VPN in case the DA box goes down (I didn't bother to cluster it since we're small and it's not critical for the majority of our users), but I haven't had to use that in probably a year or more. I can talk to servers as if I was in the office, I get patches and software deployments, and I can change my AD password. It all just works for us.

Zaepho
Oct 31, 2013

Tab8715 posted:

I'm setting up a Sharepoint farm but I'm getting stuck with this user that's a domain admin and needs permissions to create computer objects. In ADUC I'm selecting advanced, user properties, security, advanced. In the Permission Entry I'm selecting add, choosing the same user but there isn't an option for Create Computer Objects.

I think I might have given that to this user before but I don't understand why it wouldn't show up?

I think you're looking for delegation not security.

Right Click an OU (or the domain itself), choose Delegate Control and walk through the wizard.

Zaepho
Oct 31, 2013

stevewm posted:

Maybe someone has an idea how to achieve this here...

I have several users that utilize some design software that constantly updates itself. These updates require UAC elevation to work. Normally I do not allow local admin access anywhere, but had to make an exception for these users.

To that end I created a GPO to add a security group "Workstation Admins" to the builtin\administrators group. Any users I want to give local admin I add them to this security group. The problem is this then allows them to access the admin$ or c$ administrative shares on any domain joined machine! Edit: We utilize the admin shares for pushing software, so cannot disable them.

Is there any way to make it so local administrators cannot access administrative shares? Or prevent them from accessing domain network resources?

Typically the app needs elevation because it is writing to a protected location (in this case probably Program Files). Grant the local Users group full control over the application's folder and see if that removes the need for elevation when updating this app.
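
Something along these lines would do it, pushed out via a script or via GPO file-system permissions. A minimal sketch: the install path is a hypothetical placeholder, and Modify rights are usually enough rather than Full Control:

# Give local Users modify rights on the app folder (and everything under it)
# so its self-updater stops needing UAC elevation.
$path = "C:\Program Files\DesignApp"      # hypothetical install folder
$acl  = Get-Acl -Path $path
$rule = New-Object System.Security.AccessControl.FileSystemAccessRule(
    "BUILTIN\Users", "Modify", "ContainerInherit,ObjectInherit", "None", "Allow")
$acl.AddAccessRule($rule)
Set-Acl -Path $path -AclObject $acl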

Zaepho
Oct 31, 2013

Gerdalti posted:

I want to learn SCCM (and later, SCOM). Where should I start? I don't seem to be able to focus on just browsing the info on Technet (it's organized poorly IMO).

Along with the other good info above, check out MyITForum.com, specifically their SCCM email list. Really great information goes through there and it's populated by a ton of SCCM MVPs. In general the SCCM community is massive and generally very helpful.

The most important thing to know about SCCM is that the speed at which it operates is inversely proportional to how much you want something to happen. Trying to push that critical 0-day patch to everyone in the company? It's going to take its sweet, sweet time. Accidentally deploy a Windows 7 image to all servers? Lightning fast!

SCOM is a bit more difficult to find great info on. There are lots of blog articles on specific stuff but no great centralized location for good info, and the community is pretty weak, as it's largely corporate guys who do SCOM as a part of their normal job rather than as a specialty. Set it up, monitor some stuff, and fiddle with it until you "get" how the classes, discoveries, monitors and such work, then start doing some low-level MP authoring to ensure your liver is properly damaged.

Zaepho
Oct 31, 2013

GreenNight posted:

Makes sense. None of the updated imaging tools are out yet for Windows 10.

I thought I saw an MDT release at a minimum.
SCCM 2016 should be dropping pretty quickly. Last I heard the plan was within 90 days of the Windows 10 Launch.

Edit: I'm terrible. It was the preview of MDT that had Windows 10 (LTI only) support.

Zaepho
Oct 31, 2013

Potato Salad posted:

With this thread covering enterprise topics, is it the de-facto destination for SCCM discussion?

Everything I'm reading about application vs package deployment points to application catalog deployments lacking the ability to install upon winlogon -- as is possible in gpo or sccm package deployment. Being somewhat new to the sccm 2012 scene, I'm left scratching my head a little regarding precisely why.

The Application Deployment Evaluation Cycle is what triggers application deployments, and it runs on a different schedule without tying into winlogon (that doesn't answer the "why" part... Microsoft?). Unless you absolutely have to have it run at winlogon, use an app. For your own sanity's sake and all that is frigging sacred... USE APPS!! Unless you have a VERY compelling reason (reason, not excuse) not to.

I'm willing to bet it was a feature they felt was either unnecessary or they were unable to get it in before their deadlines (To ship is to Choose).
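
If the wait on that evaluation cycle is the annoyance, you can kick it off on demand on a client instead of waiting for the schedule. A hedged sketch; the GUID is the schedule ID commonly listed for the Application Deployment Evaluation Cycle, so double-check it against your client agent documentation:

# Fire the Application Deployment Evaluation Cycle on the local ConfigMgr client.
Invoke-WmiMethod -Namespace "root\ccm" -Class SMS_Client -Name TriggerSchedule `
    -ArgumentList "{00000000-0000-0000-0000-000000000121}"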

Zaepho
Oct 31, 2013

Moey posted:

Speaking of patching, is there a good solution that you folks advise for managing patches on remote computers? For in house stuff, WSUS fits my needs, but we have a ton of laptops floating around that do not connect back to our network too often.

SCCM with Internet-Based Client Management, with bonus features for everything else internal and external. Or... Intune, I guess? Those are the MS solutions.

Zaepho
Oct 31, 2013

johnnyonetime posted:

Here's a great article on how to encrypt the drive during OSD. After thinking about it I think this might be what you are asking for:
http://www.windows-noob.com/forums/...nager-2012-sp1/

Just a note here: this significantly increases the time it takes to complete an OSD, BUT your drive is already encrypted when it's done. Not an issue per se, but something to at least consider/be cognizant of.
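
If the extra OSD time bothers you, one thing worth testing is encrypting only the used space up front and letting the rest fill in later; on Windows 8/Server 2012 and newer the BitLocker cmdlets expose that directly. A minimal sketch, assuming a TPM is present and C: is the OS volume:

# Encrypt only used space so the initial encryption pass finishes much faster.
Enable-BitLocker -MountPoint "C:" -EncryptionMethod Aes256 -UsedSpaceOnly -TpmProtector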

Zaepho
Oct 31, 2013

Walked posted:

I'm a non-DBA helping figure out some DBA type tasks.

I'd like to setup a centralized performance logging database for SQL Server performance counters of the server. (e.g. disk queue length, processor utilizations, etc)
Ideally put into a database where we can generate reports or query based on time to correclate when an issue is reported by users/developers.

Documentation on approach to this seems kinda thin; there's some info about using perfmon and piping it to a text file or CSV and importing it; and a few articles about using ODBC connections to write to a database, but those seem out of date and perhaps a bit inelegant.

Any pointers? Happy to put forward a product that will do this for us. We dont need hyper-granular information, nor a ton of performance counters (in fact we've had no issues on that front, I'm just trying to help be proactive).

Any decent general system monitoring solution should do at least most of this. Assuming MSSQL and a Windows environment, take a look at System Center Operations Manager (SCOM) and specifically the SQL Management Pack for health monitoring as well as some general performance monitoring. You can also add pretty much anything else you might want to monitor into it (additional perf counters or logging wait types, for instance).
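
If standing up SCOM is more than you want to bite off right away, a stopgap collector using the built-in counter cmdlets gets you queryable history cheaply. A minimal sketch; the counter list, interval, and output path are placeholders, and you'd bulk-load the CSV into a table for your time-based reports:

# Sample a few counters every 15 seconds and append them to a CSV you can BULK INSERT later.
$counters = @(
    "\Processor(_Total)\% Processor Time",
    "\PhysicalDisk(_Total)\Avg. Disk Queue Length",
    "\Memory\Available MBytes"
)
Get-Counter -Counter $counters -SampleInterval 15 -Continuous |
    ForEach-Object { $_.CounterSamples } |
    Select-Object Timestamp, Path, CookedValue |
    Export-Csv -Path "C:\PerfLogs\sql-host-counters.csv" -Append -NoTypeInformation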

Zaepho
Oct 31, 2013

GreenNight posted:

I just wish you can download CU via Windows Update instead of RSS'ing this dudes site.

https://buildnumbers.wordpress.com/

You mean like the entire rest of the System Center Suite? Me too. SCCM has to be extra special though and do their own poo poo, on their own schedule. Much like the software itself (SMS = Slow Moving Software).

In fact I always say that the speed of any process in SCCM is inversely proportional to how much you want it to happen. Trying to push that critical 0-Day patch to every machine in the enterprise? SLLLLOOOOWWWWW! Accidentally deploy Windows 10 to "All Systems", poo poo poo poo poo poo it already happened.

I too have a love/hate relationship with SCCM. Infinite Cosmic Power! Enormous Pain in the rear end!

Zaepho
Oct 31, 2013

Walked posted:

two systems that have had this in common is they were both SCCM clients, both inactive state, and both disconnected from the domain for 60days (yay developers hoarding old hardware "just in case")

Use the maintenance task to nuke the client flag on crap that hasn't sent a heartbeat in a couple of weeks (beyond the max vacation time), and the limits on AD Discovery to not discover machines that haven't logged in for a similar period of time. Keeps your deployments and collections a LOT cleaner.

Machines that magically return will re-register and continue on like normal, so no major issues there (unless you're using a lot of packages and aren't handling whether things are installed or not in some fashion. Don't use packages if you can do it as an app).

Zaepho
Oct 31, 2013

FISHMANPET posted:

Can anyone explain to me what Orchestrator is for? Not what it does, I get that, but really, [i]what is it for?[/i]

It gives you this fancy interface for automating tasks, but more often than not it just ends up being that the integration packs don't do what you want, so you're just back to writing Powershell code anyway. At which point I'd rather just write clean powershell code instead of all the nonsense it takes to shove a script into Orchestrator.

You understand it completely.

The vision is drag-and-drop automation with Integration Packs providing all of the actions you need. The reality is, as you have already said, a tool to string together PowerShell scripts. Granted, there are some benefits to having Orchestrator as the launch platform for them, such as allowing unprivileged users to do tasks that would require elevated privileges, or using it as an integration piece with Service Manager (building workflows into SCSM MPs is horrific compared to simply connecting up a runbook activity).

Zaepho
Oct 31, 2013

NevergirlsOFFICIAL posted:

I used this regularly and it's great but I never scripted it, just used gui

Page 41 of their user guide is "Automating Enterprise Migrations," so it looks like a winner.

Zaepho
Oct 31, 2013

skipdogg posted:

No, but it's still generally accepted best practice, and Microsoft recommended to only use Office 64bit if you have an extremely good reason to (Excel datasheets of massive sizes). Otherwise 32bit Office Apps are still recommended. Too many legacy plugins, code, and other bullshit. Just because something is more bits doesn't make it better.

Without crazies running 64-bit there would be no pressure on the plugin/code/bullshit providers to update to support 64-bit. That said, I'm giving in and ripping out Office 64-bit and installing 32-bit on my laptop, at least until my next rebuild.

Zaepho
Oct 31, 2013

skipdogg posted:

Smaller shops that don't have KMS and the like setup would have to install and activate everything manually. Or at least script it anyway. Before I setup MDT I had a big post image script that did all sorts of things for my environment.

MAK keys work just fine in Office and can auto-activate with no problem. The key just gets baked into the Office install MSP.

Zaepho
Oct 31, 2013

LmaoTheKid posted:

Would I be better off adding an existing AD setup to my domain from another company with 5 or 6 employees or just starting over with them on our current setup?

Is there any way I can map their new profile locally to the old one?

You can do this with ADMT but it might be more effort than it's worth.

With ADMT you would build a trust between the 2 forests, use ADMT to build the AD Accounts and "migrate" the workstation. The workstation migration piece will flip the domain and re-ACL/re-point the profiles to the new SIDs.

It's not a quick and easy thing to do. With 5-6 users you're probably better off issuing a new machine and copying profile contents from the old machine to the new one.

Zaepho
Oct 31, 2013

LmaoTheKid posted:

Instead of a new machine, anything I should look out for by leaving the old domain, joining the new one, and copying over the profile locally?

Not really. The only annoyance here is the possibility of wonky settings from old GPOs (there probably aren't any with this few users) and the fact that there is no really great backout if things go pear-shaped. Granted, you should be able to push forward and recover without too much pain in this case.


Zaepho
Oct 31, 2013

Docjowles posted:

I'd also settle for a way to do it via Group Policy if it's not possible to script. Anything but clicking around the GUI every time I bring up a new server.

I'm sure it's scriptable, but I usually just use GPOs to handle certificate auto-enrollment. Bonus: it will automagically handle renewals as well.

http://windowsitpro.com/security/setting-certificate-autoenrollment-feature-windows-public-key-infrastructure

You can also handle all of the WinRM stuff for PowerShell Remoting (I assume that's what your actual end goal is here) via GPO.
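
If you do want to script an enrollment instead of (or alongside) auto-enrollment, the PKI module on Server 2012+ can request against a template directly. A hedged sketch; the template name and DNS name are placeholders for whatever your CA actually publishes:

# Request a machine certificate from the enterprise CA and drop it in the local machine store.
Get-Certificate -Template "WebServer" -DnsName "server01.contoso.com" `
    -CertStoreLocation Cert:\LocalMachine\My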
