|
Thanks Ants posted:Whoever made the tool did http://technet.microsoft.com/en-us/library/ff829850.aspx Wow... as horrible as that file name is... Bitlocker Drive Encryption HardDrive ConFiGuration.exe
|
# ¿ Nov 20, 2014 17:17 |
|
|
orange sky posted:Yep I know it by heart because it (kind of) makes sense. You can do it no problem, just tell it that it's behind a NAT and it's OK with it. You'll still want to do a dual NIC setup to get the full effect. I've seen a pretty good blog entry on doing just this. I'll have to see if I can dig it up. Also, do it on 2012 R2 and plan to stick a hardware load balancer in front of a couple of servers. This will make your life much better since it's not a Windows server connected to both the internet and your internal network. This is probably the best article on building a lab for DA: http://blogs.technet.com/b/meamcs/archive/2012/05/14/windows-server-2012-direct-access-part-2-how-to-build-a-test-lab.aspx It's not amazing but should give you enough to get started. Zaepho fucked around with this message at 18:24 on Nov 20, 2014
# ¿ Nov 20, 2014 18:12 |
|
orange sky posted:I'd love to take a look at that blog entry, thanks. This will go into production with NAT though, so why would I use a dual nic setup, load balancing? What concerns me the most is setting up the NAT, since I'm using a lovely 3G pen. I should probably try this at home with my decent router and NAT port 443 to the internal switch IP used on the VM right? If you can do the NAT bit in your lab, definitely do it. The dual NIC setup would be for if you can't make the NAT parts work in your lab. We run DA in a single NIC setup behind NAT in our environment and it works great. Aside from that, do at least a 2-node cluster for the DA servers and a properly clustered Network Location Server setup (the NLS is just any SSL website accessible internally but not externally or over DA). Putting the DA boxes behind a hardware load balancer with "sticky" sessions is the way to go from the outside. Just don't try to terminate the SSL session on the load balancer. DA is awesome, but it's certainly different from traditional VPN solutions (in a good way, I would suggest). If you can manage to get IPv6 running internal to your network, it's even more awesome.
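For reference, a minimal sketch of the single-NIC-behind-NAT setup with the RemoteAccess PowerShell cmdlets. The hostname is illustrative, and the exact parameter set varies by version, so treat this as a starting point rather than a recipe:

```powershell
# Sketch only - da.example.com is a placeholder for the public name
# your edge device forwards TCP 443 to.

# Install the role and management tools
Install-WindowsFeature RemoteAccess -IncludeManagementTools

# Configure DirectAccess; behind NAT only IP-HTTPS will be offered,
# which is exactly the single-NIC scenario described above
Install-RemoteAccess -DAInstallType FullInstall `
    -ConnectToAddress da.example.com

# Sanity check the resulting configuration
Get-RemoteAccess
```

The NLS and GPO details still need to be filled in per the TechNet lab guide linked above.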
|
# ¿ Nov 20, 2014 18:35 |
|
orange sky posted:Do any of you guys have some info on the advantages/disadvantages of using DirectAccess with Public IP's / Behind Edge / Using only one interface behind edge? As in, what exactly do we win or lose by choosing one option and not the other. You'd really do me a huge favour if you had something. Behind Edge. Don't place Windows servers directly on public IPs unless you absolutely have to. Just push port 443 back. Like you found, you lose Teredo and 6to4, both of which are unnecessary.
|
# ¿ Dec 4, 2014 19:43 |
|
lol internet. posted:Does Windows Storage Pools/ReFS do anything magical in performance compared to hardware raid? (Home solution.) Performance? No. ReFS is great since it's journaled and each write is read back to verify correctness before the write is considered complete. Storage Pools mean that the drives don't all have to be the same, which is pretty awesome, and you can swap drives around pretty easily.
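A quick sketch of what that looks like in PowerShell. Pool and volume names are made up, and you'd obviously adjust resiliency to taste:

```powershell
# Sketch - pools whatever mixed-size disks are eligible, then carves a
# mirrored space formatted ReFS. Names are illustrative.
$disks = Get-PhysicalDisk -CanPool $true

New-StoragePool -FriendlyName "HomePool" `
    -StorageSubSystemFriendlyName (Get-StorageSubSystem).FriendlyName `
    -PhysicalDisks $disks

New-VirtualDisk -StoragePoolFriendlyName "HomePool" -FriendlyName "Data" `
    -ResiliencySettingName Mirror -UseMaximumSize

# Bring the new virtual disk online as an ReFS volume
Get-VirtualDisk -FriendlyName "Data" | Get-Disk |
    Initialize-Disk -PassThru |
    New-Partition -UseMaximumSize -AssignDriveLetter |
    Format-Volume -FileSystem ReFS
```

Swapping a drive later is `Remove-PhysicalDisk` / `Add-PhysicalDisk` against the same pool.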
|
# ¿ Jan 2, 2015 18:12 |
|
GreenNight posted:I meant the SCCM client on a server, not the server software suite. As I understand it (and as far as I'm concerned, MS licensing is a black art designed so that Microsoft can charge anyone anything they want), the System Center tools are no longer licensed individually, only as the entire suite. So you get SCOM, SCCM, DPM, etc. all rolled into the one license. It makes a great case for me as a consultant to be able to say "Hey, since we just rolled SCOM on all of your servers, let's roll out SCCM and get rid of X Other Patching Product and save you some money! *cough*so you can pay me some more*cough*" This works really well when you license the physical servers with Datacenter and run a shitload of VMs on them. The Datacenter physical license passes down to all of the virtual guests.
|
# ¿ Jan 7, 2015 17:28 |
|
BaseballPCHiker posted:Is there a way in SCCM 2012 to make a package and deployment use a specific distribution point? I setup a secondary dp and uploaded the content and when I deployed the software update package it seemed to still be going over our VPN to the primary site. I thought it would just automatically take the content from the closest dp but I guess not. Content location is dependent on the boundary group that the DP is assigned to and which boundary group the client falls into at the time, plus the ability to fall back to another DP if the content is not available within the current boundary group. What we do for boundary groups on our engagements is create "Site Assignment" boundary groups that are used only for assigning clients to the correct site. Second, we set up "Content" boundary groups that are used expressly for directing clients to the appropriate DP for their location.
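A rough sketch of that split with the ConfigMgr cmdlets, run from a ConfigMgr PowerShell drive. All names are illustrative and the DP-association parameter name is an assumption, so verify against your console version:

```powershell
# Sketch only - boundary/group/server names are placeholders.

# One group purely for site assignment
New-CMBoundaryGroup -Name "SA - All Clients" -DefaultSiteCode "PRI"

# One group purely for content lookup at the branch
New-CMBoundaryGroup -Name "Content - BranchOffice"
Add-CMBoundaryToGroup -BoundaryName "BranchOffice Subnet" `
    -BoundaryGroupName "Content - BranchOffice"

# Tie the branch DP to the content group (parameter name assumed;
# in the console this is the boundary group's References tab)
Set-CMBoundaryGroup -Name "Content - BranchOffice" `
    -AddSiteSystemServerName "dp01.example.com"
```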
|
# ¿ Jan 22, 2015 22:39 |
|
BaseballPCHiker posted:Thanks for the tip. Gives me something to look into. Do you have any general books or sites to recommend? I've been reading the windows-noob forum guides, the deploy-happiness blog, and bought the System Center mastering the fundamentals book as well. I would echo what the others have said and add that the MyITForum community is a great resource for SCCM. It's your best link to pretty much every SCCM MVP out there. Try to make it out to Ignite and see if you can network your way into chatting with some of the SCCM community big names, and you'll get a LOT of information that you'll never get from any book or training class. SCCM is a tool you just have to work with; eventually it will click and things will start to make more sense. Unfortunately there's a lot of stuff to understand at the foundation level to be able to make the most of SCCM, so it'll take some time.
|
# ¿ Jan 23, 2015 15:23 |
|
BaseballPCHiker posted:Anyone have any recommendations on MSI building software? I've been using Orca for a bit but feel like there's got to be some paid software out there that will work better. WiX is great since you can check the MSI build XML in with the rest of your code. The devs at a software company I worked for absolutely loved it. NAnt would build all the code, run all of the tests, build the installers, and drop them for final testing and acceptance, and bam, out the door they would go. Once it was set up and running, it was easy to distribute stuff as well as easy to update installer processes. http://wixtoolset.org/
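To give a feel for it, here's roughly what a minimal WiX v3 source file looks like. Product name, file, and GUIDs are all placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Illustrative only: replace PUT-GUID-HERE with real GUIDs. -->
<Wix xmlns="http://schemas.microsoft.com/wix/2006/wi">
  <Product Id="*" Name="MyApp" Language="1033" Version="1.0.0.0"
           Manufacturer="Example Co" UpgradeCode="PUT-GUID-HERE">
    <Package InstallerVersion="200" Compressed="yes" InstallScope="perMachine" />
    <MediaTemplate EmbedCab="yes" />
    <Directory Id="TARGETDIR" Name="SourceDir">
      <Directory Id="ProgramFilesFolder">
        <Directory Id="INSTALLFOLDER" Name="MyApp">
          <Component Id="MainExe" Guid="PUT-GUID-HERE">
            <File Source="MyApp.exe" />
          </Component>
        </Directory>
      </Directory>
    </Directory>
    <Feature Id="Main" Level="1">
      <ComponentRef Id="MainExe" />
    </Feature>
  </Product>
</Wix>
```

Compile with `candle.exe` and link with `light.exe`, and the .wxs checks into source control alongside the code it installs, which is the whole appeal.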
|
# ¿ Jan 29, 2015 17:55 |
|
Tab8715 posted:The gently caress? Was this an official Microsoft step? As a troubleshooting step this is fine. It tells you IF the firewall is a factor. You can then choose to fix the issue with the firewall since you now know it's a firewall issue. Pretty typical process-of-elimination troubleshooting in my book.
|
# ¿ Jan 29, 2015 23:35 |
|
5er posted:I'm way ahead of you, and the wisdom you speak is something I have to dispense almost daily to others. I just wanted to know if the problem as I described it is fixable, because some dumb gently caress is going to break things along the lines I described and I might get looked at to un-gently caress it up. Ideally the system drive should be mirrored (at the hardware level), then build your Storage Space with all of the other drives using at least a parity space, and replace drives as needed there. Don't use anything from the OS disk in the storage space.
|
# ¿ Feb 6, 2015 04:02 |
|
alanthecat posted:Interesting. What does it do that Group Policy shouldn't do? (i.e. why aren't Microsoft baking those features into GP?) Application/system component installations. For instance, you could have a DSC policy that says that ServerB is a MyApp web frontend. Apply the policy and PowerShell can install all of the prerequisites, install and configure IIS and MyApp, and even add it to the load balancer if you code it up that way.
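A stripped-down sketch of that DSC policy. The IIS part is standard DSC; everything "MyApp" is hypothetical:

```powershell
# Sketch - 'ServerB' and the MyApp bits are illustrative.
Configuration MyAppFrontend {
    Import-DscResource -ModuleName PSDesiredStateConfiguration
    Node 'ServerB' {
        WindowsFeature IIS {
            Name   = 'Web-Server'
            Ensure = 'Present'
        }
        WindowsFeature AspNet45 {
            Name   = 'Web-Asp-Net45'
            Ensure = 'Present'
        }
        # Installing/configuring MyApp itself would be Package and
        # Script resources; the load-balancer step would be a Script
        # resource calling whatever API your LB exposes.
    }
}

MyAppFrontend                                        # compiles a MOF into .\MyAppFrontend
Start-DscConfiguration -Path .\MyAppFrontend -Wait -Verbose
```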
|
# ¿ Feb 12, 2015 23:59 |
|
Hadlock posted:If I want to do two domains on the same LAN I need to setup each device manually for each device that is domain joined right? And then DHCP for IoT that don't care about domain and just want to talk to the internet? If you configure DD-WRT to forward DNS requests for specific domains to the DCs for those domains, you'll be fine. This basically involves setting up Stub records in the DD-WRT named config. No need for manual config on the domain joined devices, DNS will handle it just like it's supposed to.
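Concretely, DD-WRT's DNS is dnsmasq, so the "forward this domain to that DC" bit is a couple of `server=/domain/ip` lines in the Additional DNSMasq Options box. Domain names and DC IPs below are illustrative:

```
# dnsmasq: send each AD domain's queries to that domain's DC,
# everything else resolves normally out to the internet.
server=/corp-a.example.com/192.168.1.10
server=/corp-b.example.com/192.168.1.20
```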
|
# ¿ Feb 16, 2015 17:21 |
|
Hadlock posted:sounds like I need to research DNS further. This is never a bad idea. Everything today is heavily reliant on DNS, and with IPv6 "Coming Soon™" you'll never be able to memorize all of the important IP addresses anymore, so DNS is a foundation technology that you should have a really solid understanding of.
|
# ¿ Feb 16, 2015 18:56 |
|
Cosmic D posted:Also they still think it's fine to ask for a user's password so they don't have to go deskside and setup everything. Being new to the industry, is this acceptable business practice? This teaches users it's OK to give up all of their usernames, passwords, social security numbers, small children, etc. to anyone who purports to be "The Helpdesk". Like those guys who cold call from "Microsoft" because "Your computer has a virus! We're here to help fix it! Because we're Microsoft!" Don't train idiot users to be bigger idiots!
|
# ¿ Feb 18, 2015 03:46 |
|
ghostinmyshell posted:Does anyone know if this live event was recorded and put up somewhere to view? I had to miss it I too had to miss it and was hoping to see a cool email in my inbox with a recording of the session. Hasn't shown up yet, so I'm guessing it's probably not going to happen. I would really love one though.
|
# ¿ Mar 2, 2015 23:39 |
|
BaseballPCHiker posted:Anyone ever have any experience trying to deploy a vbscript through sccm? I've done plenty of batch files when needed but this is my first vbs I've gotten stuck with. The company uses this ancient script to pull info from AD and create an email signature that saves locally in users Outlook profiles. I tried in vain to explain how a hub transport rule would be much more simple, elegant and thorough with no luck. If I run cscript.exe //nologo \\server\share\sig.vbs and deploy it as mandatory it says that it runs successfully but I dont actually see the signature applied. I know the script works if I run it manually. Really want to re-do this thing in powershell or use the transport rule but cant convince the higher ups to let me spend the time to do it. Is the package running interactively as the logged-on user? If not, System has an awesome sig everywhere on your network! Remember that the SCCM agent runs as System and by default runs all packages within its own context, NOT the user's session.
|
# ¿ Mar 18, 2015 18:29 |
|
Hadlock posted:Am I completely crazy here to want to run these 70% of machines headless? What can I do or say to explain the benefits here? Lots of resistance since this is not "the way we've always done it". The goal is 100% headless. Start with the 70% and then work towards eliminating apps that need to be logged in on the console to function.
|
# ¿ Mar 25, 2015 18:11 |
|
mayodreams posted:The big sticking point for us is that 90% of our user population is still on XP and a lot of these 'apps' are only 16/32 bit so we expect a lot of pain with the 8.1 migration this year. Any pain on XP can easily be mitigated by $10,000/incident support calls! Rack up a few of those and the finance guys will untwist a lot of panties for you in a drat hurry.
|
# ¿ Mar 25, 2015 21:06 |
|
FISHMANPET posted:I'm gonna post this in the Storage thread too, but has anyone seen problems with slow storage performance on Server 2012 R2? I've got an open case with Microsoft but we're a month in and still seem to just be flailing randomly at even identifying a problem. I've heard mumblings of others having problems, but wondering if anyone has noticed anything. what kind of storage and what kind of slowness are we looking at?
|
# ¿ Apr 13, 2015 20:34 |
|
Tony Montana posted:This is Server 2012. So the fsmo seize worked but AD is broken and it's acting as if it's not a domain controller. Now Server 2012 through the Generation ID attrib which is exposed in VMWare (5.5 is our prod) knows that it's been cloned and it's not the VM it was. This kicks off the 'Directory Security' features which were recently introduced, but that shouldn't prevent it operating as a DC. However I'm getting some errors in the DNS logs about AD being broken so DNS can't start properly and AD logs about DNS being broken so it can't start properly. Keep calm and ignore it for a half hour or so. AD does this crazy thing where it tries to do an initial sync from another DC after it boots, before it starts acting as a DC. Add in the fact that DNS won't start until AD does and you're in for a good time! After about half an hour AD gives up trying the initial sync and starts, allowing DNS to start up as well. Clean out the rest of the domain controllers in your "new" lab domain and you should be able to avoid the start-up delay in the future.
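If you'd rather not wait out the timeout every boot in a lab, there's a documented NTDS registry value that tells a DC to skip the initial-sync requirement. Lab use only; leave this alone in production:

```powershell
# Sketch - disables the initial replication sync requirement on boot.
# Appropriate for an isolated lab DC, NOT a production domain.
Set-ItemProperty `
    -Path 'HKLM:\SYSTEM\CurrentControlSet\Services\NTDS\Parameters' `
    -Name 'Repl Perform Initial Synchronizations' -Value 0 -Type DWord

Restart-Service NTDS -Force   # or just reboot the DC
```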
|
# ¿ Apr 15, 2015 14:49 |
|
BaseballPCHiker posted:Yeah going from SP1 to R2 looks relatively harmless. In any event what I'm ending up doing is setting up a new VM with more resources thrown at it then the old server and installing SCCM 2012 R2 CU4 on it. Once I get MDT and WAIK, WSUS and eveything else setup then I'll start removing roles from the current primary site and adding them to the new one. If I'm reading all of the technet articles correctly I can then create a boundary group and add the second new server. Once all of the roles are reassigned to the new server and the clients are pointing there I can decommission the old one. I hope it works like that at least. You're making things a little harder on yourself than needed. Stand up a brand new site (NEW SITE CODE!!) with all the roles it needs, all configured and happy. Nuke the boundary groups on the old site and add them to the new site. Then do a client push forcing repair/reinstall of the client. If you want to slow-roll it, you can "move" only some of the boundary groups. Most important is to never allow the boundaries between sites to overlap; the agents get all pissy when that happens. I definitely recommend "green field" upgrades for the most part on SCCM. The exception is that you may want to use the migration tool to bring over any application packages, OSD task sequences, images, etc. you have created.
|
# ¿ Apr 22, 2015 15:08 |
|
Sheep posted:I'm still trying to figure out what to do for our company since we don't have Azure and would love some ideas. I still really like DirectAccess on Server 2012 R2 with Windows 8.1 clients. For the overwhelming majority of the time, it just works. Being able to sit down with any internet connection that allows HTTPS outbound and just be on the internal network as if I was in the office is priceless. We still have AnyConnect through our ASA for backup traditional VPN in case the DA box goes down (I didn't bother to cluster it since we're small and it's not critical for the majority of our users) but I haven't had to use that in probably a year or more. I can talk to servers as if I was in the office, I get patches and software deployments, and I can change my AD password. It all just works for us.
|
# ¿ Apr 30, 2015 16:37 |
|
Tab8715 posted:I'm setting up a Sharepoint farm but I'm getting stuck with this user that's a domain admin and needs permissions to create computer objects. In ADUC I'm selecting advanced, user properties, security, advanced. In the Permission Entry I'm selecting add, choosing the same user but there isn't an option for Create Computer Objects. I think you're looking for delegation, not security. Right-click an OU (or the domain itself), choose Delegate Control, and walk through the wizard.
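The same delegation can be done from the command line with `dsacls` if you'd rather script it. OU path and account below are illustrative; CCDC grants create/delete of child objects of the named class:

```powershell
# Sketch - grants EXAMPLE\svc-sharepoint the right to create and delete
# computer objects under the OU, inherited down the tree (/I:T).
dsacls "OU=Workstations,DC=example,DC=com" /I:T `
    /G "EXAMPLE\svc-sharepoint:CCDC;computer"
```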
|
# ¿ Apr 30, 2015 18:10 |
|
stevewm posted:Maybe someone has an idea how to achieve this here... Typically the app needs elevation because it is writing to a specific protected location (in this case probably Program Files). Grant the local Users group full control over the application's folder and see if that removes the need for elevation when updating this app.
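That grant is one `icacls` line. The path is illustrative; (OI)(CI) makes the grant inherit to files and subfolders, and F is full control per the suggestion above (M for modify would be the more conservative choice):

```powershell
# Sketch - give local Users inherited full control of the app folder
# so its self-updater stops demanding elevation.
icacls "C:\Program Files\LegacyApp" /grant "BUILTIN\Users:(OI)(CI)F" /T
```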
|
# ¿ Jul 11, 2015 05:18 |
|
Gerdalti posted:I want to learn SCCM (and later, SCOM). Where should I start? I don't seem to be able to focus on just browsing the info on Technet (it's organized poorly IMO). Along with the other good info above, check out MyITForum.com, specifically their SCCM email list. Really great information goes through there and it's populated by a ton of SCCM MVPs. In general the SCCM community is massive and generally very helpful. The most important thing to know about SCCM is that the speed at which it operates is inversely proportional to how much you want something to happen. Trying to push that critical 0-day patch to everyone in the company? It's going to take its sweet, sweet time. Accidentally deploy a Windows 7 image to all servers? Lightning fast! SCOM is a bit more difficult to find great info on. Lots of blog articles on specific stuff but no great centralized location for good info, and the community is pretty weak as it's largely corporate guys who do SCOM as a part of their normal job rather than as a specialty. Set it up, monitor some stuff, and fiddle with it until you "get" how the classes, discoveries, monitors, and such work, then start doing some low-level MP authoring to ensure your liver is properly damaged.
|
# ¿ Jul 23, 2015 18:15 |
|
GreenNight posted:Makes sense. None of the updated imaging tools are out yet for Windows 10. I thought I saw an MDT release at a minimum. SCCM 2016 should be dropping pretty quickly. Last I heard the plan was within 90 days of the Windows 10 launch. Edit: I'm terrible. It was the preview of MDT that had Windows 10 (LTI only) support.
|
# ¿ Aug 4, 2015 20:38 |
|
Potato Salad posted:With this thread covering enterprise topics, is it the de-facto destination for SCCM discussion? The application deployment evaluation cycle is what triggers application deployments and it runs on a different schedule without tying into winlogon (that doesn't answer the why part...Microsoft?). Unless you absolutely have to have it run at Winlogon, use an app. For your own sanity's sake and all that is frigging sacred.. USE APPS!! Unless you have a VERY compelling reason (reason.. not excuse) not to. I'm willing to bet it was a feature they felt was either unnecessary or they were unable to get it in before their deadlines (To ship is to Choose).
|
# ¿ Aug 6, 2015 17:42 |
|
Moey posted:Speaking of patching, is there a good solution that you folks advise for managing patches on remote computers? For in house stuff, WSUS fits my needs, but we have a ton of laptops floating around that do not connect back to our network too often. SCCM with Internet-Based Client Management, with bonus features for everything else internal and external. Or... Intune, I guess, for MS solutions.
|
# ¿ Aug 11, 2015 20:06 |
|
johnnyonetime posted:Here's a great article on how to encrypt the drive during OSD. After thinking about it I think this might be what you are asking for: Just a note here: this significantly increases the time it takes to complete an OSD, BUT your drive is already encrypted. Not an issue per se, but something to at least consider/be cognizant of.
|
# ¿ Aug 22, 2015 19:30 |
|
Walked posted:I'm a non-DBA helping figure out some DBA type tasks. Any decent general system monitoring solution should do at least most of this. Assuming MSSQL and a Windows environment, take a look at System Center Operations Manager (SCOM), specifically the SQL Management Pack, for health monitoring as well as some general performance monitoring. You can also add pretty much anything else you might want to monitor into it (additional perf counters or logging wait types, for instance).
|
# ¿ Aug 31, 2015 23:17 |
|
GreenNight posted:I just wish you can download CU via Windows Update instead of RSS'ing this dudes site. You mean like the entire rest of the System Center suite? Me too. SCCM has to be extra special, though, and do their own poo poo on their own schedule. Much like the software itself (SMS = Slow Moving Software). In fact I always say that the speed of any process in SCCM is inversely proportional to how much you want it to happen. Trying to push that critical 0-day patch to every machine in the enterprise? SLLLLOOOOWWWWW! Accidentally deploy Windows 10 to "All Systems"? poo poo poo poo poo poo, it already happened. I too have a love/hate relationship with SCCM. Infinite Cosmic Power! Enormous Pain in the rear end!
|
# ¿ Oct 23, 2015 14:29 |
|
Walked posted:two systems that have had this in common is they were both SCCM clients, both inactive state, and both disconnected from the domain for 60days (yay developers hoarding old hardware "just in case") Use the maintenance task to nuke the client flag on crap that hasn't sent a heartbeat in a couple of weeks (beyond the max vacation time), and the limits on AD discovery to not discover machines that haven't logged in for a similar period of time. Keeps your deployments and crap a LOT cleaner. Machines that magically return will re-register and continue on like normal, so no major issues there (unless you're using a lot of packages and aren't handling whether things are installed or not in some fashion. Don't use packages if you can do it as an app).
|
# ¿ Dec 3, 2015 18:53 |
|
FISHMANPET posted:Can anyone explain to me what Orchestrator is for? Not what it does, I get that, but really, [i]what is it for?[/i] You understand it completely. The vision is drag-and-drop automation with Integration Packs providing all of the actions you need. The reality is, as you have already said, a tool to string together PowerShell scripts. Granted, there are some benefits in having Orchestrator as the launch platform for them, such as allowing unprivileged users to do tasks that would require elevated privileges, or using it as an integration piece with Service Manager (building workflows into SCSM MPs is horrific compared to simply connecting up a runbook activity).
|
# ¿ Dec 12, 2015 21:07 |
|
NevergirlsOFFICIAL posted:I used this regularly and it's great but I never scripted it, just used gui Page 41 of their user guide is "Automating Enterprise Migrations", so it looks like a winner.
|
# ¿ Jan 8, 2016 02:24 |
|
skipdogg posted:No, but it's still generally accepted best practice, and Microsoft recommended to only use Office 64bit if you have an extremely good reason to (Excel datasheets of massive sizes). Otherwise 32bit Office Apps are still recommended. Too many legacy plugins, code, and other bullshit. Just because something is more bits doesn't make it better. Without crazies running 64-bit, there would be no pressure on the plugin/code/bullshit providers to update to support 64-bit. That said, I'm giving in and ripping out Office 64-bit and installing 32-bit on my laptop, at least until my next rebuild.
|
# ¿ Feb 2, 2016 19:41 |
|
skipdogg posted:Smaller shops that don't have KMS and the like setup would have to install and activate everything manually. Or at least script it anyway. Before I set up MDT, I had a big post-image script that did all sorts of things for my environment. MAK keys work just fine in Office and can auto-activate with no problem. The key just gets baked into the Office install MSP.
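If you'd rather script it after install instead of baking it into the MSP, Office ships with ospp.vbs for exactly this. The key below is a placeholder and the path varies by Office version/bitness:

```powershell
# Sketch - inject a MAK and activate post-install (Office 2013 x86 path shown;
# XXXXX... is a placeholder, not a real key).
cd "C:\Program Files (x86)\Microsoft Office\Office15"
cscript ospp.vbs /inpkey:XXXXX-XXXXX-XXXXX-XXXXX-XXXXX
cscript ospp.vbs /act
```

The MSP route (setup.exe /admin) is cleaner for imaging since there's nothing to run afterward.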
|
# ¿ Feb 3, 2016 00:15 |
|
LmaoTheKid posted:Would I be better off adding an existing AD setup to my domain from another company with 5 or 6 employees or just starting over with them on our current setup? You can do this with ADMT but it might be more effort than it's worth. With ADMT you would build a trust between the two forests, use ADMT to build the AD accounts, and "migrate" the workstations. The workstation migration piece will flip the domain and re-ACL/re-point the profiles to the new SIDs. It's not a quick and easy thing to do. With 5-6 users you're probably better off issuing a new machine and copying profile contents from the old machine to the new.
|
# ¿ May 3, 2016 20:12 |
|
LmaoTheKid posted:Instead of a new machine, anything I should look out for by leaving the old domain, joining the new one, and copying over the profile locally? Not really. The only annoyance here is the possibility of wonky settings from old GPOs (there probably aren't any with this few users) and the fact that there is no really great backout if things go pear-shaped. Granted, you should be able to push forward and recover without too much pain in this case.
|
# ¿ May 3, 2016 22:21 |
|
|
Docjowles posted:I'd also settle for a way to do it via Group Policy if it's not possible to script. Anything but clicking around the GUI every time I bring up a new server. I'm sure it's scriptable, but I usually just use GPOs to handle certificate auto-enrollment. Bonus: it will automagically handle renewals as well. http://windowsitpro.com/security/setting-certificate-autoenrollment-feature-windows-public-key-infrastructure You can also handle all of the WinRM stuff for PowerShell Remoting (I assume that's your actual end goal here) via GPO.
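If you do want to script the enrollment side on 2012+, the PKI module can request a machine cert directly. The template name is illustrative and this assumes an AD CS issuing CA is reachable:

```powershell
# Sketch - request a machine certificate from an assumed "WebServer"
# template into the local machine store.
Get-Certificate -Template "WebServer" `
    -CertStoreLocation Cert:\LocalMachine\My

# And if PS Remoting is the end goal, the non-GPO per-box equivalent:
Enable-PSRemoting -Force
```

The GPO route is still less work at scale since auto-enrollment handles renewals for you.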
|
# ¿ Jun 10, 2016 14:15 |