Walked
Apr 14, 2003

Is this possible?

I've got a script that requires our domain admin account to run. So right-click, run as administrator, good to go.

Then I've got it launching an outlook window and pre-populating the mail message with the results of the report.

Problem is, I can't get them to work together. If I do them together, either the administrative checks don't run (needs admin privileges) or the Outlook application won't launch (no Outlook profile for the domain admin user).

Is there a way to tell it to execute a code block as the currently logged in user on a machine? I can't find much.

Victor
Jun 18, 2004
Walked, you might look for PowerShell equivalents for runas, or just use runas. (run as)

adaz
Mar 7, 2009

Walked posted:

Is this possible?

I've got a script that requires our domain admin account to run. So right-click, run as administrator, good to go.

Then I've got it launching an outlook window and pre-populating the mail message with the results of the report.

Problem is, I can't get them to work together. If I do them together, either the administrative checks don't run (needs admin privileges) or the Outlook application won't launch (no Outlook profile for the domain admin user).

Is there a way to tell it to execute a code block as the currently logged in user on a machine? I can't find much.

Outlook is such a bitch to work with and I've run into similar problems to the one you're describing. The issue is the profile problem, and I'll try not to rage too much here, but this is so annoying to work around. There is a way in the ComObject to specify a different username/password, but I've never been able to get it to work (probably because we're running on a different domain in a different forest than our Exchange domain). You might want to check into that.

So your options are to

1.) use system.net.mail to send your email through a friendly smtp server bypassing outlook altogether (see the sketch after this list)

2.) Check in the outlook object for how to log on as a different user (http://msdn.microsoft.com/en-us/library/bb208225%28v=office.12%29.aspx)

3.) use the old runas.exe

code:
& 'C:\windows\system32\runas.exe' /user:mydomain\myusername "C:\Program Files\Microsoft Office\Office12\Outlook.Exe"
4.) use exchange web services which I have very little experience with other than briefly playing around for a few hours one bored day.
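
For option 1, a minimal System.Net.Mail sketch (the server name, addresses, and $reportBody below are placeholders, not anything from the thread):

code:
$msg  = New-Object System.Net.Mail.MailMessage("reports@mydomain.com", "you@mydomain.com", "Report results", $reportBody)
$smtp = New-Object System.Net.Mail.SmtpClient("smtp.mydomain.com")
$smtp.Send($msg)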

adaz fucked around with this message at 22:09 on Oct 22, 2010

Walked
Apr 14, 2003

adaz posted:

Outlook is such a bitch to work with and I've run into similar problems to the one you're describing. The issue is the profile problem, and I'll try not to rage too much here, but this is so annoying to work around. There is a way in the ComObject to specify a different username/password, but I've never been able to get it to work (probably because we're running on a different domain in a different forest than our Exchange domain). You might want to check into that.

So your options are to

1.) use system.net.mail to send your email through a friendly smtp server bypassing outlook altogether

2.) Check in the outlook object for how to log on as a different user (http://msdn.microsoft.com/en-us/library/bb208225%28v=office.12%29.aspx)

3.) use the old runas.exe

code:
& 'C:\windows\system32\runas.exe' /user:mydomain\myusername "C:\Program Files\Microsoft Office\Office12\Outlook.Exe"
4.) use exchange web services which I have very little experience with other than briefly playing around for a few hours one bored day.

SMTP isn't an option; the user needs a chance to review the message before it fires off.

gently caress it. Not that big a deal.
Question: Would a Start-Process from PowerShell 2.0 work, if I kicked off a secondary script to do the Outlook tasks? Such a bad workaround though.

adaz
Mar 7, 2009

Walked posted:

SMTP isn't an option; the user needs a chance to review the message before it fires off.

gently caress it. Not that big a deal.
Question: Would a Start-Process from PowerShell 2.0 work, if I kicked off a secondary script to do the Outlook tasks? Such a bad workaround though.

Well, something like this should work if that's the case (haven't tested it; replace OutlookProfile with the name of the profile you want to log in to):

code:
$cred = Get-Credential
Start-Process outlook -Credential $cred                     # launch Outlook under the supplied credentials
$OutlookApp = New-Object -ComObject Outlook.Application     # get an Outlook COM object (attaches to the running instance)
$OutlookMapiNamespace = $OutlookApp.GetNamespace("MAPI")
$OutlookMapiNamespace.Logon("OutlookProfile")               # log on to the named MAPI profile

Walked
Apr 14, 2003

adaz posted:

Well, something like this should work if that's the case (haven't tested it; replace OutlookProfile with the name of the profile you want to log in to):

code:
$cred = Get-Credential
Start-Process outlook -Credential $cred                     # launch Outlook under the supplied credentials
$OutlookApp = New-Object -ComObject Outlook.Application     # get an Outlook COM object (attaches to the running instance)
$OutlookMapiNamespace = $OutlookApp.GetNamespace("MAPI")
$OutlookMapiNamespace.Logon("OutlookProfile")               # log on to the named MAPI profile

Probably; I'm trying to avoid prompting for credentials, since we use smart card authentication for regular user logins and I'm trying to keep it as streamlined as possible.

But we'll see. Toying with it.

Dessert Rose
May 17, 2004

awoken in control of a lucid deep dream...
I'm trying to replace some text in a file, and Powershell is thwarting my attempts to make it useful.

Basically I have a .prf exported from the Office Customization Tool, and I want to replace the account name with whatever the user puts in. According to everyone on the internet, I want to do some variant of:

code:
(Get-Content input.prf) | foreach {$_ -replace "LOGIN", "BBQ"} | Set-Content new.prf
but it does worse than not working - it mangles the file and doesn't actually replace what I'm trying to replace.

In the case of the above code, it destroys line endings horribly. I also tried:

code:
$what = Get-Content input.prf
$what -replace "LOGIN", "BBQ" > new.prf
but that not only doesn't replace the login text but intersperses NULLs between every character in the output file.

Powershell can't be this incompetent at working with text files. What am I missing? I'm about to punt and just do it with C#.

Victor
Jun 18, 2004
Ryouga Inverse, try this:
code:
(Get-Content input.prf) -replace "LOGIN", "BBQ" | Set-Content -Path new.prf
I'm not sure why there is no PoSH version of sed — the GnuWin32 version switches your documents to Unix-style \n line endings.

Your > new.prf gave you a bunch of NULLs because the > redirection writes Unicode (UTF-16) by default; look at Out-File and its -Encoding parameter if you need something else.
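
If you do want redirection-style output, being explicit about the encoding sidesteps the problem; a sketch:

code:
(Get-Content input.prf) -replace "LOGIN", "BBQ" | Out-File new.prf -Encoding ascii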

necrobobsledder
Mar 21, 2005
Lay down your soul to the gods rock 'n roll
Nap Ghost
Anyone familiar with PowerWF Studio, or deployed it in their organization? I'm trying to understand how I can build distributed applications basically using PowerShell scripts as atomic functions, and am trying to see if there's a possibility of creating a poor man's memcached or Redis through a platform utilizing just PowerShell scripts. I'd hate having to write C# to do stuff that's so... mundane in a full-featured professional developer's language, like I have to for many sysadmin tasks, so it'd be nice to be able to leverage PowerShell scripts to scale out to hundreds of scripts/second with some concurrency handling. I don't think I could get sign-off on deploying Hadoop + Zookeeper in random enterprise environments just to handle millions of entries and run a lot of tasks, nor do I want to recommend deploying software I know is poo poo (which is standard practice for automation in these organizations).

Walked
Apr 14, 2003

Anyone worked with WMI much?

code:
$m = Get-WmiObject Win32_OperatingSystem
$m.FreePhysicalMemory

#dostuff ................

$m.FreePhysicalMemory

Free physical memory never updates unless I create a new object:

code:
$m = Get-WmiObject Win32_OperatingSystem
$m.FreePhysicalMemory

#dostuff ................


$m = Get-WmiObject Win32_OperatingSystem
$m.FreePhysicalMemory

Any way to call an update on the $m object instead? I have this same problem working with Win32_Service: I can stop a service, do a $service.Status, and it'll show running until I create a new object.

Mildly frustrating to work with certain WMI objects this way; is creating a new object the only way?


edit:
Of course, this works - but gently caress me if it ain't a bit clunky:
code:
(Get-WmiObject Win32_OperatingSystem).FreePhysicalMemory

Walked fucked around with this message at 21:51 on Oct 29, 2010

CISADMIN PRIVILEGE
Aug 15, 2004

optimized multichannel
campaigns to drive
demand and increase
brand engagement
across web, mobile,
and social touchpoints,
bitch!
:yaycloud::smithcloud:
I'm trying to create a script which, when invoked, will recursively delete all files older than 5 days from a folder, then recursively copy all files < 3 days old into that folder. Unfortunately, while I can list an appropriate set of files with:

code:
$DateToCompare = (Get-date).AddDays(-3)
Get-Childitem -recurse | where-object {$_.lastwritetime -gt $DateToCompare}
I can't seem to get it to copy the same set of files. I've tried a few variations on the following. It doesn't look like I'm properly referencing the objects for the items, but I'm not quite sure of the syntax to do what I want with the files.

code:
$DateToCompare = (Get-date).AddDays(-3)
copy-item c:\users\xxx\source c:\users\xxx\powershelltest -recurse | where-object {$_.creationtime -gt $DateToCompare}

adaz
Mar 7, 2009

Walked posted:

wmi stuff

Well, $m.psbase.Get() will refresh those values, if that's what you're looking for. Otherwise you can check out the System.Diagnostics namespace; it might have more of what you're after.
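
A small sketch of the refresh in context (Get() re-reads the instance from WMI in place):

code:
$m = Get-WmiObject Win32_OperatingSystem
$m.FreePhysicalMemory

# ... do stuff ...

$m.psbase.Get()        # refresh the existing object instead of creating a new one
$m.FreePhysicalMemory  # now reflects the current value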

adaz
Mar 7, 2009

bob arctor posted:

I'm trying to create a script which, when invoked, will recursively delete all files older than 5 days from a folder, then recursively copy all files < 3 days old into that folder. Unfortunately, while I can list an appropriate set of files with:

code:
$DateToCompare = (Get-date).AddDays(-3)
Get-Childitem -recurse | where-object {$_.lastwritetime -gt $DateToCompare}
I can't seem to get it to copy the same set of files. I've tried a few variations on the following. It doesn't look like I'm properly referencing the objects for the items, but I'm not quite sure of the syntax to do what I want with the files.

code:
$DateToCompare = (Get-date).AddDays(-3)
copy-item c:\users\xxx\source c:\users\xxx\powershelltest -recurse | where-object {$_.creationtime -gt $DateToCompare}

E: the first solution wouldn't work; it only copied directories. THIS will work though.

code:
$DateToCompare = (Get-date).AddDays(-3)
Get-ChildItem -Path c:\users\xxx\source -Recurse | Where-Object { $_.CreationTime -gt $DateToCompare } | Copy-Item -Destination c:\users\xxx\powershelltest

adaz fucked around with this message at 06:06 on Oct 30, 2010

Walked
Apr 14, 2003

adaz posted:

Well, $m.psbase.Get() will refresh those values, if that's what you're looking for. Otherwise you can check out the System.Diagnostics namespace; it might have more of what you're after.

$m.psbase.get() did exactly what I wanted. Thank you.

I appreciate it; though I'll poke around with System.Diagnostics - the WMI objects allow me to connect to remote hosts very easily, which is really, really handy.
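
For reference, the remote part is just a -ComputerName away (the hostname below is a placeholder):

code:
Get-WmiObject Win32_OperatingSystem -ComputerName SERVER01 | Select-Object CSName, FreePhysicalMemory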

Muslim Wookie
Jul 6, 2005
I have what I would guess to be an ultra-easy Powershell question that I'm just slammed for time on so I'm asking instead of trying to work it out myself...

I need to get a count of all sub-subdirectories in a folder and no files. I.e.

C:\Temp
-> Folder1
--> FolderA
--> FolderB
--> FolderC
-> Folder2
-> Folder3
--> FolderD

The total would be 4: it's counting FolderA, FolderB, FolderC, and FolderD while ignoring the folders above them and any files encountered.

Victor
Jun 18, 2004
Get-ChildItem -Recurse | ? { $_ -is [System.IO.DirectoryInfo] }

Muslim Wookie
Jul 6, 2005

Victor posted:

Get-ChildItem -Recurse | ? { $_ -is [System.IO.DirectoryInfo] }

Thanks, I'd almost gotten this far myself but this doesn't stop after one level of sub-directories. It recurses all the way down the tree. Is it possible to tell Powershell how many levels of recursion to do?

Either way, this will help heaps; even if I can't necessarily pipe it straight to Measure-Object, the output is still useful.

Victor
Jun 18, 2004
This should do ya:
code:
Get-ChildItem                              | 
    ? { $_ -is [System.IO.DirectoryInfo] } | 
    % { Get-ChildItem $_ }                 | 
    ? { $_ -is [System.IO.DirectoryInfo] }
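If you just want the number (which is what the original question asked for), the same pipeline can feed Measure-Object:

code:
(Get-ChildItem                              |
    ? { $_ -is [System.IO.DirectoryInfo] }  |
    % { Get-ChildItem $_ }                  |
    ? { $_ -is [System.IO.DirectoryInfo] }  |
    Measure-Object).Count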
(To those better at PowerShell, I'd love to know if there are better ways to do the above.)

Muslim Wookie
Jul 6, 2005

Victor posted:

This should do ya:
code:
Get-ChildItem                              | 
    ? { $_ -is [System.IO.DirectoryInfo] } | 
    % { Get-ChildItem $_ }                 | 
    ? { $_ -is [System.IO.DirectoryInfo] }
(To those better at PowerShell, I'd love to know if there are better ways to do the above.)

Mate, you are a lifesaver, that's done the trick nicely. If you have some time, would you mind explaining what each piece is doing? It seems like something I should have been able to quickly get my head around, but for some reason haven't.

I did also find this bit of code elsewhere:

code:
function Get-ChildItemRecurse {
<#
.Synopsis
  Does a recursive search through a PSDrive up to n levels.
.Description
  Does a recursive directory search, allowing the user to specify the number of
  levels to search.
.Parameter path
  The starting path.
.Parameter fileglob
  (optional) the search string for matching files
.Parameter levels
  The number of levels to recurse.
.Example
  # Get up to three levels of files
  PS> Get-ChildItemRecurse *.* -levels 3

.Notes
  NAME:      Get-ChildItemRecurse
  AUTHOR:    tojo2000
#Requires -Version 2.0
#>
  Param([Parameter(Mandatory = $true,
                   ValueFromPipeLine = $false,
                   Position = 0)]
        [string]$path = '.',
        [Parameter(Mandatory = $false,
                   Position = 1,
                   ValueFromPipeLine = $false)]
        [string]$fileglob = '*.*',
        [Parameter(Mandatory = $false,
                   Position = 2,
                   ValueFromPipeLine = $false)]
        [int]$levels = 0)

  if (-not (Test-Path $path)) {
    Write-Error "$path is an invalid path."
    return $false
  }

  $files = @(Get-ChildItem $path)

  foreach ($file in $files) {
    if ($file -like $fileglob) {
      Write-Output $file
    }

  #if ($file.GetType().FullName -eq 'System.IO.DirectoryInfo') {
    if ($file.PSIsContainer) {
      if ($levels -gt 0) {
          Get-ChildItemRecurse -path $file.FullName `
                               -fileglob $fileglob `
                               -levels ($levels - 1)
      }
    }
  }
}

adaz
Mar 7, 2009

Victor posted:

This should do ya:
code:
Get-ChildItem                              | 
    ? { $_ -is [System.IO.DirectoryInfo] } | 
    % { Get-ChildItem $_ }                 | 
    ? { $_ -is [System.IO.DirectoryInfo] }
(To those better at PowerShell, I'd love to know if there are better ways to do the above.)

e2: I was thinking of something like this, but it's not easier than yours and I can't get the drat thing to work right anyway, so you win. I can't really think of an easier way to do it than how you did, to be honest.

quote:

get-childitem "C:\temp" | where-object {$_.psiscontainer -eq $true} | select-object $_.fullname | get-childitem | where-object {$_.psiscontainer -eq $true}

adaz fucked around with this message at 17:29 on Nov 1, 2010

Victor
Jun 18, 2004

marketingman posted:

Mate, you are a lifesaver, that's done the trick nicely. If you have some time, would you mind explaining what each piece is doing? It seems like something I should have been able to quickly get my head around, but for some reason haven't.
code:
Get-ChildItem                              | # get all items in the current directory
    ? { $_ -is [System.IO.DirectoryInfo] } | # filter to the ones that are directories
    % { Get-ChildItem $_ }                 | # get all items under each of those directories
    ? { $_ -is [System.IO.DirectoryInfo] }   # again, filter to all the items that are directories
Remember:
? is the same as Where-Object
% is the same as Foreach-Object

Cronus
Mar 9, 2003

Hello beautiful.
This...is gonna get gross.
What are everyone's opinions on the Quest AD Cmdlets? Link

I know you can just use .NET to access directory services, but for some of the projects I've been building I find it amazing for user manipulation. I wrote a script recently to query AD and check password expiration dates, and then send custom emails based on certain thresholds for remote users who basically just check webmail, and output HTML reports (which management loves).
I've gone all in on Posh unless it really is just much easier doing a task with the standard CLI, but this is rare.

I got a script request that I'm pretty sure is uncharted territory, and I'm wondering if anyone has ideas. Google and MSDN have nothing.

Essentially, one of our engineers lost a user's offline cache during a server migration this weekend, so they want me to make a script that will auto-extract the data in said cache to a temp location, just to play it safe. Now in WinXP this is easy: just use the CSCCMD utility. But in Vista/7, these tools simply don't work. The only other option seems to be using the WMI provider for Offline Files or calling the API directly. I am by no means an expert coder, but I have dabbled in PHP/Perl and SQL; this may be above my skill level.

I essentially want to know if I can enumerate the offline file listing and then build the code to extract said files from the CSC database to another folder, in case this is an issue down the line.

The WMI providers, etc can be found here. But reading through it just seems to show me how to list but not necessarily manipulate the files in there. Anyone got any ideas or breadcrumbs on where I should go with this?

adaz
Mar 7, 2009

I don't mind Quest's AD cmdlets, but I find it easier to just write my own; also, in a large environment you can't necessarily be sure those cmdlets will be installed on every computer. They definitely save time, though, if you're only worried about working from your own PC, or you know your scripts will always have access to them.

As far as your problem goes, I haven't ever really touched offline files and I'm not running Windows 7 at the moment, so I can't test anything. However, does this Scripting Guys article on working with them help you at all as far as breadcrumbs go? Hopefully someone else who has worked with them before can help you.

http://blogs.technet.com/b/heyscriptingguy/archive/2009/06/02/how-can-i-work-with-the-offline-files-feature-in-windows.aspx
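
For the enumeration half, the Offline Files WMI provider can at least be poked at from PowerShell. A rough sketch; I haven't worked with these classes either, so treat the class name as an assumption to verify against that article:

code:
# See which Offline Files classes the WMI provider exposes (Vista/7 only)
Get-WmiObject -List Win32_OfflineFiles*

# Enumerate cached items and dump their properties to see what's recoverable
Get-WmiObject Win32_OfflineFilesItem | Select-Object -Property * -First 5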

adaz fucked around with this message at 22:19 on Nov 10, 2010

Z-Bo
Jul 2, 2005
more like z-butt
I have an idea and want to throw it out there to get feedback from goons.

What do you all think about the idea of writing a PowerShell snap-in that can automatically report on stuff like licenses and installed software on a machine?

It seems with WMI we can pretty much automate what a free solution like LANSweeper does, without requiring a client-side executable for it to work. PowerShell would simply use the new remoting features in 2.0 and you'd automatically get ops reports, centralized into perhaps a SQL Server database so we could use SQL Server Reporting Services for building dashboards like LANSweeper has.
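
A rough sketch of what the collection side could look like with plain WMI (paths and file names here are made up; Win32_Product is slow, but it keeps the example short):

code:
# One hostname per line (placeholder path)
$computers = Get-Content C:\temp\computers.txt

$inventory = foreach ($c in $computers) {
    Get-WmiObject Win32_Product -ComputerName $c |
        Select-Object @{ Name = 'Computer'; Expression = { $c } }, Name, Version
}

# Dump to CSV; from here it could be bulk-loaded into SQL Server for reporting
$inventory | Export-Csv C:\temp\software_inventory.csv -NoTypeInformation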

adaz
Mar 7, 2009

Z-Bo posted:

I have an idea and want to throw it out there to get feedback from goons.

What do you all think about the idea of writing a PowerShell snap-in that can automatically report on stuff like licenses and installed software on a machine?

It seems with WMI we can pretty much automate what a free solution like LANSweeper does, without requiring a client-side executable for it to work. PowerShell would simply use the new remoting features in 2.0 and you'd automatically get ops reports, centralized into perhaps a SQL Server database so we could use SQL Server Reporting Services for building dashboards like LANSweeper has.

That would most definitely work and be free. I'm not sure how fast it would be, but it's something that would probably only need to run once a day.

Sagacity
May 2, 2003
Hopefully my epitaph will be funnier than my custom title.
I was trying to use PowerShell to automate deployments. One of the things it had to do was recursively delete a folder and its contents. It randomly would leave dangling files.

The reason? Well, a bug in PowerShell of course!

It's been known to Microsoft for quite some time, and they even explicitly mention it in the help for Remove-Item!

I must say the rest of PowerShell works quite well, but I'm amazed by the fact that such a bug is in there and acknowledged (but not fixed) like that.
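
For what it's worth, one common workaround (the one the Remove-Item help points at) is to let Get-ChildItem do the recursion and feed Remove-Item deepest-first; a sketch with a made-up path:

code:
# Delete the contents deepest-first so each directory is empty by the time it's removed
Get-ChildItem C:\deploy\staging -Recurse |
    Sort-Object { $_.FullName.Length } -Descending |
    Remove-Item -Force
Remove-Item C:\deploy\staging -Force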

skipdogg
Nov 29, 2004
Resident SRT-4 Expert

Adaz and co.

No idea if you still check this thread, but if you could help me out I would appreciate it.

A task I would like to accomplish is this. I have 300 Windows XP computers and I need to change the MTU setting on the ethernet adapter to a value of 1300. I'm guessing powershell can easily help me do this.

I suck at programming. I know what steps I need to accomplish, but stringing them together just blows my mind.

I have a .csv file of the computer names available.

I know I need to do the following, but translating it to code is just :psyboom: for me. To give you an idea of what I need to accomplish:

1: I need to import a computer name to perform additional actions on
2: I need to read the HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows NT\CurrentVersion\NetworkCards key on the computer in step 1. There's a sub entry with a random number, then under that there is a string value called 'ServiceName' with a value of some random id.

example
code:
[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows NT\CurrentVersion\NetworkCards\8]
"ServiceName"="{76373BA6-7201-4780-BEC2-206EFE6F3617}"
"Description"="Intel(R) 82567LM-3 Gigabit Network Connection"
3: I need to somehow single out the value of "ServiceName" into a variable and then have the script connect to this key

HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\services\Tcpip\Parameters\Interfaces\%step2variable%

and after it connects to that key, I need it to write a new value under that key of "MTU" with data of '1300'


A less elegant but just as effective way would be to just apply the MTU of 1300 to every adapter under HKLM\SYSTEM\CurrentControlSet\services\Tcpip\Parameters\Interfaces\, as the machines only have 1 adapter anyway. Not sure how I could do that either.

Any pointers or suggestions to get my going in the right direction would be super helpful.

Cronus
Mar 9, 2003

Hello beautiful.
This...is gonna get gross.
You're probably better off not doing it via the registry (which I find really cumbersome in PShell) and just using WMI.
code:
$data = gwmi -class Win32_NetworkAdapterConfiguration | where {$_.IPEnabled -match "True"}
This will get you the list of NICs that currently have an IP address and save the object into a variable, which is probably going to just be one result. For WinXP there is a SetMTU method you can apply, so...
code:
$data.SetMTU(1300)
If there are multiple results, you'll want to do a foreach-object loop to set the value.

That should do it. Documentation on the method is here in case you are curious. It seems this method isn't supported on Windows 2003, but it says nothing about XP. I would definitely test the method on a VM first; the status codes are in that document too, so you can see whether it worked or not.

Also if you're ever curious about what methods are available to you, just pipe your object/command to:
code:
get-member -membertype method
and you'll get the goodies.

Hope this helps!

adaz
Mar 7, 2009

Cronus is quite right, that's the best method of doing it. To load the values from your CSV file, assuming you have the computer names in a column called computers, use something like this:

code:
$computers = Import-Csv C:\temp\locationofcsv.csv
foreach ($computer in $computers) {
    # do what Cronus said, but against the remote machine named in this row
    $data = gwmi -Class Win32_NetworkAdapterConfiguration -ComputerName $computer.computers |
        where { $_.IPEnabled -match "True" }
    $data.SetMTU(1300)
}
When iterating through the CSV with foreach, remember you're accessing each "row" in the CSV; you then use the column header to access the data contained in each column of that row. You could add some fancy stuff to ping them first if they might be down, up to you!

CuddleChunks
Sep 18, 2004

This is a very handy thread, thanks for making it! I just started learning Powershell a bit to automate some tasks at home. After reading the OP and some of the other examples in here I'm going to make some modifications to my script and then post it for all to laugh at.

Thanks for the cool info, powershell has been lots of fun to work with.

CuddleChunks
Sep 18, 2004

How do I properly use command-line arguments in a function? I would like to be fancy and do some error checking and whining at the user if they fail to give me the right parameters.
code:
Example command line: 
.\scriptname "c:\test" "d:\output"

Inside the file: 
function Fun {
	param(
	[parameter(position=0,Mandatory=$true,HelpMessage ="Need a Source Directory" )] $srcdir,
	[parameter(position=1,Mandatory=$true,HelpMessage="Need a Destination Directory")] $dstdir
	)
     write-host "srcdir:" $srcdir "dstdir:" $dstdir
}
$s = $args[0]
$d = $args[1]
Fun $s, $d
What happens every time is that the first parameter gets both of the command-line arguments separated by a space so the variable $srcdir is full of the complete command-line.

I've thrashed at this for a bit and it just doesn't seem to work like I expect. I've tried it like in the code above, and also by passing $args[0] and $args[1] right in the function call, and it still acts stupid.

adaz
Mar 7, 2009

You'd do something like this

code:
# This is my code.ps1
param([parameter(position=0,Mandatory=$true,HelpMessage ="Need a Source Directory" )]$sourceDir,
[parameter(position=1,Mandatory=$true,HelpMessage="Need a Destination Directory")]$DestDir) 

function Fun {
	param([string]$srcdir,$dstdir)
        write-host "srcdir: $srcdir dstdir: $dstdir"
}

Fun -srcdir $sourceDir -dstdir $destDir

# end of file

.\blah.ps1 -sourceDir "c:\blah" -DestDir "D:\blah.txt"
The param names are what you call with dashes, and tab completion works fine on them as well, as long as you explicitly declare them in param(). You can then do the same thing inside a script to call a function, using -paramname.



adaz fucked around with this message at 20:36 on Apr 8, 2011

CuddleChunks
Sep 18, 2004

Okay, that makes sense now, thank you.

I was confusing where to use the parameter list and should have put that at the head of the script since I want to manipulate *those* parameters first before we ever get to the function declaration.

CuddleChunks
Sep 18, 2004

Very cool! So here we go, this is designed to work with the Windows version of Handbrake to convert video files that are on the Approved list over to iPod-ready files using the m4v file extension. Feed the script a source and destination directory and then one-by-one it will chew through your files in the source directory and convert them over using the designated Handbrake profile.

The Options section of code makes it easy to point the program at your install of Handbrake and change what profile you use or to update the list of approved files.

I hope this is handy for someone else because it's been a lifesaver for me. I've got tons of shows in AVI format that I want to watch on my iPod but don't want to point and click my way through each series. Now I just let my machine grind away all night on the files without me having to bother with a GUI.

Extra features:
- builds the destination directory if it doesn't exist.
- checks for existing files and skips them so you can safely run this against a directory that you download new content into and it will only process new files.
- spawns one process at a time and documents what file it's working on so that you can see what it's doing in the Powershell window.

code:
# Convert files in srcdir to designated output directory for iPod viewing using Handbrake
# Built 3/29/2011 by DCB  Modified 4/8/2011  Contact: [email]cuddlechunks@gmail.com[/email]
# Parameters: Source Directory, Output Directory
#
# Verify that we got the source and destination directories off the command line. 
param(
      [parameter(position=0,Mandatory=$true,HelpMessage ="Need a Source Directory" )] $srcdir,
      [parameter(position=1,Mandatory=$true,HelpMessage="Need a Destination Directory")] $dstdir
      )

###### OPTIONS #######
# Location of Handbrake CLI 
$HBloc = '"c:\program files\handbrake\handbrakecli"' 
# Name of Preset to use
$presetname = "iPhone & iPod Touch"
# File extensions for valid filetypes
$validfiles = ".avi", ".m4v", ".mp4", ".mkv"

function HBConvert 
{
	# Get input files, filter by approved extensions
	$srcfiles = dir -Path $srcdir | where {$_.psIsContainer -ne $true} | where {$validfiles -contains $_.extension}

	# Check for existence of output dir, create if missing
	if (Test-Path $dstdir) 
	   { echo "Output Path Exists" }
	else { echo "Making directory $dstdir"
	       mkdir $dstdir}

	# Process file list
	foreach ($movie in $srcfiles) 
	{
		$fname = [System.IO.Path]::GetFileNameWithoutExtension($movie)
		$fall = $dstdir + "\" + $fname + ".m4v"

		# Check for existence of output file, skip next step if exists
		if (Test-Path $fall)  
		{
		     echo "Skipping $fall, already exists."
		}
		else 
		{ # Process the files
		      $HBparams= " -i " + '"' + $movie.FullName + '"' + " -o " + '"' + $fall + '" --preset ' + $presetname
		      echo "Now converting: $fname"
		      [system.diagnostics.process]::start($HBloc,$HBparams).WaitForExit() 
		}
	} # end foreach
} # end HBConvert

HBConvert
This was a lot of fun to build, even with some of the frustrations that come with not having a formal book or training program for the language. Thanks to this thread and google I was able to puzzle out the above over the course of a few pleasant evenings. I've been using the script to chew through my files for a couple weeks now and hope it helps someone else.

adaz
Mar 7, 2009

That's a pretty nifty little way to use powershell. Really shows the flexibility and how "easy" it is to build scripts with it, even scripts that might hook into another application.


One minor thing: echo is actually an alias for Write-Output. I only mention it because, technically, you should use Write-Host for outputting things to the console, as Write-Output puts things in the pipeline and can lead to weird things happening if something downstream is accepting pipeline input. Also, Write-Host has built-in options for color formatting and all that.
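
For example, the echo lines in the script could become something like:

code:
Write-Host "Now converting: $fname" -ForegroundColor Green
Write-Host "Skipping $fall, already exists." -ForegroundColor Yellow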

Also a random thing but this:

code:
$fall = $dstdir + "\" + $fname + ".m4v"
in PowerShell can also be written as below. Some people like it your way, some people write it the other way; it's more personal preference than anything. Enclosing everything in "" makes it a bit easier to read in code, I think, but some people are so used to escape characters that they hate it.

code:
$fall = "$dstdir\$fname.m4v"

adaz fucked around with this message at 22:47 on Apr 8, 2011

CuddleChunks
Sep 18, 2004

adaz posted:

code:
$fall = "$dstdir\$fname.m4v"
:3: Awww, thanks powershell. Yeah, I'm coming from a background in procedural languages where it wouldn't occur to me to glob it all together like you have above. Still, I'll keep that in mind for the future. Also, I'll tweak the Echo statements since they worked during testing but I don't want to have something weird happen down the line.

Thanks for the tips!

adaz
Mar 7, 2009

It's probably nothing to worry about, just a more general FYI.

The string expansion/quoting thing is pretty nifty, but one thing that will trip people up is that it sometimes blows up when you're trying to access the subproperties of an object.

Like, say I have a directory entry (a .NET way of accessing an entry in Active Directory) and want to put the postal code in a string. Inside a double-quoted string, PowerShell only expands the variable itself, so the ".postalCode" part gets tacked on as literal text.

So, what you want to do is enclose it in $() (a sub-expression), which makes PowerShell evaluate the whole expression first before it builds the rest of the string.
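
A quick sketch of the difference (the DN and attribute access below are made up for illustration):

code:
$user = [ADSI]"LDAP://CN=Some User,OU=Staff,DC=example,DC=com"   # hypothetical entry

# Only $user gets expanded; ".postalCode" is left as literal text:
"Postal code: $user.postalCode"

# The $() sub-expression evaluates the whole property access first:
"Postal code: $($user.postalCode)"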

Serfer
Mar 10, 2003

The piss tape is real



Ok, I've got a problem. I have a datatable (data returned from an SQL query), and I need to count up the items that have the same value for a single field.
So there are the following fields:

HostName DisplayName EntryDate Usages

and I need to count up all the items that have the same displayname. I understand essentially how it would be done (probably easiest would be a dynamic variable with the same name, that just gets incremented), but I'm unfamiliar with how these are actually done in Powershell.

wwb
Aug 17, 2004

Easiest way would be to have SQL Server do it, actually - "SELECT COUNT(*) AS ItemCount, HostName, DisplayName, EntryDate, Usages GROUP BY HostName" is a step in the right direction.

Serfer
Mar 10, 2003

The piss tape is real



wwb posted:

Easiest way would be to have sql server do it actually- "SELECT COUNT(*) as ItemCount, HostName, DisplayName, EntryDate, Usages GROUP BY HostName" is a step in the right direction.
Well, it's not quite that simple, but that's essentially what I ended up doing.
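
For reference, the pure PowerShell way to get the counts is Group-Object; a sketch, assuming the query results ended up in a variable called $dataTable:

code:
# Count rows per DisplayName (no dynamic variables needed)
$dataTable | Group-Object -Property DisplayName |
    Sort-Object Count -Descending |
    Select-Object Name, Count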
