Scaramouche
Mar 26, 2001

SPACE FACE! SPACE FACE!

I feel like a big hypocrite because I always resolve to read this thread and learn everything in it, and then I never do, and now I have a quick and dirty problem that I would probably know how to solve had I just read the whole drat thread originally.

I've got a list of about 26,000 file names from a file system, and I've got a database dump of about 29,000 file names. I need to see which file names from the database dump aren't present on the file system (obviously there's about 3,000) so I can see which database rows are missing images. First try was to just paste them both into LibreOffice Calc and do a CountIf(range) operation, but LibreOffice apparently doesn't like thousands of rows, because it's been 'not responding' for about 30 minutes now. I don't have Office on this machine, unfortunately.

Is there a way I can use PS to do this elegantly? I could do it in MSDOS batch (but don't want to) by just putting 29,000 of these into a batch file:
code:
dir file1.jpg
dir file2.jpg
dir file3.jpg
...
dir file29701.jpg
And then piping the results of that to file. But this seems inelegant and wasteful to me.

Is there a better powershell way? I'm just pulling this out of my rear end but something like:
FindMissing (29000Files.txt) (Directory with 26000 files path)

would be great.

EDIT-
It looks like Test-Path might have the functionality, but I can't figure out how to pass it a list of files as the criteria, and I can't figure out how to make it spit out the missing file if it returns FALSE.

EDIT2-
Hmm you'd think this would work but it doesn't:
code:
Get-Content FileList.txt | ForEach-Object { Test-Path $_ }
EDIT3-
Adoy. It was because FileList.txt was unicode; saved it as DOS and ended up doing this:
code:
$FileList = Get-Content FileList.txt
foreach ($FileName in $FileList) { if (!(test-path $FileName)) {Out-File FailLog.txt -append -inputobject $FileName} }
It took me longer to figure out how the negation(!) worked than anything else.
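EDIT4- For posterity, it looks like Compare-Object could do the whole job in one shot. An untested sketch, file names as above:
code:
$FileList = Get-Content FileList.txt
$OnDisk = Get-ChildItem -Filter *.jpg | ForEach-Object { $_.Name }
# '<=' marks names that only appear in the reference list,
# i.e. rows in the database dump with no file on disk
Compare-Object -ReferenceObject $FileList -DifferenceObject $OnDisk |
    Where-Object { $_.SideIndicator -eq "<=" } |
    ForEach-Object { $_.InputObject } |
    Out-File FailLog.txt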

Scaramouche fucked around with this message at 04:32 on Nov 3, 2011


Scaramouche
Mar 26, 2001

SPACE FACE! SPACE FACE!

Got some more goodies for you PS gurus. I'm trying to de-dupe that enormous file list mentioned previously (29,000 image files). I'm using this guy's script to find them:
http://blog.codeassassin.com/2007/10/13/find-duplicate-files-with-powershell/

Seems pretty good (goes by size and then md5), and the counts are really handy. Some of the files are duplicated over 1000 times. The problem I'm having though, is it's good to know the counts, but what I really need are all the duplicated file names themselves. Unfortunately the output of the script looks like this:
code:
Count  Name                   Groups
   95  137 255 159 230 40 53  {127769_5.jpg, 127770_4.jpg...}
Where the ... is where the file list gets truncated. I >think< this could be controlled with Format-Table, but I'm not sure how I could get it to work, since in some cases (over 1000 file names) I'm probably going to run up against some kind of internal limit in powershell.

My other option is tweaking the script itself, since I don't actually need file counts and the 'name' column, all I would need is something like this:
code:
ID   FileName
1    127769_5.jpg
1    127770_4.jpg
1    127771_3.jpg
2    131110_1.jpg
2    131111_2.jpg
2    131112_3.jpg
etc.
Where the 'ID' just denotes 'yes, all of these files are the same'. The second option is actually more useful to me, because then I could relatively quickly turn this list into SQL, instead of having to harvest out several comma-delimited lists like the Format-Table solution would. My tweaking has not found success thus far, though, resulting only in errors, no files returned, or every file reported as identical. My questions are thus:
1. Do you guys think I should be concentrating on Format-Table, or just tweaking the script to get my desired output? Can Format-Table columns even go that large?
2. Anyone have any advice on how to get the desired output in the second example from the script linked above?

Thanks in advance for any help.
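EDIT- To make option 2 concrete, here's roughly the shape I have in mind (untested sketch; Get-FileHash needs PowerShell 4+, on older versions the script's own Get-MD5 would slot in instead):
code:
$id = 0
Get-ChildItem *.jpg |
    Group-Object -Property Length |          # cheap pre-filter: same size first
    Where-Object { $_.Count -gt 1 } |
    ForEach-Object { $_.Group } |
    Group-Object -Property { (Get-FileHash $_.FullName -Algorithm MD5).Hash } |
    Where-Object { $_.Count -gt 1 } |
    ForEach-Object {
        $id++
        foreach ($f in $_.Group) {
            New-Object psobject -Property @{ ID = $id; FileName = $f.Name }
        }
    } | Format-Table ID, FileName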

Scaramouche
Mar 26, 2001

SPACE FACE! SPACE FACE!

kampy posted:

I would just modify the script a bit, or perhaps just run something like:
code:
$dupes = .\unmodifiedscriptfromtheweb.ps1 c:\whatever\path
foreach ($dupe in $dupes)
{
	foreach ($d in $dupe.Group)
	{
		# Change $d.Name to $d.FullName for the full path.
		"{0} *{1}" -f $dupe.Name, $d.Name
	}
	write-host ""
}
You could also modify the script a bit so that it produces a prettier md5 output by modifying the Get-MD5 function to something like this:
code:
function Get-MD5([System.IO.FileInfo] $file = $(throw 'Usage: Get-MD5 [System.IO.FileInfo]'))
{
    # This Get-MD5 function sourced from:
    # http://blogs.msdn.com/powershell/archive/2006/04/25/583225.aspx
    $stream = $null;
    $cryptoServiceProvider = [System.Security.Cryptography.MD5CryptoServiceProvider];
    $hashAlgorithm = new-object $cryptoServiceProvider
    $stream = $file.OpenRead();
    $hashByteArray = $hashAlgorithm.ComputeHash($stream);
    $stream.Close();

    ## We have to be sure that we close the file stream if any exceptions are thrown.
    trap
    {
        if ($stream -ne $null) { $stream.Close(); }
        break;
    }

    foreach ($byte in $hashByteArray)
    {
        $returnme += $byte.ToString("x2")
    }
    return $returnme
}
Edit:
You could also do something like this:
code:
$output = ""
$dupes = .\unmodifiedscriptfromtheweb.ps1 c:\whatever\path
foreach ($dupe in $dupes)
{
	foreach ($d in $dupe.Group)
	{
		$whatwewant = @{md5=$dupe.Name; filename=$d.Name}
		[array] $output += new-object psobject -property $whatwewant
	}
}
$output

Sorry for the late reply, but thanks for the help. The project got backburnered for a while and we ended up just deleting all the files and redownloading and enforcing unique names on creation. But I'm going to keep this around regardless, thanks!

Scaramouche
Mar 26, 2001

SPACE FACE! SPACE FACE!

Hey guys, I'm back to leeching off your expertise without ever contributing anything once again. I've got a bunch of product images that are 500 pixels tall by (x) pixels wide. What I want to do is identify files where (x) < 500, and then 'pad out' the image file with white pixels (equally on either side) so it ends up being 500 x 500.

I know I can do this in ImageMagick but would rather not install it on a production server (hundreds of thousands of images are involved), so I figured I'd ask you guys if it's possible in PowerShell. I know from reading around that it's possible to resize things using WIA (http://blogs.msdn.com/b/powershell/archive/2009/03/31/image-manipulation-in-powershell.aspx), but it looks a bit daunting and I was wondering if anyone had any experience doing it. I also can't find any reference to padding the image instead of resizing it.

Scaramouche
Mar 26, 2001

SPACE FACE! SPACE FACE!

Thanks for that. I actually have a .vbs script that can do it, so I'm aware of the .NET solution, which goes exactly as you surmise: create a white 500x500 canvas, paste centered, and your uncle is Bob. I was hoping to centralize around PowerShell and drop this array of .vbs, .bat, and .cmd files that has been following me around since the old days. I'm still pretty sure it's possible; I just have to figure out how to access external libraries properly in PowerShell. I know the .NET syntax but am not sure how to reference it. Basically I want to be able to do:
code:
Using System.Drawing
But in PowerShell. Would it be something like this?
code:
[void]([reflection.assembly]::loadfile("C:\Windows\Microsoft.NET\Framework\v2.0.50727\System.Drawing.dll"))
OR
[void]([reflection.assembly]::loadfile("C:\Windows\Microsoft.NET\Framework\v2.0.50727\wiascr.dll"))
Sorry for the basic nature of the question but I've always been using the 'built in' functionality of PS until this point.

EDIT- Ugh, stupid question; I didn't notice your WIA call in the example code until I tried to run it just now. Embarrassing.
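EDIT2- Leaving this here for the next guy: Add-Type looks like the cleaner way to pull in the assembly, no hard-coded framework path required. Untested sketch of the whole pad-to-500 idea, file names made up:
code:
Add-Type -AssemblyName System.Drawing

$img = [System.Drawing.Image]::FromFile("C:\images\12345_1.jpg")
if ($img.Width -lt 500) {
    $canvas = New-Object System.Drawing.Bitmap(500, 500)
    $g = [System.Drawing.Graphics]::FromImage($canvas)
    $g.Clear([System.Drawing.Color]::White)
    # paste centered horizontally; height is already 500
    $g.DrawImage($img, [int]((500 - $img.Width) / 2), 0, $img.Width, $img.Height)
    $g.Dispose()
    $img.Dispose()
    $canvas.Save("C:\images\12345_1_padded.jpg", [System.Drawing.Imaging.ImageFormat]::Jpeg)
    $canvas.Dispose()
}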

Scaramouche fucked around with this message at 01:40 on Mar 13, 2012

Scaramouche
Mar 26, 2001

SPACE FACE! SPACE FACE!

Thanks Adaz, think I got it mostly figured out. I just needed to get onto the right new-Object x.y.z syntax before I could start hacking around. After that it's trivial to re-word the .vbs script. If I can get it cleaned up I'll try to post it here.

Scaramouche
Mar 26, 2001

SPACE FACE! SPACE FACE!

Hey there 'Shellers (need a better name for PS enthusiasts), for once I'm not coming here with a naked cry for help. Instead this is more of an opinion question. I'm automating some FTP scripts and I'm tired of doing it in batch files so I'm going to PowerShell it. I'm probably going to do this (first answer):
http://stackoverflow.com/questions/1867385/upload-files-with-ftp-using-powershell

But thought I'd check in with you guys to see if there's a cooler way. Right now it's going to be one file (since it's always known and doesn't change) so I don't need to worry about for each or anything like that.

One question I did have: if I needed to extend this to SFTP/FTPS, is it actually just as easy as changing the WebRequest.Methods+xxx call?
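EDIT- From what I can tell, FTPS is just a property flip on FtpWebRequest, while SFTP is a completely different protocol that needs a third-party library (WinSCP's .NET assembly, Posh-SSH, that kind of thing). Untested:
code:
$request = [System.Net.FtpWebRequest][System.Net.WebRequest]::Create("ftp://ftp.example.com/somefile.txt")
$request.EnableSsl = $true   # explicit FTPS; the Methods+xxx call stays the same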

Scaramouche
Mar 26, 2001

SPACE FACE! SPACE FACE!

This is a theoretical question I guess, but in cases like the above where performance degrades after x number of files, would something DOS-based be faster? E.g. http://stackoverflow.com/questions/51054/batch-file-to-delete-files-older-than-n-days

I had no idea PowerShell crapped out like that after a certain amount of files, though I'm usually working with at least 300k files when I'm using it.

Scaramouche
Mar 26, 2001

SPACE FACE! SPACE FACE!

Hola powershell companeros, another day, another powershell question. I'm making a powershell script to:
1. Log into an ftp (done)
2. Get a directory listing (done)
3. Find the last file in the directory listing, which is randomly named (failing hard)
4. Download said last file (not done but easy)

I'm doing all this using native objects because this thing has to be somewhat portable, otherwise I'd be using the great PSFTP client (http://gallery.technet.microsoft.com/scriptcenter/PowerShell-FTP-Client-db6fe0cb). Basically, what I've done returns a stream, and I'm boggled as to how to get the last line out of it.

Here's what I've got so far, hacked together from a couple guys' FTP examples:
code:
[void] [System.Reflection.Assembly]::LoadWithPartialName("system.net")

$ftpserver = "ftp://adaz.com/Orders"
$ftpuser = "Jelmylicious"
$ftppassword = "passworddonotsteal"

$ftpconn = [system.net.ftpwebrequest] [system.net.webrequest]::create($ftpserver)
$ftpconn.Credentials = New-Object System.Net.NetworkCredential($ftpuser,$ftppassword)

$ftpconn.method = [system.net.WebRequestMethods+ftp]::listdirectorydetails
$ftpresponse = $ftpconn.getresponse()
$ftpstream = $ftpresponse.getresponsestream()

  $buffer = new-object System.Byte[] 1024
  $encoding = new-object System.Text.AsciiEncoding 

  $outputBuffer = "" 
  $doMore = $false 

  do 
  { 
    start-sleep -m 1000 

    $doMore = $false 
    $ftpstream.ReadTimeout = 1000 

    do 
    { 
      try 
      { 
        $read = $ftpstream.Read($buffer, 0, 1024) 

        if($read -gt 0) 
        { 
          $doMore = $true 
          $outputBuffer += ($encoding.GetString($buffer, 0, $read)) 
        } 
      } catch { $doMore = $false; $read = 0 } 
    } while($read -gt 0) 
  } while($doMore)
My problems are twofold:
1. I've got no idea how to do anything with this stream. I've tried my VB tricks with Seek and split(\n), but I get gobbledygook, null, or errors every time. I think there's something different about how PowerShell handles them.
2. Even after that I'll still have to isolate the file name itself. The output I'm looking at looks like:
code:
Date   Time      Size(bytes)   Filename
Ideally I guess I'd want to find EOF, go back one CR/LF, and then grab the string until the first TAB (assuming that's the separator used). I don't need you guys to write me a whole solution, but just give me either a way to do seek/etc. like I'm used to or (thinking outside the box) turn the listing into something I can do get-child type operations on.
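EDIT- Noting where I'd start: wrapping the response stream in a StreamReader should spare all that manual buffering, and the last line can be split out afterwards. Untested, and the tab separator is an assumption as above:
code:
$reader = New-Object System.IO.StreamReader($ftpstream)
$listing = $reader.ReadToEnd()
$reader.Close()

# last non-empty line of the listing, then the final tab-separated field
$lastLine = ($listing -split "`r?`n" | Where-Object { $_ }) | Select-Object -Last 1
$fileName = ($lastLine -split "`t")[-1]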

Scaramouche
Mar 26, 2001

SPACE FACE! SPACE FACE!

Thanks mang, sorry for the late reply I took a little 'computer break' this weekend. I'll check that out.

Scaramouche
Mar 26, 2001

SPACE FACE! SPACE FACE!

Hay guys I'm back, this time with a lot more PowerShell under my belt and asking kind of an academic question. I've made a short little script that harvests tracking information from CSV files downloaded from an FTP server, checks the archive directory to see if the file already exists, and if it doesn't, emails the contents of the file somewhere and moves it to the archive folder. It's basically (boring stuff cut out):

code:
$f1Path = "d:\blah\"
$f2Path = "d:\blah\archive\"

$f1 = Get-ChildItem -path $f1Path -Filter *.csv
$f2 = Get-ChildItem -path $f2Path -Filter *.csv

If ( $f1 -and $f2 ) {
	$f3 = Compare-Object -ReferenceObject $f1 -DifferenceObject $f2 | Where-Object { $_.SideIndicator -eq "<="}
}


if ($f3) {
	#build email body
	$BodyText = foreach ($a in $f3) { Import-CSV ($f1Path + $a.InputObject) -header("HEADERS") | Select-Object HEADERS | convertto-html }
	
	#send email etc.
	$emailMessage.Body = $BodyText
	 
	#copy files to archive
	foreach ($a in $f3) { copy-item -Path ($f1Path + $a.InputObject) -Destination $f2Path }
}
So I was showing the recipient of this email how things would work, and as an example I used
code:
$uo = Import-CSV tracktest.csv
$uo

Column1     : Value
Column2     : Value
Column3     : Value
Column4     : Value
etc.
And apparently, they freakin love the default output to screen that occurs as above (i.e. all columns basically turned into rows), and I can't seem to figure out how to do it easily in PowerShell. All the transpose code I've found is messy and... messy. Is there an easy way to have the CSV object output to string as a vertical row of values in one column, instead of Column1, Column2, Column3?

Scaramouche
Mar 26, 2001

SPACE FACE! SPACE FACE!

Titan Coeus posted:

Try this:

code:
Import-CSV tracktest.csv | Format-List | Out-String

Well check that poo poo out. I had already tried that (realizing that Out-* was the only thing I could use) but it didn't work. But then I got the brainwave of setting the email HTML to $false, and sure enough it came through. Makes me wonder how much the HTML aspect was screwing up my other tries at it, and if it would have worked all along with HTML if I just thought to put '<PRE>' in front of it. Thanks Titan Coeus, you're worthy of stealing a pie from Estelle Getty.
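EDIT- Noting for next time: ConvertTo-Html has an -As List switch, so something like this might have kept the vertical layout even with the HTML mail turned on (untested):
code:
$emailMessage.Body = Import-Csv tracktest.csv | ConvertTo-Html -As List | Out-String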

Scaramouche
Mar 26, 2001

SPACE FACE! SPACE FACE!

This should work but it doesn't, and the maddening thing is I've done this before.

I'm running this dedupe finder:
http://poshcode.org/4008

I want to pipe the output to a text file; I've done this in the past, and in fact after I make this post I'm going to click the 'find other posts by this user' button to see if I figured it out here.

When I try this:
code:
.\finddupe.ps1 *.jpg >> fff.txt
I get a file called fff.txt that is zero bytes.

When I try this:
code:
.\finddupe.ps1 *.jpg | Out-File fff.txt
I get a file called fff.txt that is 2 bytes but still empty.

When I try this:
code:
.\finddupe.ps1 *.jpg | tee fff.txt
I don't get anything; the file doesn't even get created. The output of the script above uses Write-Host, which >> should be hijacking, based on everything I've read. E.g., the summary output:
code:
write-host "Number of Files checked: $($files.count)."
write-host "Number of duplicate files: $m."
Write-host "$($hashfiles) files hashed."
Write-Host "$($hashsize) bytes hashed."
write-host "$dupebytes duplicates bytes."
What am I missing?

Scaramouche
Mar 26, 2001

SPACE FACE! SPACE FACE!

Hmm, thanks guys. The weird thing is, the date-time stamp on the script is from when I did this last, and I was going over 500,000 files, so I must have figured out a way to get that data without changing the actual script. After doing a bit of reading it makes sense that this won't work since write-host is meant for screen only. I guess my memory was playing tricks on me, which is probably the worst thing to rely on. You probably spend more time trying to do something "because I remembered it was easy last time" than you do trying it the right way.
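In case anyone else digs this thread up: the clean fix is swapping the script's write-host lines for write-output so the text actually goes down the pipeline. Newer PowerShell can also redirect it without touching the script:
code:
# PowerShell 5+ routes Write-Host to the information stream (6), so these work:
.\finddupe.ps1 *.jpg 6>> fff.txt   # capture just the Write-Host lines
.\finddupe.ps1 *.jpg *>  fff.txt   # capture every output stream
# On older versions, editing the script to use Write-Output is the only clean fix.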

Scaramouche
Mar 26, 2001

SPACE FACE! SPACE FACE!

CLAM DOWN posted:

Hey guys, I have to transfer a large number (>15,000,000) of small files totaling about 2TB from a 2003 to 2008 R2 server, maintaining folder structure and NTFS ACLs. Should I stick with robocopy, or would Copy-Item suit me fine here?

Not powershell necessarily but doesn't winrar preserve permissions? Sounds like it'd be easier to copy one big file than lots of little ones.

Scaramouche
Mar 26, 2001

SPACE FACE! SPACE FACE!

Yeah, Task Scheduler is a big fuckup in all respects really: paths, quotation marks, permissions, all of it. It's amazing how they tried to duplicate the functionality of cron and made it even more obtuse to use.

Scaramouche
Mar 26, 2001

SPACE FACE! SPACE FACE!

Karthe posted:

I'm trying to write a quick Powershell batch script that calls a couple of other Powershell scripts that I've written. These other scripts output Japanese characters out to the console, and while console output looks fine when I run the script files by themselves, when I run them via my batch script all of the Japanese characters are replaced by question marks. How can I preserve UTF-8 output when I'm calling my scripts?

Here's what I have so far:

That's a weird one, this might help:
http://stackoverflow.com/questions/22349139/utf-8-output-from-powershell

According to one of those guys, it's actually a bug that it doesn't work in UTF16 by default.

Scaramouche
Mar 26, 2001

SPACE FACE! SPACE FACE!

myron cope posted:

Is it possible to schedule a powershell task that runs without launching a window? I just want it to do its thing, no window needed

What OS? I have a bunch of Scheduled Tasks on Windows Server 2003 (oh god, don't remind me about EOL) that just run in the background, with these settings:
Account: (some specific service account)
Run whether user is logged on or not
Run with highest privileges
Action: Start a program
Program/script: PowerShell
Add arguments: .\MyScript.ps1
Start in: c:\jobs\

Scaramouche
Mar 26, 2001

SPACE FACE! SPACE FACE!

myron cope posted:

It's a Windows 8.1 machine. It's not a huge deal, but I don't need to see the output so if it's possible to hide I would.

We do have another pc that would be better anyway, maybe I should just move it to that one.

I just tried the same setup on Windows 7 and I think Braintist is right; there was no popup and I didn't bother messing with a window style.
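For reference, the argument line I'd try for a fully hidden run (the flags are standard powershell.exe switches; the script path is made up):
code:
powershell.exe -NonInteractive -WindowStyle Hidden -File C:\jobs\MyScript.ps1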

Scaramouche
Mar 26, 2001

SPACE FACE! SPACE FACE!

Might as well post this here. I'm hoping to automate grabbing some shopify product information via their API and dump/display it somewhere so it can be compared against a database. This is mostly half-assed together from other people's scripts, and the first part works (connecting/retrieving) and what errors out is actually using/displaying what comes down in any usable fashion:
code:
$Uri = "https://blahdeblah.myshopify.com/admin/products.json"
$apikey = "blah de blah"
$password = "blah de blah"
$headers = @{"Authorization" = "Basic "+[System.Convert]::ToBase64String([System.Text.Encoding]::UTF8.GetBytes($apikey+":"+$password))}

write-output Invoke-RestMethod -Uri $uri -ContentType "application/json" -Method Get -Headers $headers -UseBasicParsing | ConvertFrom-Json
so everything works until the last line; if I just assign the output of Invoke-RestMethod I get a truncated string of stuff (which is accurate/valid), so I was hoping to parse the JSON in powershell itself. I'm only interested in the two values of sku and fulfillment service.

The JSON is 'valid' in the sense that I normally take this output and manually parse it in MSSQL using JSON_VALUE/OPENJSON, but powershell doesn't seem to like it, saying:
code:
ConvertFrom-Json : Invalid JSON primitive: Invoke-RestMethod.
At C:\ProdCheck.ps1:6 char:106
+ ... cation/json" -Method Get -Headers $headers | ConvertFrom-Json
+                                                  ~~~~~~~~~~~~~~~~
    + CategoryInfo          : NotSpecified: (:) [ConvertFrom-Json], ArgumentException
    + FullyQualifiedErrorId : System.ArgumentException,Microsoft.PowerShell.Commands.ConvertFromJsonCommand
Googling around, I'm seeing a lot of people running into this problem using Get-Content (to read files locally) and solving it by using -Raw and/or removing line breaks, but I was hoping to do this relatively "natively", since I know it's good JSON coming down the pipe from Invoke-RestMethod.

Ideally I'd like the output to look like:
SKU : fulfillment_service
1 : yes
2 : no

etc etc. if anyone wants to make it even fancier with say CSV-output or the like.

Scaramouche
Mar 26, 2001

SPACE FACE! SPACE FACE!

Hey! Thanks guys; The Fool especially, I quite liked seeing the iteration of outputs and seeing the differences between them.

I do believe the prevailing theory here is correct: I was trying to do things after the ConvertFrom-Json when I should have been manipulating it natively as the PSObject that Invoke-RestMethod returns. I've got "an" output now, just need to tweak things so it is "the" output I want. I'm also doing some array-within-array stuff, so I foresee some foreach coming up in my future.
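EDIT- For posterity, the fixed version is roughly this: drop the write-output/ConvertFrom-Json pair and just work with the object Invoke-RestMethod already parsed. Sketch only; the products/variants layout is assumed from Shopify's products.json:
code:
$resp = Invoke-RestMethod -Uri $Uri -ContentType "application/json" -Method Get -Headers $headers
# $resp is already a parsed object; variants carry sku and fulfillment_service
$resp.products | ForEach-Object { $_.variants } |
    Select-Object sku, fulfillment_service |
    Export-Csv skus.csv -NoTypeInformation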

Scaramouche
Mar 26, 2001

SPACE FACE! SPACE FACE!

Powershell is p cool. I haven't messed with it seriously in quite a while but in a couple hours I just set up:

- A scheduled task that runs every 10 mins
- Checks when it last ran successfully (using Get-ScheduledTask)
- Builds an SQL query string from the date stamp acquired above, if it's more than 10 minutes old
- Connects to the logging database and gets info (using System.Data.SqlClient.SqlConnection)
- Parses the results and builds a string out of the exceptions
- If the exception string <> "", logs an event to the Event Viewer (using Write-EventLog)
- If the exception string <> "", emails the exceptions (using Net.Mail.SmtpClient and System.Net.Mail.MailMessage)
- If the exception string <> "", posts the exceptions to a dedicated Slack channel (using Invoke-RestMethod and hooks.slack.com)

It's real neat!

Scaramouche
Mar 26, 2001

SPACE FACE! SPACE FACE!

So here's a thing. I'm building JSON fulfillment requests from a CSV file. It's not exact, but it's roughly this:
code:
$csvJSONout = '[';
$csvTop = 'order_id','tracking_number','items_sku','items_quantity';
$csvIn = Import-Csv -Path .\test2.csv -Header $csvTop;
foreach ($p in $csvIn)
{
    $csvJSONout+=('{"order_id":' + $p.'order_id' + '"' + ',"packages": [{"tracking_number":' + '"' + $p.'tracking_number' + '"' + ',"items": [{ "sku": ' + '"' + $p.'items_sku' + '"' + ',"quantity": ' + $p.'items_quantity' + '}] }] },');
}
$csvJSONout += ']';
(I know about the bad comma on the last row)

But what I've been told now, is that the data isn't normalized very well. I might have cases where order_id is not unique, so like:
code:
+----------+----------+--------+-----+
| order_id | tracking |  sku   | qty |
+----------+----------+--------+-----+
|     1111 | XYZ      | BLEH-X |   1 |
|     1111 | XYZ      | BLEH-X |   1 |
|     1112 | ZYX      | BLAH11 |   4 |
|     1111 | XYZ      | BLIHZZ |   1 |
+----------+----------+--------+-----+
Ideally I'd like to be able to "roll up" both:
- All SKUs belonging to an order_id and
- Sum up Qty when a SKU is indicated for that order_id over multiple rows (e.g. BLEH-X above should be one entry for Qty 2)

I would in turn use this information to make one JSON request for the order, that keeps adding to tracking_number / items array for that order_id instead of multiple individual requests against the same order_id the current method would do.

I'm pretty sure I'd know how to do this in C#/VB, but am wondering if there's some wow neato thing like | Combine-ByKey "order_id" or something like that in PowerShell I'm not aware of.

If it helps, I can ensure that the CSV file is sorted by order_id, e.g. the next row would always be either the same order or a new one.

EDIT-And if you want to make cool pre/code tables like the above you can go here:
https://github.com/ozh/ascii-tables

EDIT EDIT-Hmm Group-Object -> Select-Object
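Sketching that out before I lose it (untested; headers as above, and ConvertTo-Json needs PS3+):
code:
$rolledUp = $csvIn | Group-Object order_id | ForEach-Object {
    # sum quantities per SKU within the order
    $items = $_.Group | Group-Object items_sku | ForEach-Object {
        $qty = ($_.Group | ForEach-Object { [int]$_.items_quantity } | Measure-Object -Sum).Sum
        @{ sku = $_.Name; quantity = $qty }
    }
    @{
        order_id = $_.Name
        packages = @(@{
            tracking_number = $_.Group[0].tracking_number
            items           = @($items)
        })
    }
}
$rolledUp | ConvertTo-Json -Depth 5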

Scaramouche fucked around with this message at 08:38 on Jun 26, 2019

Scaramouche
Mar 26, 2001

SPACE FACE! SPACE FACE!

Does AWS PowerShell support, you know, actually querying the table in some fashion? Or is it just a meta-organizer tool?

I'm partway through hooking one up (I can't seem to provide a sort key that satisfies "the provided key element does not match the schema"), and I saw this on a related StackOverflow:
code:
So I'm running a scan of a DynamoDB table with ~500 records using the AWS CLI through Powershell, because the AWS Powershell Tools don't support DDB query/scan operations
So have I been wasting my time?

Scaramouche
Mar 26, 2001

SPACE FACE! SPACE FACE!

I sort of got it working using aws-cli, but as mentioned, the documentation is pretty dire for the PowerShell-specific stuff. I got like 90% of the way there and will probably revisit it using native PS objects; the biggest stumbling block for me is that you must use the existing indexes as part of any query object, and the docs never really specify where/how you should do that, so some random guess-and-test was involved. I'd probably have it working in PS, but one of the bright sparks who made the table (and didn't make any useful indexes) named one of the columns a reserved AWS keyword, and normal PowerShell escaping doesn't work for getting around that.

I think there might be something to this guy's approach but I gave up fixing it halfway through as the impression I get is that it's no longer current:
https://www.powershellgallery.com/packages/domainAwsPowershellTools/1.0.2/Content/domainAwsPowershellTools.psm1
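EDIT- For the record, the escape hatch for reserved keywords turns out to be on the DynamoDB side, not the PowerShell side: expression attribute names. An aws-cli sketch (table and attribute names made up; watch the quoting if you run it from PowerShell, since the inner JSON quotes need escaping there):
code:
# "#st" stands in for the reserved word; DynamoDB substitutes it server-side
aws dynamodb query --table-name MyTable --key-condition-expression "Id = :id" --filter-expression "#st = :s" --expression-attribute-names '{"#st":"Status"}' --expression-attribute-values '{":id":{"S":"1234"},":s":{"S":"open"}}'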


Scaramouche
Mar 26, 2001

SPACE FACE! SPACE FACE!

Had a fun thing that I nearly came into this thread to ask about. I've been using the type-three-letters-then-tab autocomplete in PowerShell ISE to run my various .ps1's. Turns out it will pick other extensions first if they have the same name... I was wondering why that .CSV file kept opening in Excel every time I tried to run the script!
