|
I feel like a big hypocrite, because I always resolve to read this thread and learn everything in it, and then I never do, and now I have a quick and dirty problem that I would probably know how to solve had I just read the whole drat thread originally. I've got a list of about 26,000 file names from a file system, and I've got a database dump of about 29,000 file names. I need to see which file names from the database dump aren't present on the file system (obviously there's about 3,000) so I can see which database rows are missing images. First try was to just paste them both into LibreOffice Calc and do a CountIf(range) operation, but LibreOffice apparently doesn't like tens of thousands of rows, because it's been 'not responding' for about 30 minutes now. I don't have Office on this machine, unfortunately. Is there a way I can use PS to do this elegantly? I could do it in MS-DOS batch (but don't want to) by just putting 29,000 of these into a batch file: code:
Is there a better PowerShell way? I'm just pulling this out of my rear end, but something like FindMissing (29000Files.txt) (Directory with 26000 files path) would be great. EDIT- It looks like Test-Path might have the functionality, but I can't figure out how to pass it a list of files as the criteria, and I can't figure out how to make it spit out the missing file name when it returns FALSE. EDIT2- Hmm, you'd think this would work, but it doesn't: code:
Adoy. It was because FileList.txt was Unicode; I saved it as DOS encoding and ended up doing this: code:
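For posterity, a minimal sketch of the Test-Path approach that worked here; the file and directory paths are placeholders, not the actual ones used:

```powershell
# FileList.txt is the list of names the database expects, one per line,
# saved in ANSI/DOS encoding so Get-Content reads it cleanly.
$expected = Get-Content 'C:\temp\FileList.txt'

# Keep every name that does NOT exist on disk and write the misses to a file.
$expected |
    Where-Object { -not (Test-Path (Join-Path 'D:\images' $_)) } |
    Out-File 'C:\temp\Missing.txt'
```

Compare-Object against (Get-ChildItem 'D:\images').Name would get the same answer without touching Test-Path at all.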
Scaramouche fucked around with this message at 04:32 on Nov 3, 2011
# ¿ Nov 3, 2011 02:30 |
|
|
# ¿ Apr 26, 2024 01:13 |
|
Got some more goodies for you PS gurus. I'm trying to de-dupe that enormous file list mentioned previously (29,000 image files). I'm using this guy's script to find them: http://blog.codeassassin.com/2007/10/13/find-duplicate-files-with-powershell/ It seems pretty good (it goes by size and then MD5), and the counts are really handy; some of the files are duplicated over 1,000 times. The problem I'm having, though, is that while it's good to know the counts, what I really need are all the duplicated file names themselves. Unfortunately the output of the script looks like this: code:
My other option is tweaking the script itself, since I don't actually need the file counts or the 'name' column; all I need is something like this: code:
1. Do you guys think I should be concentrating on Format-Table or just tweaking the script to get my desired output? Can Format-Table columns even go that wide? 2. Anyone have any advice on how to get the desired output in the second example from the script linked above? Thanks in advance for any help.
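For the record, a sketch of getting just the duplicate paths, one per line. This assumes PowerShell 3.0+ for Get-FileHash and -File; the 2011-era script linked above does the same MD5 work through System.Security.Cryptography instead. The directory path is a placeholder:

```powershell
# Group by file size first (cheap), then hash only the size collisions,
# and emit the full path of every file in a duplicated hash group.
Get-ChildItem 'D:\images' -Recurse -File |
    Group-Object Length |
    Where-Object { $_.Count -gt 1 } |
    ForEach-Object { $_.Group } |
    Group-Object { (Get-FileHash $_.FullName -Algorithm MD5).Hash } |
    Where-Object { $_.Count -gt 1 } |
    ForEach-Object { $_.Group.FullName }
```

Pipe the whole thing to Out-File 'dupes.txt' to capture the list.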
|
# ¿ Nov 6, 2011 03:27 |
|
kampy posted:I would just modify the script a bit, or perhaps just run something like: Sorry for the late reply, but thanks for the help. The project got backburnered for a while, and we ended up just deleting all the files, re-downloading them, and enforcing unique names on creation. But I'm going to keep this around regardless, thanks!
|
# ¿ Nov 23, 2011 19:55 |
|
Hey guys, I'm back to leeching off your expertise without ever contributing anything once again. I've got a bunch of product images that are 500 pixels tall by (x) pixels wide. What I want to do is identify files where (x) < 500 and then 'pad out' the image file with white pixels (equally on either side) so it ends up being 500 x 500. I know I can do this in ImageMagick, but I'd rather not install it on a production server (hundreds of thousands of images are involved), so I figured I'd ask you guys if it's possible in PowerShell. I know from reading around that it's possible to resize things using WIA (http://blogs.msdn.com/b/powershell/archive/2009/03/31/image-manipulation-in-powershell.aspx), but it looks a bit daunting and I was wondering if anyone had experience doing it. I also can't find any reference to padding an image instead of resizing it.
|
# ¿ Mar 12, 2012 02:49 |
|
Thanks for that. I actually have a .vbs script that can do it, so I'm aware of the .NET solution, which goes exactly as you surmise: create a white 500x500 canvas, paste centered, and your uncle is Bob. I was hoping to centralize around PowerShell and drop this array of .vbs, .bat, and .cmd files that have been following me around since the old days. I'm still pretty sure it's possible; I just have to figure out how to access external libraries properly in PowerShell. I know the .NET syntax but am not sure how to reference it. Basically I want to be able to do: code:
code:
EDIT- Ugh, stupid question; I didn't notice your WIA call in the example code until I tried to run it just now. Embarrassing. Scaramouche fucked around with this message at 01:40 on Mar 13, 2012
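For reference, the System.Drawing version of the "white canvas, paste centered" approach reads roughly like this in PowerShell (paths are placeholders; images are assumed to be exactly 500px tall already, as described above):

```powershell
Add-Type -AssemblyName System.Drawing

# Load a narrow source image; placeholder path.
$src = [System.Drawing.Image]::FromFile('C:\images\narrow.jpg')
if ($src.Width -lt 500) {
    $canvas = New-Object System.Drawing.Bitmap 500, 500
    $g = [System.Drawing.Graphics]::FromImage($canvas)
    $g.Clear([System.Drawing.Color]::White)            # the white padding
    $x = [int](250 - $src.Width / 2)                   # center horizontally
    $g.DrawImage($src, $x, 0, $src.Width, $src.Height)
    $g.Dispose()
    $canvas.Save('C:\images\padded.jpg', [System.Drawing.Imaging.ImageFormat]::Jpeg)
    $canvas.Dispose()
}
$src.Dispose()
```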
# ¿ Mar 12, 2012 21:33 |
|
Thanks Adaz, I think I've got it mostly figured out. I just needed to get onto the right New-Object x.y.z syntax before I could start hacking around. After that it's trivial to rework the .vbs script. If I can get it cleaned up I'll try to post it here.
|
# ¿ Mar 13, 2012 21:47 |
|
Hey there 'Shellers (need a better name for PS enthusiasts), for once I'm not coming here with a naked cry for help. Instead this is more of an opinion question. I'm automating some FTP scripts, and I'm tired of doing it in batch files, so I'm going to PowerShell it. I'm probably going to do this (first answer): http://stackoverflow.com/questions/1867385/upload-files-with-ftp-using-powershell But I thought I'd check in with you guys to see if there's a cooler way. Right now it's going to be one file (since it's always known and doesn't change), so I don't need to worry about foreach or anything like that. One question I did have: if I needed to extend this to SFTP/FTPS, is it actually just as easy as changing the WebRequestMethods+xxx call?
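For reference, the FtpWebRequest approach from that Stack Overflow answer boils down to something like this (server, credentials, and paths are invented):

```powershell
# Upload one known file over FTP.
$request = [System.Net.FtpWebRequest]::Create('ftp://ftp.example.com/upload/report.csv')
$request.Method = [System.Net.WebRequestMethods+Ftp]::UploadFile
$request.Credentials = New-Object System.Net.NetworkCredential('user', 'pass')
$request.UseBinary = $true
# For FTPS (FTP over TLS) it really is nearly that easy:
# $request.EnableSsl = $true

$bytes = [System.IO.File]::ReadAllBytes('C:\data\report.csv')
$stream = $request.GetRequestStream()
$stream.Write($bytes, 0, $bytes.Length)
$stream.Close()
$request.GetResponse().Close()
```

SFTP, on the other hand, is a different protocol entirely (SSH, not FTP) and needs a third-party library such as WinSCP's .NET assembly; there is no WebRequestMethods switch for it.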
|
# ¿ Apr 5, 2012 03:42 |
|
This is a theoretical question I guess, but in cases like the above where performance degrades after x number of files, would something DOS-based be faster? e.g. http://stackoverflow.com/questions/51054/batch-file-to-delete-files-older-than-n-days I had no idea PowerShell crapped out like that after a certain number of files, though I'm usually working with at least 300k files when I'm using it.
|
# ¿ Jun 9, 2012 02:21 |
|
Hola PowerShell compañeros, another day, another PowerShell question. I'm making a PowerShell script to:
1. Log into an FTP server (done)
2. Get a directory listing (done)
3. Find the last file in the directory listing, which is randomly named (failing hard)
4. Download said last file (not done but easy)
I'm doing all this using native objects because this thing has to be somewhat portable; otherwise I'd be using the great PSFTP client (http://gallery.technet.microsoft.com/scriptcenter/PowerShell-FTP-Client-db6fe0cb). Basically what I've got back is a stream, and I'm boggled as to how to get the last line of it. Here's what I've got so far, hacked together from a couple of guys' FTP examples: code:
1. I've got no idea how to do anything with this stream. I've tried my VB tricks with Seek and split(\n), but I get gobbledygook, null, or errors all the time. I think there's something different about how PowerShell handles streams. 2. Even after that I'll still have to isolate the file name itself. The output I'm looking at looks like: code:
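For reference, once the FtpWebResponse is in hand the stream wrangling is usually just a StreamReader. A sketch, assuming $response holds the listing response from the request above:

```powershell
# Read the whole listing response into one string, then split into lines
# and take the last non-empty one.
$reader = New-Object System.IO.StreamReader($response.GetResponseStream())
$listing = $reader.ReadToEnd()
$reader.Close()

$lines = $listing -split "`r?`n" | Where-Object { $_ -ne '' }
$lastLine = $lines[-1]

# With Method = ListDirectory the line is already just the file name.
# With ListDirectoryDetails you get the long unix-style line, where the
# name is typically the final whitespace-separated field:
$fileName = ($lastLine -split '\s+')[-1]
```

Which field holds the name depends on the server's listing format, which is why ListDirectory (names only) is easier to parse than ListDirectoryDetails.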
|
# ¿ Jun 16, 2012 03:01 |
|
Thanks mang, sorry for the late reply I took a little 'computer break' this weekend. I'll check that out.
|
# ¿ Jun 18, 2012 20:39 |
|
Hey guys, I'm back, this time with a lot more PowerShell under my belt and asking kind of an academic question. I've made a short little script that harvests tracking information from CSV files downloaded from an FTP server, checks the archive directory to see if each file already exists, and if it doesn't, emails the contents of the file somewhere and moves it to the archive folder. It's basically (boring stuff cut out): code:
code:
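The flow described above, as a rough sketch with invented paths and addresses:

```powershell
# For each downloaded CSV: skip it if already archived, otherwise
# email its contents and move it into the archive.
Get-ChildItem 'C:\ftp\incoming\*.csv' | ForEach-Object {
    $archived = Join-Path 'C:\ftp\archive' $_.Name
    if (-not (Test-Path $archived)) {
        $body = Get-Content $_.FullName | Out-String
        Send-MailMessage -To 'ops@example.com' -From 'tracker@example.com' `
            -Subject "Tracking: $($_.Name)" -Body $body -SmtpServer 'mail.example.com'
        Move-Item $_.FullName $archived
    }
}
```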
|
# ¿ Feb 21, 2013 05:59 |
|
Titan Coeus posted:Try this: Well check that poo poo out. I had already tried that (realizing that Out-* was the only thing I could use), but it didn't work. But then I got the brainwave of setting the email's HTML flag to $false, and sure enough it came through. Makes me wonder how much the HTML aspect was screwing up my other tries at it, and whether it would have worked all along with HTML if I'd just thought to put '<PRE>' in front of it. Thanks Titan Coeus, you're worthy of stealing a pie from Estelle Getty.
|
# ¿ Feb 21, 2013 19:47 |
|
This should work but it doesn't, and the maddening thing is I've done this before. I'm running this dedupe finder: http://poshcode.org/4008 I want to pipe the output to a text file; I've done this in the past, and in fact after I make this post I'm going to click the 'find other posts by this user' button to see if I figured it out here. When I try this: code:
When I try this: code:
When I try this: code:
code:
|
# ¿ Aug 21, 2014 21:00 |
|
Hmm, thanks guys. The weird thing is, the date-time stamp on the script is from when I did this last, and I was going over 500,000 files, so I must have figured out a way to get that data without changing the actual script. After doing a bit of reading, it makes sense that this won't work, since Write-Host is meant for the screen only. I guess my memory was playing tricks on me, which is probably the worst thing to rely on: you spend more time trying to do something "because I remember it being easy last time" than you would just doing it the right way.
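The distinction in a nutshell, as a tiny invented example:

```powershell
# Write-Host paints the console directly and puts nothing on the
# pipeline, so redirection catches nothing:
function Show-Dupe { Write-Host 'C:\images\a.jpg' }
Show-Dupe | Out-File dupes.txt   # dupes.txt ends up empty

# Write-Output puts the string on the pipeline, so it redirects fine:
function Get-Dupe { Write-Output 'C:\images\a.jpg' }
Get-Dupe | Out-File dupes.txt    # dupes.txt contains the path
```

(Later PowerShell versions moved Write-Host onto a redirectable information stream, but it still never travels the success pipeline that Out-File reads.)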
|
# ¿ Aug 21, 2014 21:49 |
|
CLAM DOWN posted:Hey guys, I have to transfer a large number (>15,000,000) of small files totaling about 2TB from a 2003 to 2008 R2 server, maintaining folder structure and NTFS ACLs. Should I stick with robocopy, or would Copy-Item suit me fine here? Not PowerShell necessarily, but doesn't WinRAR preserve permissions? Sounds like it'd be easier to copy one big file than lots of little ones.
|
# ¿ Sep 18, 2014 21:46 |
|
Yeah, Task Scheduler is a big fuckup in all respects really: paths, quotation marks, permissions, all of it. It's amazing how they tried to duplicate the functionality of cron and made it even more obtuse to use.
|
# ¿ Sep 24, 2014 23:03 |
|
Karthe posted:I'm trying to write a quick Powershell batch script that calls a couple of other Powershell scripts that I've written. These other scripts output Japanese characters to the console, and while console output looks fine when I run the script files by themselves, when I run them via my batch script all of the Japanese characters are replaced by question marks. How can I preserve UTF-8 output when I'm calling my scripts? That's a weird one; this might help: http://stackoverflow.com/questions/22349139/utf-8-output-from-powershell According to one of the answers there, it's actually a bug that it doesn't work in UTF-16 by default.
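One of the workarounds from that Stack Overflow thread, roughly (the child script name is a placeholder):

```powershell
# Force the console and pipeline encodings to UTF-8 before calling the
# child scripts, so multibyte characters survive the hop.
[Console]::OutputEncoding = [System.Text.Encoding]::UTF8
$OutputEncoding = [System.Text.Encoding]::UTF8
.\child-script.ps1
```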
|
# ¿ Jan 5, 2015 21:42 |
|
myron cope posted:Is it possible to schedule a powershell task that runs without launching a window? I just want it to do its thing, no window needed What OS? I have a bunch of Scheduled Tasks on Windows Server 2003 (oh god, don't remind me about EOL) that just run in the background, with these settings:
Account: (some specific service account)
Run whether user is logged on or not
Run with highest privileges
Action: Start a program
PowerShell .\MyScript.ps1 c:\jobs\
|
# ¿ Jan 22, 2015 01:00 |
|
myron cope posted:It's a Windows 8.1 machine. It's not a huge deal, but I don't need to see the output so if it's possible to hide I would. I just tried the same setup on Windows 7 and I think Braintist is right; there was no popup and I didn't bother messing with a window style.
|
# ¿ Jan 22, 2015 05:04 |
|
Might as well post this here. I'm hoping to automate grabbing some Shopify product information via their API and dump/display it somewhere so it can be compared against a database. This is mostly half-assed together from other people's scripts; the first part (connecting/retrieving) works, and what errors out is actually using/displaying what comes down in any usable fashion: code:
The JSON is 'valid' in the sense that I normally take this output and manually parse it in MSSQL using JSON_VALUE/OPENJSON, but PowerShell doesn't seem to like it, saying: code:
Ideally I'd like the output to look like:
SKU : fulfillment_service
1 : yes
2 : no
etc. etc., with bonus points if anyone wants to make it even fancier with, say, CSV output or the like.
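A sketch of the Invoke-RestMethod route, which skips manual ConvertFrom-Json entirely since it hands back parsed PSObjects. The URL, auth header, and property names here are assumptions based on the Shopify products API:

```powershell
# Invoke-RestMethod parses the JSON body into objects for us.
$resp = Invoke-RestMethod -Uri 'https://example.myshopify.com/admin/api/products.json' `
    -Headers @{ 'X-Shopify-Access-Token' = 'TOKEN' }

# Flatten every product's variants and keep the two columns of interest.
$resp.products.variants |
    Select-Object sku, fulfillment_service |
    Export-Csv 'C:\temp\skus.csv' -NoTypeInformation
```

Dropping the Export-Csv gives the two-column console view; keeping it gives the fancier CSV output.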
|
# ¿ May 7, 2019 21:51 |
|
Hey! Thanks guys; The Fool especially, I quite liked seeing the iteration of outputs and the differences between them. I do believe the prevailing theory here is correct: I was trying to do things after the ConvertFrom-Json when I should have been manipulating it natively as the PSObject that comes back from Invoke-RestMethod. I've got "an" output now; I just need to tweak things so it is "the" output I want. I'm also doing some array-within-array stuff, so I foresee some foreach loops coming up in my future.
|
# ¿ May 8, 2019 18:32 |
|
Powershell is p cool. I haven't messed with it seriously in quite a while, but in a couple hours I just set up a script that:
- Runs as a scheduled task every 10 mins
- Checks when it last ran successfully (using Get-ScheduledTask)
- Builds an SQL query string using the date stamp acquired above, if greater than 10 minutes
- Connects to the logging database and gets info (using System.Data.SqlClient.SQLConnection)
- Parses results and builds a string out of the exceptions
- If the exception string <> "", logs an event to Event Viewer (using Write-EventLog)
- If the exception string <> "", emails the exceptions (using Net.Mail.SmtpClient and System.Net.Mail.MailMessage)
- If the exception string <> "", posts the exceptions to a dedicated Slack channel (using Invoke-RestMethod and hooks.slack.com)
It's real neat!
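The Slack step, for anyone curious, is essentially a one-liner against an incoming-webhook URL (the webhook path and $exceptionText are placeholders):

```powershell
# Post the accumulated exception text to a Slack incoming webhook.
$exceptionText = 'Example exception line'
$payload = @{ text = "Exceptions since last run:`n$exceptionText" } | ConvertTo-Json
Invoke-RestMethod -Uri 'https://hooks.slack.com/services/T000/B000/XXXX' `
    -Method Post -ContentType 'application/json' -Body $payload
```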
|
# ¿ Jun 24, 2019 23:03 |
|
So here's a thing. I'm building JSON fulfillment requests from a CSV file. It's not exact, but it's roughly this: code:
But what I've been told now is that the data isn't normalized very well. I might have cases where order_id is not unique, so like: code:
So for each order_id I need to gather:
- All SKUs belonging to that order_id, and
- Qty summed up when a SKU appears for that order_id over multiple rows (e.g. BLEH-X above should be one entry with Qty 2)
I would in turn use this information to make one JSON request per order that keeps adding to the tracking_number / items array for that order_id, instead of the multiple individual requests against the same order_id that the current method would make. I'm pretty sure I'd know how to do this in C#/VB, but I'm wondering if there's some wow-neato thing like | Combine-ByKey "order_id" in PowerShell that I'm not aware of. If it helps, I can ensure the CSV file is sorted by order_id, i.e. the next row is always either the same order or a new one. EDIT- And if you want to make cool pre/code tables like the above you can go here: https://github.com/ozh/ascii-tables EDIT EDIT- Hmm: Group-Object -> Select-Object Scaramouche fucked around with this message at 08:38 on Jun 26, 2019
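The Group-Object hunch pans out; a sketch using the column names from the post and a placeholder path:

```powershell
# One output row per order_id + SKU pair, with Qty summed across rows.
Import-Csv 'C:\temp\orders.csv' |
    Group-Object order_id, sku |
    ForEach-Object {
        [pscustomobject]@{
            order_id = $_.Group[0].order_id
            sku      = $_.Group[0].sku
            qty      = ($_.Group | ForEach-Object { [int]$_.qty } |
                        Measure-Object -Sum).Sum
        }
    }
```

Grouping on order_id alone instead gives one group per order, which maps naturally onto one JSON request whose items array is built from $_.Group.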
# ¿ Jun 26, 2019 08:28 |
|
Does AWS PowerShell support, you know, actually querying the table in some fashion? Or is it just a meta-organizer tool? I'm partway through hooking one up (I can't seem to provide a sort key that satisfies "the provided key element does not match the schema") and I saw this on a related StackOverflow: code:
|
# ¿ Jul 3, 2019 22:07 |
|
I sort of got it working using aws-cli, but as mentioned the documentation is pretty dire for the PowerShell-specific stuff. I got about 90% of the way there and will probably revisit it using native PS objects; the biggest stumbling block for me is that you must use the existing indexes as part of any query, and the docs never really specify where/how you should do that, so some random guess-and-test was involved. I'd probably have it working in PS, but one of the bright sparks who made the table (and didn't make any useful indexes) gave one of the columns a name that's a reserved AWS keyword, and normal PowerShell escaping doesn't work for getting around that. I think there might be something to this guy's approach, but I gave up fixing it halfway through, as the impression I get is that it's no longer current: https://www.powershellgallery.com/packages/domainAwsPowershellTools/1.0.2/Content/domainAwsPowershellTools.psm1
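For the reserved-keyword wall specifically, DynamoDB's standard workaround is an expression attribute name, which aliases the real column name inside the expression. A sketch via aws-cli from PowerShell (table, key, and attribute names are invented; "status" is one of the reserved words):

```powershell
# The #st alias stands in for the reserved word "status" in the expression.
# Quotes inside the JSON are backslash-escaped so they survive PowerShell's
# own parsing on the way to the native aws executable.
aws dynamodb query `
    --table-name MyTable `
    --key-condition-expression 'id = :id' `
    --filter-expression '#st = :s' `
    --expression-attribute-names '{\"#st\": \"status\"}' `
    --expression-attribute-values '{\":id\": {\"S\": \"123\"}, \":s\": {\"S\": \"active\"}}'
```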
|
# ¿ Jul 8, 2019 19:56 |
|
|
Had a fun thing that I nearly came into this thread to ask about. I've been using the type-three-letters-then-tab autocomplete in PowerShell ISE to run my various .ps1's. Turns out it will pick other extensions first if they have the same name... I was wondering why that .CSV file kept opening in Excel every time I tried to run the script!
|
# ¿ May 12, 2020 01:03 |