The scenario: on a remote server, a folder is shared so people can access the log files.
The log files are kept for around 30 days before they are aged out, and around 1000 log files are generated each day, so the share holds roughly 30,000 files at any time.
For problem analysis, I need to copy log files to my own machine, selected by their file timestamps.
My previous strategy was:
- Use the dir /OD command to get the list of files and save it into a file on my local PC
- Open that file, find the timestamps, and build the list of files I need to copy
- Use the copy command to copy the actual log files
It works, but it needs some manual work: in step 2 I use Notepad++ and a regular expression to filter by timestamp, which something like the sketch below could probably automate.
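For reference, this is roughly how I imagine that manual step (plus the copy) could be scripted. It is only an untested sketch: it assumes the listing was saved with dir /OD > files.txt, the default en-US dir layout (MM/dd/yyyy hh:mm AM/PM, then size, then name), placeholder $starttime/$endtime values, and the $remotedir variable from below; the regex and date parsing would need adjusting for other locales.

# Sketch only: parse a saved "dir /OD > files.txt" listing and copy the files
# whose timestamp falls inside the window.
$starttime = Get-Date '2021-01-10 00:00'   # placeholder window start
$endtime   = Get-Date '2021-01-12 00:00'   # placeholder window end
Get-Content .\files.txt | ForEach-Object {
    # Match "MM/dd/yyyy  hh:mm AM  1,234 name.log"; <DIR> entries and header lines won't match
    if ($_ -match '^(?<stamp>\d{2}/\d{2}/\d{4}\s+\d{1,2}:\d{2}\s+[AP]M)\s+[\d,]+\s+(?<name>\S.*)$') {
        $stamp = [datetime]::Parse(($Matches['stamp'] -replace '\s+', ' '))
        if ($stamp -gt $starttime -and $stamp -lt $endtime) {
            Copy-Item -Path (Join-Path $remotedir $Matches['name']) -Destination .
        }
    }
}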
I tried to use PowerShell instead:
Get-ChildItem -Path $remotedir |
    Where-Object { $_.LastWriteTime -gt $starttime -and $_.LastWriteTime -lt $endtime } |
    ForEach-Object { Copy-Item -Path $_.FullName -Destination . }
However, with this approach it ran for hours and hours and no file was copied. By comparison, the dir solution took around 7-8 minutes to generate the list of files, and the copy itself took some time, but not hours.
I guess most of the time is spent filtering the files. I'm not quite sure why Get-ChildItem's performance is so poor.
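One alternative I'm wondering about, in case it is relevant: as far as I understand, robocopy can apply a last-modified date window itself via /MAXAGE and /MINAGE (which accept YYYYMMDD dates as well as day counts), so the filtering would not need to go through the PowerShell object pipeline at all. A rough, untested sketch with placeholder dates (and *.log assumed as the file pattern):

# Untested sketch: let robocopy apply the date window.
# /MAXAGE:<yyyymmdd> excludes files older than that date  -> window start
# /MINAGE:<yyyymmdd> excludes files newer than that date  -> window end
$start = '20210110'   # placeholder: earliest LastWriteTime to include
$end   = '20210112'   # placeholder: latest LastWriteTime to include
robocopy $remotedir . *.log /MAXAGE:$start /MINAGE:$end

I have not measured whether this is actually faster; my assumption is just that it avoids building and filtering ~30,000 FileInfo objects over the network in PowerShell.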
Can you please advise if there's anything I can change?
Thanks
question from:
https://stackoverflow.com/questions/65834866/powershell-get-childitem-performance-to-deal-with-bulk-files