I've recently found myself running batch processes that scan inventory across long lists of machines. For more control, I usually generate a list of IPs or machine names and use it as an input file to my process.
It's obvious in hindsight, but I often don't think of it until I've already wasted time waiting on something too slow: things go faster when they're broken into groups and processed in parallel.
I need to spend some time writing a script to automate this, but for now, here's what I do:
- Create folders 0 - 9 beneath a process folder.
- Copy the script to each folder.
- Break my input file of items to process into ten roughly equal in.txt files and put one in each folder.
- Have the script write its results to an output file such as out.txt.
- Run the script in each folder, redirecting console output to stdout.txt.
- Tile the console windows on my second monitor and watch each one for errors or a Complete! message.
- Run a script like the one below to consolidate the logs.
rem clear out logs from the previous run (del's quiet switch is /q)
del /q temp\*.*
rem grab each numbered folder's stdout.txt, renamed after its folder
for /d %%p in (*) do copy "%%p\stdout.txt" "temp\%%p.log"
rem a wildcard source copied to a single destination file concatenates the logs
copy temp\*.log final\discovery.log
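If you'd rather not depend on cmd, the same log consolidation can be sketched in a few lines of Python. This is only a cross-platform alternative, not the script I actually use; the folder layout and the stdout.txt name are the same assumptions as in the batch version, and it adds a small header so you can tell which folder each chunk came from.

```python
from pathlib import Path

def consolidate_logs(base, dest):
    """Concatenate each subfolder's stdout.txt under base into one log
    at dest, tagging every chunk with the folder it came from."""
    with open(dest, "w") as out:
        # sorted() keeps the folders in 0-9 order
        for folder in sorted(p for p in Path(base).iterdir() if p.is_dir()):
            log = folder / "stdout.txt"
            if log.exists():
                out.write(f"===== {folder.name} =====\n")
                out.write(log.read_text())
```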
- Run a script like the one below to consolidate the output.
rem collect each folder's out.txt into final, renamed after its folder
for /d %%p in (*) do copy "%%p\out.txt" "final\%%p.csv"
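The manual setup steps above (create folders 0-9, copy the script, split the input file) are also easy to automate. Here's a rough Python sketch of what that could look like; the function name and arguments are hypothetical, and it round-robins the input lines so the groups come out within one line of equal size.

```python
import shutil
from pathlib import Path

NUM_GROUPS = 10  # folders 0-9, matching the manual layout above

def split_into_groups(input_list, script, base):
    """Create folders 0-9 under base, copy the worker script into each,
    and deal the lines of input_list into per-folder in.txt files."""
    lines = Path(input_list).read_text().splitlines()
    for g in range(NUM_GROUPS):
        folder = Path(base) / str(g)
        folder.mkdir(parents=True, exist_ok=True)
        shutil.copy(script, folder / Path(script).name)
        # slicing with a stride of NUM_GROUPS is a round-robin split
        (folder / "in.txt").write_text("\n".join(lines[g::NUM_GROUPS]) + "\n")
```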