Hello StackOverflow members!
I am trying to run the following command:
REM The line below reads the folder names that are to be processed
FOR /F "TOKENS=* DELIMS=" %%d in (%start_dir%\folder_list.txt) DO (
    ECHO Entering into: %%d Directory
    REM The line below lists the folder and all of its subfolders, then writes the output to a file.
    FOR /F "TOKENS=* DELIMS=" %%e in ('DIR /s "%work_dir%\%%d"') DO (
        ECHO %%e>>%start_dir%\tmp_folder\%%d.size
    )
)
The code above works.
Here is the problem: if I have a folder that is only a few GB in size, it is fine.
If I have a folder that is above 100 GB, the script takes about an hour to write out the DIR /S listing for that %%d folder.
When I run the same thing on an individual ~150 GB folder directly: Dir /s "150GB_Folder">>dir_output_file.txt it completes in about 6-10 seconds.
My question is: why does it take an hour to produce the DIR /S>>whatever.txt output from within the script, when it takes only seconds outside of it?
Thank you in advance!
It is a bug (or at least a long-standing weakness) in FOR /F:
when FOR /F executes a command (the single-quoted form), cmd captures and parses the command's entire output in memory before the first loop iteration, and with a very large number of lines this causes huge delays.
The solution is to redirect the command's output to a file first, and then have FOR /F read that file.
REM The line below reads the folder names that are to be processed
FOR /F "TOKENS=* DELIMS=" %%d in (%start_dir%\folder_list.txt) DO (
    ECHO Entering into: %%d Directory
    REM Write the DIR /s listing of the folder and all of its subfolders to a temporary file first
    DIR /s "%work_dir%\%%d" >%temp%\temp.tmp
    REM Then let FOR /F read the file instead of parsing the command's output directly
    FOR /F "TOKENS=* DELIMS=" %%e in (%temp%\temp.tmp) DO (
        ECHO %%e>>%start_dir%\tmp_folder\%%d.size
    )
    del %temp%\temp.tmp
)
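One caveat with the code above: if %temp%, %start_dir%, or %work_dir% can contain spaces (a username like "John Smith" puts a space in %temp%), the unquoted paths in the IN (...) clauses will break. Below is a sketch of the same temp-file technique, not the exact code from the answer, that quotes the paths and adds usebackq so FOR /F still treats the quoted strings as file names; the temp-file name dir_listing.tmp is arbitrary:
REM Same temp-file technique, with quoted paths (usebackq makes a double-quoted IN clause a file name)
FOR /F "usebackq TOKENS=* DELIMS=" %%d in ("%start_dir%\folder_list.txt") DO (
    ECHO Entering into: %%d Directory
    DIR /s "%work_dir%\%%d" >"%temp%\dir_listing.tmp"
    FOR /F "usebackq TOKENS=* DELIMS=" %%e in ("%temp%\dir_listing.tmp") DO (
        REM Redirection is placed before ECHO so a line ending in a space and a
        REM digit is not misread as a handle redirect such as 2>>
        >>"%start_dir%\tmp_folder\%%d.size" ECHO %%e
    )
    del "%temp%\dir_listing.tmp"
)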