I am using my department's computing cluster with Sun Grid Engine.
When I have to run multiple R jobs, I usually write shell scripts named s01.sh, s02.sh, ..., s50.sh, whose contents are 'R CMD BATCH r01.r', 'R CMD BATCH r02.r', ..., 'R CMD BATCH r50.r' respectively.
Then I open PuTTY, log in, and type 'qsub s01.sh', 'qsub s02.sh', and so on.
When there are hundreds of jobs, typing each qsub command by hand is real labor. Is there a way to run these multiple 'qsub' commands at once?
Assuming the scripts to be run are in the current folder:
for file in s*.sh; do qsub "$file"; done
I think you just need to run the qsub commands sequentially, since qsub itself should be fairly quick. (The submitted commands will probably run in parallel.)
You just need a loop.
Assuming you've already created the r*.r files, this is easy to do with a small shell script:
#!/bin/bash
for file in r*.r ; do
    # Derive the submission script name: r01.r -> s01.sh
    script=$(echo "$file" | sed 's/^r/s/;s/\.r$/.sh/')
    (
        echo "#!/bin/sh"
        echo "R CMD BATCH $file"
    ) > "$script"
    chmod +x "$script"
    qsub "$script"
done
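For what it's worth, the sed step can also be replaced with plain shell parameter expansion, which avoids one subprocess per file. Here is a self-contained sketch of the same idea: it creates a couple of dummy r*.r files purely for demonstration, and the qsub line is commented out so the loop can be tried outside the cluster.

```shell
#!/bin/bash
# Demo in a scratch directory: create two dummy R scripts,
# then generate a matching submission script for each.
cd "$(mktemp -d)"
printf 'print("hello")\n' > r01.r
printf 'print("world")\n' > r02.r

for file in r*.r ; do
    base=${file%.r}          # strip the .r suffix:  r01.r -> r01
    script="s${base#r}.sh"   # swap leading r for s: r01   -> s01.sh
    printf '#!/bin/sh\nR CMD BATCH %s\n' "$file" > "$script"
    chmod +x "$script"
    # qsub "$script"         # uncomment on the cluster to submit
done
```

After the loop, each generated s*.sh contains a shebang line followed by the corresponding 'R CMD BATCH r*.r' command, matching what the sed version produces.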