So, I want to run a Python script on a remote machine. However, the server's manual says this:
You need to create this script and save it.
#!/bin/bash
#$ -V ## pass all environment variables to the job, VERY IMPORTANT
#$ -N run_something ## job name
#$ -S /bin/bash ## shell where it will run this job
#$ -j y ## join error output to normal output
#$ -cwd ## Execute the job from the current working directory
#$ -q lowmemory.q ## queue name
uptime > myUptime.${JOB_ID}.txt
echo $HOSTNAME >> myUptime.${JOB_ID}.txt
So if this script were called blast_AE004437.sh, we could run the following to make all of those steps happen.
qsub blast_AE004437.sh
So, I'm assuming that I need to create a .sh file containing all of these commands and then add my original script to it. Is that it? Because I'm doing that and nothing happens. After all these commands I also add "python2.7" to load Python. What am I doing wrong? By the way, will the output come out in the same file, or do I need to download a different file?
The file you listed is the beginning of an SGE run script, which is called by the job scheduler. qsub submits it to the scheduling system; as soon as there is a free slot on a cluster machine, the run script is executed there.
I suggest calling your own script from this file:
...
echo $HOSTNAME >> myUptime.${JOB_ID}.txt
cd /directory/of/your/script # change directory
python2.7 your_script.py arg1 arg2 ... >> output.${JOB_ID}.log # call your script
Often you also need to set the $PATH and $PYTHONPATH variables manually before you can call python.
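For example, a minimal sketch of doing that inside the run script, just before the python call. Both paths here are assumptions (a hypothetical install prefix /opt/python2.7 and a personal module directory $HOME/mylibs) -- substitute wherever Python and your modules actually live on the cluster:

```shell
# Assumed locations -- adjust to your cluster's real paths.
export PATH="/opt/python2.7/bin:$PATH"        # so the right python2.7 binary is found
export PYTHONPATH="$HOME/mylibs:$PYTHONPATH"  # so your own modules can be imported
echo "PYTHONPATH now starts with: ${PYTHONPATH%%:*}"  # quick sanity check
```

Note that the `-V` directive in the header already passes along the environment variables from the session you submit from, so exporting these in your login shell before running qsub can work as well.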