I want to run the following bash script, which is stored in an Elisp string rather than a .sh file, and then store the shell output in a variable.
#!/bin/bash
IFS=: read -ra _dirs_in_path <<< "$PATH"
for _dir in "${_dirs_in_path[@]}"; do
  for _file in "${_dir}"/*; do
    [[ -x ${_file} && -f ${_file} ]] && printf '%s\n' "${_file##*/}"
  done
done
I couldn't run shell-command
on multi-line bash scripts. Emacs and Long Shell Commands didn't help me either, as compile
and comint-run
also expect single commands, not bash syntax.
How do I run a complex bash script from Elisp?
Multiline commands are fine to provide as an argument to bash -c
if you quote them as you would any other shell argument that might contain shell metacharacters, e.g.:
(setq my-command
      (concat "IFS=: read -ra dirs <<< \"$PATH\"\n"
              "for dir in \"${dirs[@]}\"; do\n"
              "  echo got dir \"$dir\"\n"
              "done\n"))
(shell-command (format "bash -c %s" (shell-quote-argument my-command)))
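To store the output in an Elisp variable, as the question asks, you can use shell-command-to-string instead of shell-command; a minimal sketch, reusing the my-command variable from above (my-output is just an illustrative name):

```elisp
;; shell-command-to-string runs the command synchronously and
;; returns everything it printed to stdout as a string.
(setq my-output
      (shell-command-to-string
       (format "bash -c %s" (shell-quote-argument my-command))))
```

my-output then holds one line per directory in $PATH, which you can split further with split-string if you need a list.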