I may be falling into an X-Y problem with this question, and I encourage you to correct me if I am wrong.
I would like to configure a toolchain environment that works across different platforms and compiler versions. I initially wrote a long Perl script that generates a configuration Makefile containing only variables. I wanted it to be simple, so I did not write anything complex using automake or autoconf. Moreover, I wanted the reconfiguration process to be very fast. In my case, my hand-written ./configure
does everything in less than a second, and I am very happy with that.
However, I feel there may be a better approach using environment variables. Instead of writing a Makefile with the specific variables, I can set them in the current shell environment directly. For example:
export CC=gcc
Unfortunately, some paths are already present in variables such as $PATH. The workaround is to prepend the new path to the existing value:
export PATH=/new/toolchain/path:$PATH
echo $PATH
/new/toolchain/path:/old/toolchain/path:/usr/bin:/bin...
I feel this is ugly; I would like to remove the old path before adding the new one.
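Something like the following is what I have in mind (just a rough sketch with placeholder paths): filter the old toolchain directory out of $PATH before prepending the new one:
old=/old/toolchain/path    # placeholder for the toolchain entry to remove
new=/new/toolchain/path    # placeholder for the toolchain entry to add
PATH=$(echo "$PATH" | tr ':' '\n' | grep -v -x "$old" | paste -sd ':' -)
export PATH=$new:$PATH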
To conclude: is setting the toolchain variables in the shell environment like this a better approach than generating a configuration Makefile, and is there a clean way to manage $PATH when doing so?
When I have several variables to set, I write a wrapper script which I then use as a prefix to the command that I want to modify. That lets me use the prefix either with a single command, such as make, or with other commands.
I use wrappers for clang (to set the CC variable, making configure scripts "see" it as the chosen compiler), for switching locales (C versus en_US versus en_US.UTF-8, etc.), and for cron.
Each of the wrappers does what is needed to identify the proper PATH, LD_LIBRARY_PATH, and similar variables.
For example, I wrote this ad hoc script about ten years ago to test with a local build of python:
#!/bin/bash
# Wrapper: run the given command against a local build of python 2.4.2.
ver=2.4.2
export TOP=/usr/local/python-$ver
export PATH=$TOP/bin:$PATH
# newpath is a local helper script (not shown here).
export LD_LIBRARY_PATH=`newpath -n LD_LIBRARY_PATH -bd $TOP/lib $TOP/lib/gcc/i686-pc-linux-gnu/$ver`
if test -d "$TOP"
then
    exec "$@"
else
    echo "no $TOP"
    exit 1
fi
and used it as with-python-2.4.2
myscript.
Some wrappers simply call another script. For example, I use this wrapper around the configure script to set up variables for cross-compiling:
#!/bin/sh
# $Id: cfg-mingw,v 1.7 2014/09/20 20:49:31 tom Exp $
# configure to cross-compile using mingw32
BUILD_CC=${CC:-gcc}
unset CC
unset CXX
TARGET=`choose-mingw32`
if test -n "$TARGET"
then
PREFIX=
test -d /usr/$TARGET && PREFIX="--prefix=/usr/$TARGET"
cfg-normal \
--with-build-cc=$BUILD_CC \
--host=$TARGET \
--target=$TARGET \
$PREFIX "$@"
else
echo "? cannot find MinGW compiler in path"
exit 1
fi
where choose-mingw32
and cfg-normal
are scripts that (a) find the available target name for the cross-compiler and (b) provide additional options to the configure script.
Others may suggest shell aliases or functions. I do not use those for this purpose because my command-line shell is usually tcsh, while I run these commands from (a) other shell scripts, (b) a directory editor, or (c) a text editor. Those use the POSIX shell (except, of course, for scripts requiring specific features), making aliases or functions of little use.
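For comparison, a POSIX-shell function doing the same job as the python wrapper above might look like this (a hypothetical sketch); it works, but it exists only in the shell that defines it, so scripts, directory editors, and text editors started elsewhere never see it:
with_python() {
    # Run the command in a subshell so the caller's PATH is untouched.
    (
        PATH=/usr/local/python-2.4.2/bin:$PATH
        export PATH
        exec "$@"
    )
}
For example: with_python python -V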
You can create an individualized environment for a particular command invocation:
VAR1=val1 VAR2=val2 VAR3=val3 make
I find this cleaner than doing:
export VAR1=val1
export VAR2=val2
export VAR3=val3
make
unless you're in a wrapper script, and maybe even then, because with
VAR1=val1 VAR2=val2 VAR3=val3 make
the VAR variables will be whatever they were before the make invocation (including, but not limited to, unexported and nonexistent).
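You can see the difference directly (a small demonstration, assuming VAR1 is not already set in your shell):
VAR1=val1 sh -c 'echo "inside: $VAR1"'   # prints: inside: val1
echo "after: ${VAR1-unset}"              # prints: after: unset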
Long lines are a non-issue; you can always split them across several lines:
VAR1=val1 \
VAR2=val2 \
VAR3=val3 \
make
You can set up environment variables like this for any Unix command; the shell handles it all.
Some applications (such as make or rake) will modify their environment based on arguments that look like variable definitions (see prodev_paris's answer), but that depends on the application.
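For instance, with make the two spellings differ when the Makefile assigns the same variable itself (the compiler choice here is only an illustration):
CC=clang make      # via the environment: a plain CC = ... line in the Makefile still wins
make CC=clang      # as a make argument: takes precedence over CC assignments in the Makefile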