Is there some sort of character limit imposed in bash (or other shells) on how long an input can be? If so, what is that limit?
That is, is it possible to write a command in bash that is too long for the command line to execute? If there is no hard limit, is there a suggested one?
There is also a limit imposed by the terminal driver in canonical (line-editing) mode: if the receiving application reads lines of input from its stdin, its stdin is a terminal device, and the application neither implements its own line editor (as bash does) nor changes the input mode, then you won't be able to enter lines longer than 4096 bytes (including the terminating newline character).
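On Linux, that 4096-byte figure comes from the terminal driver's canonical (line-buffered) mode buffer; the size may differ on other systems. A quick sketch, assuming a Linux system, to inspect and toggle that mode with stty:

$ stty -a | grep icanon # shows "icanon" (on) or "-icanon" (off) among the flags
$ stty -icanon          # turn canonical mode off; the line limit no longer applies
$ stty icanon           # turn it back on when done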
The limit on the length of a command line is not imposed by the shell, but by the operating system. This limit is usually in the range of a few hundred kilobytes. POSIX denotes this limit ARG_MAX,
and on POSIX-conformant systems you can query it with
$ getconf ARG_MAX # Get argument limit in bytes
E.g. on Cygwin this is 32000, and on the different BSDs and Linux systems I use it is anywhere from 131072 to 2621440.
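Since the environment is charged against the same limit (more on that below), a rough sketch of how much room is actually left for arguments (this ignores per-pointer overhead, so treat it as an estimate):

$ echo $(( $(getconf ARG_MAX) - $(env | wc -c) )) # approximate bytes left for arguments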
If you need to process a list of files exceeding this limit, you might want to look at the xargs utility, which calls a program repeatedly with a subset of arguments not exceeding ARG_MAX.
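For example, to delete more *.log files than fit on one command line, one might let a shell builtin produce the list (builtins are exempt, as noted below) and have xargs batch the calls to rm, using NUL delimiters to survive unusual filenames; a sketch:

$ printf '%s\0' ./*.log | xargs -0 rm -- # printf is a builtin, so the glob expansion itself is safe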
To answer your specific question: yes, it is possible to attempt to run a command with too long an argument list. The shell will refuse with a message along the lines of "Argument list too long".
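It is easy to provoke on purpose; on a typical Linux box, seq can generate well over ARG_MAX bytes of arguments:

$ /bin/true $(seq 1 1000000) # roughly 6.9 MB of arguments
bash: /bin/true: Argument list too long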
Note that the input to a program (as read on stdin or any other file descriptor) is not limited in this way (only by available program resources). So if your shell script reads a string into a variable, you are not restricted by ARG_MAX. The restriction also does not apply to shell builtins.
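As a sketch of that point: a shell variable filled from a pipe can hold far more than ARG_MAX bytes, because no exec is involved:

$ s=$(head -c 10000000 /dev/zero | tr '\0' x) # a 10 MB string, several times a typical ARG_MAX
$ echo ${#s}                                  # echo is a builtin, so this works too
10000000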
OK, denizens. So I have accepted the command-line length limits as gospel for quite some time. So, what to do with one's assumptions? Naturally: check them.
I have a Fedora 22 machine at my disposal (meaning: Linux with bash 4). I created a directory containing 500,000 files, each with an 18-character filename, giving a command-line length of 9,500,000 characters when all names are expanded. Created thus:
seq 1 500000 | while read digit; do touch $(printf "abigfilename%06d\n" $digit); done
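(If you want to reproduce this, batching the touch calls through xargs, the very tool recommended in the other answer, is much faster; a sketch using GNU seq's -f format option:)

$ seq -f 'abigfilename%06.0f' 1 500000 | xargs touch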
And we note:
$ getconf ARG_MAX
2097152
Note, however, that I can do this:
$ echo * > /dev/null
But this fails:
$ /bin/echo * > /dev/null
bash: /bin/echo: Argument list too long
I can run a for loop:
$ for f in *; do :; done
which again runs entirely within the shell, using only builtins.
Careful reading of the documentation for ARG_MAX states: "Maximum length of argument to the exec functions." This means: without calling exec, there is no ARG_MAX limitation. That would explain why shell builtins are not restricted by ARG_MAX.
And indeed, I can ls my directory if my argument list is 109,948 files long, or about 2,089,000 characters (give or take). Once I add one more 18-character filename, though, I get an "Argument list too long" error. So ARG_MAX is working as advertised: the exec fails with more than ARG_MAX characters on the argument list, including, it should be noted, the environment data.
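GNU xargs can report this arithmetic directly, including how many bytes your current environment eats out of the budget; a handy cross-check:

$ xargs --show-limits < /dev/null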