Is there an HTTP client like wget/lynx/GET that is distributed by default in POSIX or *nix operating systems, and that could be used for maximum portability?
I know most systems have wget or lynx installed, but I seem to remember installing some Ubuntu server systems using default settings, and they had neither wget nor lynx in the base package.
I am writing a shell script for Linux (and probably Mac) to install a piece of software onto the computer. To avoid having to distribute a couple of large files, I would like to fetch them from the internet instead of packaging them with the installer. Currently, the install script is distributed as a single file created with Makeself.
I'd like to avoid having the install script be over 100 MB, which it would be if the files were included; they also may not be required if the person is upgrading or re-installing the software. Then again, maybe the most portable thing to do is to include the files in the package.
Right now I am thinking of having the script check for wget, lynx, and GET, in that order, and use whichever one it finds for downloading, but I could avoid this altogether if there were one way to download the files that works on all systems.
EDIT:
Does anyone know much about lwp-request (GET) and its availability? It seems to be readily available on several of the systems I have checked so far, and I remember it always being around 10+ years ago, going back to Red Hat.
Neither curl nor wget is "guaranteed" to be installed anywhere, especially on proper UNIX systems. They are not POSIX standards. Neither is ftp, ssh / scp / sftp, rsync, telnet, nc / netcat, openssl, or probably any related tool that comes to mind.
Edit (2019-11-04): I'm rewriting my answer to reflect the importance of ensuring that a transfer isn't tampered with in flight. I'll leave my original answer below the rule.
I suggest using rsync over ssh to transfer your files. rsync's interface may look overwhelming, but most users can get by with rsync -avzP, and if you need more flexibility, rsync can adapt. Using ssh provides integrity, authenticity, and privacy for your connection.
curl is the de facto standard for HTTP transfers; if plain http or https is preferred, curl or tools based on curl are probably a good choice.
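If plain curl over HTTPS is chosen, the in-flight integrity concern can still be addressed by pinning a checksum of the file inside the installer. A minimal sketch, assuming the expected SHA-256 hash is shipped in the script (the helper name, URL, and hash are placeholders):

```shell
#!/bin/sh
# Verify a downloaded file against a pinned SHA-256 hash before using it.
# verify_sha256() is a hypothetical helper; sha256sum is GNU coreutils,
# and shasum is the usual fallback on macOS.
verify_sha256() {
    file=$1 expected=$2
    if command -v sha256sum >/dev/null 2>&1; then
        actual=$(sha256sum "$file" | awk '{print $1}')
    else
        actual=$(shasum -a 256 "$file" | awk '{print $1}')
    fi
    [ "$actual" = "$expected" ]
}

# usage (placeholder URL and hash):
#   curl -fsSL -o data.tar.gz "https://example.com/data.tar.gz"
#   verify_sha256 data.tar.gz "<expected-sha256>" || exit 1
```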
In my experience, tools are available in about this order:
- wget
- curl
- sftp
- ftp
- GET (I use HEAD all the time and often forget it is just one tool in the suite)
- tftp
- nc (not as common as I wish)
- socat (even less common)
The bash /dev/tcp facility is available on most systems I've used (some use dash or pdksh instead), but using echo with bash, nc, or socat is going the long way around for HTTP access -- you'll have to handle the headers somehow, which reduces its elegance.
Official list of POSIX 7 utilities
http://pubs.opengroup.org/onlinepubs/9699919799/utilities/contents.html
Utilities such as wget and curl are not present in the list.
The same goes for the LSB, which essentially only guarantees the POSIX utilities.
But I do think that the POSIX C API is enough to implement most of netcat's functionality, so it is really a missed opportunity. E.g.: How to make an HTTP GET request in C without libcurl?
Likely this is because network protocols like HTTP were deemed too specific, or didn't yet exist while POSIX was still evolving, and POSIX has basically been frozen ever since. Notably, HTTPS encryption is not trivial to implement.