There is a directory being served over the net that I'm interested in monitoring. Its contents are various versions of software that I'm using, and I'd like to write a script that I could run to check what's there and download anything that is newer than what I've already got.
Is there a way, say with wget or something, to get a directory listing? I've tried using wget on the directory, which gives me HTML. To avoid having to parse the HTML document, is there a way of retrieving a simple listing like ls would give?
I just figured out a way to do it:
$ wget --spider -r --no-parent http://some.served.dir.ca/
It's quite verbose, so you need to pipe through grep a couple of times depending on what you're after, but the information is all there. It looks like it prints to stderr, so append 2>&1 to let grep at it. I grepped for "\.tar\.gz" to find all of the tarballs the site had to offer.
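For example, a single pipeline along those lines might look like this (just a sketch: the URL is the same placeholder as above, and the exact log format varies between wget versions, so the grep pattern may need adjusting):
$ wget --spider -r --no-parent http://some.served.dir.ca/ 2>&1 \
    | grep -o 'http://[^ ]*\.tar\.gz' \
    | sort -u
The 2>&1 redirects wget's stderr logging into the pipe, grep -o keeps only the matching URLs, and sort -u drops the duplicates that come from each URL being logged more than once.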
Note that wget writes temporary files in the working directory, and doesn't clean up its temporary directories. If this is a problem, you can change to a temporary directory:
$ (cd /tmp && wget --spider -r --no-parent http://some.served.dir.ca/)
What you are asking for is best served using FTP, not HTTP.
HTTP has no concept of directory listings, FTP does.
Most HTTP servers do not allow access to directory listings, and those that do are offering it as a feature of the server, not of the HTTP protocol. Those servers generate and send an HTML page meant for human consumption, not machine consumption. You have no control over that, and would have no choice but to parse the HTML.
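If you are stuck with such a server, a crude scrape of the link targets is often enough for a plain autoindex page (a sketch, assuming the same placeholder URL as above and an Apache-style listing):
$ wget -qO- http://some.served.dir.ca/ \
    | grep -o 'href="[^"]*"' \
    | sed 's/^href="//; s/"$//'
This prints each href attribute's value on its own line; -qO- tells wget to stay quiet and write the page to stdout instead of a file.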
FTP is designed for machine consumption, more so with the introduction of the MLST and MLSD commands that replace the ambiguous LIST command.
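As a quick sketch (ftp.example.com is a placeholder host), curl can pull a plain name-only listing from an FTP directory via the NLST command:
$ curl --list-only ftp://ftp.example.com/pub/software/
Clients and libraries that support MLSD can additionally retrieve structured facts such as file size and modification time, which is exactly what a "download anything newer" script needs.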