I'm wondering if there is any tool to "download" the javadocs from an online host to a local file.
The online docs I am using tend to reject clients such as eclipse, making work difficult, so I need to pull them onto my machine and attach them to my library jar.
First, check whether the site already offers a download in zip form or similar.
Then, make sure you are actually allowed to do this (this may depend on where you live and on any terms of use stated on the site you want to pull from).
Then have a look at the Wget tool. It is part of the GNU project and thus included in most Linux distributions; builds are also available for Windows and macOS.
Something like this works for me:
wget --no-parent --recursive --level=inf --page-requisites --wait=1 \
    https://epaul.github.io/jsch-documentation/simple.javadoc/
(The trailing backslash escapes the line break, so both lines form a single command.)
Look up what each option does in the manual before trying this.
If you want to do this repeatedly, look into the --mirror option.
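A mirror invocation might look like the sketch below (same example URL as above; --mirror turns on recursion and timestamping, equivalent to -r -N -l inf --no-remove-listing, so unchanged files are skipped on later runs):

```shell
# repeatable sync: timestamping (-N, implied by --mirror) re-downloads
# only files that changed on the server since the last run
wget --mirror --no-parent --page-requisites --wait=1 \
    https://epaul.github.io/jsch-documentation/simple.javadoc/
```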
For downloading other websites, --convert-links might also be useful, but I found it is not needed for Javadocs, which usually already use correct absolute and relative links.
This downloads many copies of the same index.html file with ?... suffixes appended (one for each FRAMES link on a page). You can remove these files after downloading by adding the --reject 'index.html\?*' option, but they will still be downloaded first (and scanned for recursive links). I have not yet found out how to avoid downloading them at all. (See this related question on Serverfault.)
Maybe adding the right recursion level would help here (I didn't try).
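Putting it together, the full command with the reject pattern could look like this (the backslash before the ? keeps wget from treating it as a wildcard in the pattern):

```shell
# same download as above, but delete the index.html?... duplicates
# after fetching; note they are still fetched and parsed first
wget --no-parent --recursive --level=inf --page-requisites --wait=1 \
    --reject 'index.html\?*' \
    https://epaul.github.io/jsch-documentation/simple.javadoc/
```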
After downloading, you may want to zip the resulting directory to save disk space. Use the zip tool of your choice for this.
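For example (assuming wget created a directory named after the host, which is its default layout; the archive name is arbitrary):

```shell
# -r recurses into the directory tree created by wget
zip -r jsch-javadoc.zip epaul.github.io/
```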