I need to download a project from SourceForge, but there is no easy, visible way to do it. In the picture linked below (I don't have enough reputation to embed it), it is possible to download "the latest version", but that includes only the files from the first folder, and I need to download the other folders as well.
It is possible to download these files manually, but because there are hundreds of files and subfolders, doing so would be quite impractical.
Does anyone know a way to download it? I didn't find much; a few places mentioned wget, but I tried it without any success.
Link: http://s9.postimg.org/xk2upvbwv/example.jpg
Sometimes there is a download link on the Summary tab, but when there isn't, I don't know of a workaround, so I use this piece of code:
// Collect every file link on the page (elements with class "name")
var urls = document.getElementsByClassName('name')
var txt = ""
for (var i = 0; i < urls.length; i++) {
    // Build one wget command per file URL
    txt += "wget " + urls[i].href + "\n"
}
alert(txt)
Open the console in your browser on the page where all the files are listed, then copy, paste, and run the code. You will be shown a list of wget commands, which you can copy, paste, and run in your terminal.
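If the list is long, the alert dialog is awkward to copy from. As a minimal sketch (the file names download.sh and urls.txt are my own, arbitrary choices), you can save the output and run it from a terminal like this:
# Run the pasted wget commands from a file (hypothetical name download.sh)
bash download.sh
# Or, if you saved only the URLs one per line (hypothetical name urls.txt),
# wget can read them directly:
wget -i urls.txt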
Every SourceForge project page and project folder page has an RSS link, as you can see in the example screenshot.
Right-click the RSS icon on the page of the folder or project you want to download, copy the link, and use the following Bash script:
curl "https://sourceforge.net/projects/xdxf/rss?path=/dicts-babylon/001" | grep "<link>.*</link>" | sed 's|<link>||;s|</link>||' | while read url; do url=$(echo "$url" | sed 's|/download$||'); wget "$url"; done
replace "https://sourceforge.net/projects/xdxf/rss?path=/dicts-babylon/001" with your RSS link, and watch the magic happens, The RSS link will include all the files of the Sourceforge folder or project and it's sub-folders, so the script will download everything recursively.
Good luck