 

Download URL links using R

Tags:

r

I am new to R and would like to seek some advice.

I am trying to download multiple URL links (PDF format, not HTML) and save each one as a PDF file using R.

The links I have are character strings (taken from the HTML code of the website).

I tried using the download.file() function, but it takes one specific URL (written in the R script) and therefore downloads only one link to one file. Since I have many URL links, I would like some help with this.

Thank you.

poppp asked Aug 24 '15 04:08


1 Answer

I believe what you are trying to do is download a list of URLs. You could try an approach like this:

  1. Store all the links in a vector using c(), e.g.:
urls <- c("http://link1", "http://link2", "http://link3")
  2. Iterate through the vector and download each file:
for (url in urls) {
    download.file(url, destfile = basename(url))
}
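A slightly fuller sketch of the loop above, assuming the URLs end in the actual file names: it downloads in binary mode (important for PDFs, especially on Windows) and skips links that fail instead of stopping the whole loop. The URLs here are placeholders, not real links.

```r
# Placeholder links standing in for the real PDF URLs.
urls <- c("http://example.com/a.pdf", "http://example.com/b.pdf")

for (url in urls) {
  dest <- basename(url)  # e.g. "a.pdf" for "http://example.com/a.pdf"
  tryCatch(
    # mode = "wb" writes the file in binary mode, which PDFs need
    download.file(url, destfile = dest, mode = "wb"),
    error = function(e) {
      message("Failed to download ", url, ": ", conditionMessage(e))
    }
  )
}
```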

If you're using Linux/Mac and https URLs, you may need to specify the method and extra attributes for download.file (curl's -k flag skips SSL certificate verification):

download.file(url, destfile = basename(url), method="curl", extra="-k")
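If the links scraped from the HTML don't end in usable file names, one alternative (a sketch of mine, not part of the original answer) is to generate numbered output names instead of using basename(). The URLs below are placeholders; try() keeps a dead link from aborting the loop.

```r
urls <- c("http://link1", "http://link2", "http://link3")  # placeholder links

for (i in seq_along(urls)) {
  # Build names like "file001.pdf", "file002.pdf", ...
  dest <- sprintf("file%03d.pdf", i)
  try(download.file(urls[i], destfile = dest, mode = "wb"), silent = TRUE)
}
```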

If you want, you can test my proof of concept here: https://gist.github.com/erickthered/7664ec514b0e820a64c8

Hope it helps!

erickthered answered Sep 20 '22 18:09