 

How can I make R take advantage of a high memory, high cpu server?


So I've finally figured out how to get my R scripts to run on the Amazon EC2 cloud. I've been using an AMI with 26 ECUs, 8 cores, and 69 GB of RAM.

I then divide my code into multiple scripts and run each one in a separate R instance. With a server of this size, I can easily run 20-40 scripts simultaneously, each running several thousand simulations.

What I would like to know is whether R is taking advantage of all this computing power natively. Should I install packages that specifically tell R to use all this extra memory and the multiple CPUs? I've seen this page, and some packages (at least from the description) seem promising, but I am unable to figure out how to incorporate them into my code. Could anyone shed more light on this?

asked Jun 22 '10 by Maiasaura


1 Answer

You could look at the examples in my Intro to High-Performance Computing with R tutorials, a few versions of which are on this page.

The quickest way to use the multiple cores is the (excellent) multicore package; you should not have to do anything special to take advantage of the oodles of RAM you have there. multicore ties into foreach via doMC, but you can of course simply use the mclapply() function directly.
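A minimal sketch of both routes, assuming an 8-core box; `run_sim` is a hypothetical stand-in for one of your simulations, and the package/function names are as they existed at the time (in modern R, `parallel::mclapply()` supersedes the multicore package):

```r
library(multicore)  # on current R: library(parallel)

# Hypothetical single-simulation function standing in for your real code
run_sim <- function(i) {
  set.seed(i)
  mean(rnorm(1e4))  # placeholder workload
}

# Route 1: fan 1000 simulations out across all 8 cores directly
results <- mclapply(1:1000, run_sim, mc.cores = 8)

# Route 2: the same work via foreach + doMC
library(doMC)
registerDoMC(cores = 8)
results2 <- foreach(i = 1:1000) %dopar% run_sim(i)
```

Both return a list of 1000 results; `mc.cores`/`registerDoMC(cores = ...)` cap the number of worker processes, so one R session can use the whole machine instead of you splitting work into separate scripts by hand.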

answered Sep 16 '22 by Dirk Eddelbuettel