 

Predict memory usage in R

Tags:

memory

r

I have downloaded a huge file (~300 MB) from the UCI Machine Learning Repository.

Is there a way to predict the memory required to load the dataset, before loading it into R memory?

I have Googled a lot, but all I could find is how to measure memory usage with the R profiler and several other packages after the objects are already loaded into R.

Novneet Nov asked Sep 04 '14 20:09


1 Answer

based on "R programming" coursera course, U can calculate the proximate memory usage using number of rows and columns within the data" U can get that info from the codebox/meta file"

memory required = no. of columns * no. of rows * 8 bytes per numeric value

So, for example, if you have 1,500,000 rows and 120 columns, you will need more than 1.34 GB of spare memory.
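The arithmetic above can be checked directly in R (plain arithmetic, no packages needed; R stores numerics as 8-byte doubles):

```r
# Estimate memory for a numeric data frame:
# rows * columns * 8 bytes per double
rows  <- 1500000
cols  <- 120
bytes <- rows * cols * 8   # 1,440,000,000 bytes
gib   <- bytes / 2^30      # convert to gibibytes
round(gib, 2)              # ~1.34
```

Note the estimate covers only the raw numeric storage; R typically needs extra headroom while parsing the file, so treat it as a lower bound.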

You can also apply the same approach to other data types, paying attention to the number of bytes used to store each type (for example, R integers take 4 bytes).
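Another rough approach (not from the original answer, just a common trick when you don't have a codebook): read a small sample of the file, measure it with `object.size()`, and scale up by the total number of rows. A minimal sketch, assuming a comma-separated file; `path` and `total_rows` are placeholders for your own data:

```r
# Sketch: estimate full-file memory from a small sample of rows.
estimate_memory <- function(path, total_rows, sample_rows = 1000) {
  sample_df <- read.csv(path, nrows = sample_rows)
  # Bytes per row in the sample, extrapolated to the full file.
  per_row <- as.numeric(object.size(sample_df)) / nrow(sample_df)
  per_row * total_rows   # rough estimate in bytes
}
```

This automatically accounts for mixed column types (character, integer, numeric), which the fixed 8-bytes-per-cell formula does not.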

mgamal answered Oct 23 '22 17:10