With open-source R, unless I use a specialized package, it is not possible to handle data sets bigger than the available RAM. So I would like to know whether it is possible to handle big data sets by applying PL/R functions inside PostgreSQL.
I haven't found any documentation about this.
There is no PostgreSQL-imposed limit on the number of indexes you can create on a table. Of course, performance may degrade if you choose to create more and more indexes on a table with more and more columns. PostgreSQL has a limit of 1GB for the size of any one field in a table.
Memory. The 2GB figure is a recommendation for the memory you can allocate to PostgreSQL beyond what the operating system itself needs. Even if you have a small data set, you will still want enough memory to cache the majority of your hot data (you can use the pg_buffercache extension to determine what your hot data is).
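For example, a query along these lines (a sketch, assuming the pg_buffercache extension is available in your database) shows which relations currently occupy the most shared buffers:

```sql
-- Sketch: list the relations holding the most pages in shared_buffers.
CREATE EXTENSION IF NOT EXISTS pg_buffercache;

SELECT c.relname, count(*) AS buffers
FROM pg_buffercache b
JOIN pg_class c
  ON b.relfilenode = pg_relation_filenode(c.oid)
 AND b.reldatabase IN (0, (SELECT oid FROM pg_database
                           WHERE datname = current_database()))
GROUP BY c.relname
ORDER BY buffers DESC
LIMIT 10;
```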
A Redis in-memory database cache allows you to access frequently used data from your server's RAM. This minimizes unnecessary load and round trips to your PostgreSQL database server.
shared_buffers (integer): Sets the amount of memory the database server uses for shared memory buffers. The default is typically 32 megabytes (32MB), but it might be less if your kernel settings will not support it (as determined during initdb). This setting must be at least 128 kilobytes.
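As a sketch, you can inspect and raise the setting like this (the 2GB value is only illustrative, and a change to shared_buffers only takes effect after a server restart):

```sql
-- Check the current setting.
SHOW shared_buffers;

-- Raise it (illustrative value); requires a server restart to take effect.
ALTER SYSTEM SET shared_buffers = '2GB';
```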
As mentioned by Hong Ooi, PL/R loads an R interpreter into the PostgreSQL backend process. So your R code is running "in database".
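To make that concrete, here is a minimal sketch of a PL/R function; the function name and the float8[] argument are just illustrative:

```sql
-- Sketch: the R body below executes inside the PostgreSQL backend process.
CREATE EXTENSION IF NOT EXISTS plr;

CREATE OR REPLACE FUNCTION r_mean(vals float8[]) RETURNS float8 AS $$
  # PL/R exposes the SQL argument to R under its parameter name
  mean(vals)
$$ LANGUAGE plr;

SELECT r_mean(ARRAY[1.0, 2.0, 3.0]);  -- returns 2
```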
There is no universal way to deal with memory limitations, but there are at least two possible options.
See PL/R docs here: http://www.joeconway.com/plr/doc/index.html
I am guessing what you would really like to have is a data.frame in which the data is paged to and from an underlying database cursor transparently to your R code. This is on my long-term TODO, but unfortunately I have not been able to find the time to work it out. I have been told that Oracle's R connector has this feature, so it seems it can be done. Patches welcomed ;-)
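Until something like that exists, one workaround is to page through the data yourself with PL/R's SPI cursor functions (pg.spi.prepare, pg.spi.cursor_open, pg.spi.cursor_fetch, pg.spi.cursor_close, described in the docs linked above). A rough sketch, with a hypothetical table/column name and an arbitrary batch size:

```sql
-- Sketch: aggregate a column too large to load at once by fetching it
-- in chunks through an SPI cursor. Table big_table and column x are
-- hypothetical; 10000 is an arbitrary batch size.
CREATE OR REPLACE FUNCTION chunked_sum() RETURNS float8 AS $$
  plan <- pg.spi.prepare("SELECT x FROM big_table")
  curs <- pg.spi.cursor_open("chunk_cursor", plan)
  total <- 0
  repeat {
    batch <- pg.spi.cursor_fetch(curs, TRUE, as.integer(10000))
    if (is.null(batch) || nrow(batch) == 0) break   # no more rows
    total <- total + sum(batch$x)
  }
  pg.spi.cursor_close(curs)
  total
$$ LANGUAGE plr;
```

This way only one batch of rows is ever materialized in the R interpreter's memory at a time, at the cost of writing the paging logic yourself.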