 

Regarding Memory Consumption of CouchDB

I did some evaluations on CouchDB recently. I found that memory consumption is pretty high, both for view construction (map & reduce) and for importing larger JSON documents into CouchDB. I evaluated view construction on an Ubuntu system (4 cores, Intel® Xeon® CPU E3-1240 v5 @ 3.50GHz). Here are the results:

  1. four hundred 100 KB datasets cost around 683 MB of memory;
  2. one 80 MB dataset cost around 2.5 GB of memory;
  3. four 80 MB datasets cost around 10 GB of memory.

It seems that memory consumption is hundreds of times the size of the original JSON dataset. With a 1 GB dataset, CouchDB would run out of memory. Does anyone know why memory consumption is so huge? Many thanks!
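For context, "view construction" here means building an index from a map/reduce design document. Below is a minimal sketch of the kind of view involved; the database name, credentials, and emitted fields are placeholders, not taken from the question:

    # Sketch: create a map/reduce design document over HTTP and query it,
    # which is the indexing step whose memory use is being measured.
    import requests

    COUCH = "http://admin:password@localhost:5984"  # placeholder credentials
    DB = "testdb"                                   # placeholder database name

    design_doc = {
        "views": {
            "by_id": {
                # The map function runs once per document during view construction.
                "map": "function (doc) { emit(doc._id, 1); }",
                # Built-in reduce that counts the emitted rows.
                "reduce": "_count",
            }
        }
    }

    # Create (or update) the design document.
    requests.put(f"{COUCH}/{DB}/_design/example", json=design_doc).raise_for_status()

    # Querying the view triggers index construction on first access,
    # which is when the memory spike described above shows up.
    print(requests.get(f"{COUCH}/{DB}/_design/example/_view/by_id").json())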

asked Oct 19 '17 by Jack




2 Answers

I don't know why the memory use is so high, but it is consistent behavior for CouchDB, and you can't really get around it as long as your documents are large. I eventually split out the data I wanted to build views on, and kept the full documents in a separate database for later extraction.
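A sketch of that split, with hypothetical database names: view_db holds slim documents that views are built on, and archive_db holds the full documents, fetched by _id when needed.

    # Sketch (names are hypothetical): store a slim projection for view building
    # in one database and the full document in another.
    import requests

    COUCH = "http://admin:password@localhost:5984"

    def store(doc_id, full_doc):
        # Slim projection used for views; only small fields go here.
        summary = {"_id": doc_id, "type": full_doc.get("type"), "ts": full_doc.get("ts")}
        requests.put(f"{COUCH}/view_db/{doc_id}", json=summary).raise_for_status()
        # Full document kept separately, retrieved only when actually needed.
        requests.put(f"{COUCH}/archive_db/{doc_id}", json=full_doc).raise_for_status()

    def fetch_full(doc_id):
        return requests.get(f"{COUCH}/archive_db/{doc_id}").json()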

answered Sep 28 '22 by MitchB


I know I'm late to answer, but I'll leave this here for anyone it may benefit. It's actually about response caching: CouchDB caches responses in order to return results faster. You can manage the issue by setting the caching limits.

See the configuration reference: https://docs.couchdb.org/en/latest/config/couchdb.html
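Configuration values can be inspected and changed at runtime through the _node/_local/_config endpoint. A sketch follows, with the caveat that which keys actually bound the caching behavior should be confirmed against the docs linked above; max_dbs_open is shown only as one example knob from the [couchdb] section:

    # Sketch: read and set a [couchdb] config value over HTTP.
    # max_dbs_open is just an illustrative key; confirm the relevant
    # caching-related keys in the linked configuration reference.
    import requests

    COUCH = "http://admin:password@localhost:5984"  # admin credentials required
    CFG = f"{COUCH}/_node/_local/_config/couchdb/max_dbs_open"

    print(requests.get(CFG).json())       # current value, e.g. "500"
    # PUT takes the new value as a JSON string and returns the old one.
    old = requests.put(CFG, json="100").json()
    print("previous value:", old)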

answered Sep 28 '22 by Emir Cangır