100 TB of data on MongoDB? Possible?

What kind of architecture is needed to store 100 TB of data and query it with aggregation? How many nodes? What disk size per node? What would be the best practice?

Every day 240 GB will be written, but the total size will stay the same because the same amount of data will be deleted.
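A rolling window like this is commonly handled with a TTL index, which MongoDB expires in the background. A minimal sketch with PyMongo, assuming hypothetical collection and field names ("events", "created_at"):

```python
from datetime import datetime, timezone

from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
events = client.mydb.events

# The TTL monitor deletes documents once created_at is older than
# expireAfterSeconds; deletion is a background process, not instant.
events.create_index("created_at", expireAfterSeconds=24 * 60 * 60)

events.insert_one({"created_at": datetime.now(timezone.utc), "payload": "..."})
```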

Or does anyone have different thoughts on how to store the data and run fast group queries?
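At this scale the usual MongoDB shape is a sharded cluster plus the aggregation pipeline for the group queries. The sketch below is illustrative only: the database, collection, field names, and shard key are all assumptions, and the commands must be run through a mongos router, not a standalone server.

```python
from pymongo import MongoClient

client = MongoClient("mongodb://mongos-host:27017")

# Distribute the collection across shards on a hashed key.
client.admin.command("enableSharding", "mydb")
client.admin.command("shardCollection", "mydb.events",
                     key={"device_id": "hashed"})

# A "group query": per-device counts and average payload size.
pipeline = [
    {"$match": {"status": "active"}},
    {"$group": {"_id": "$device_id",
                "count": {"$sum": 1},
                "avg_size": {"$avg": "$payload_bytes"}}},
]
for doc in client.mydb.events.aggregate(pipeline, allowDiskUse=True):
    print(doc)
```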

asked Jan 22 '13 by canseverayberk

People also ask

How much data can MongoDB handle?

The maximum size of an individual document in MongoDB is 16 MB, with a maximum nesting depth of 100 levels. There is no maximum size for an individual MongoDB database.
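As a quick illustration of the 16 MB limit, a document's BSON size can be checked before insert with the bson package that ships with PyMongo (the document contents here are made up):

```python
import bson

MAX_BSON_SIZE = 16 * 1024 * 1024  # MongoDB's 16 MB document limit

doc = {"name": "example", "blob": "x" * 1000}
size = len(bson.encode(doc))  # size of the document as BSON bytes
assert size <= MAX_BSON_SIZE, f"document is {size} bytes, over the 16 MB limit"
```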

Can I use MongoDB for big data?

MongoDB stores huge amounts of data in a naturally traversable format, making it a good choice to store, query, and analyze big data.
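A small, hypothetical example of that traversability: nested fields are queried directly with dot notation, no joins or schema migration required (all names are made up):

```python
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
users = client.mydb.users

users.insert_one({
    "name": "alice",
    "address": {"city": "Istanbul", "zip": "34000"},
    "tags": ["admin", "beta"],
})

# Dot notation reaches into sub-documents; arrays match element-wise.
print(users.find_one({"address.city": "Istanbul", "tags": "admin"}))
```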

Can MongoDB handle millions of records?

Pairing MongoDB with Elasticsearch is a sound choice for processing millions of records in real time. The same structures and concepts apply to larger datasets and work extremely well there too.
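A minimal, hypothetical sketch of one way to pair the two stores, assuming the official Elasticsearch 8.x Python client and made-up index/collection names; production setups usually stream changes (e.g. via MongoDB change streams or a connector) rather than copying in a loop:

```python
from elasticsearch import Elasticsearch
from pymongo import MongoClient

mongo = MongoClient("mongodb://localhost:27017")
es = Elasticsearch("http://localhost:9200")

# MongoDB holds the records; Elasticsearch indexes them for search.
for doc in mongo.mydb.events.find():
    doc_id = str(doc.pop("_id"))  # ObjectId isn't JSON-serializable
    es.index(index="events", id=doc_id, document=doc)
```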


2 Answers

Kindly refer to this related question:

MongoDB limit storage size?

Quoting from the top answer:

The "production deployments" page on MongoDB's site may be of interest to you. Lots of presentations listed with infrastructure information. For example:

http://blog.wordnik.com/12-months-with-mongodb says they're storing 3 TB per node.
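Taking that 3 TB-per-node figure at face value, a back-of-the-envelope calculation for the question's numbers (the 3-member replica set per shard is an assumption, not a given):

```python
import math

total_tb = 100        # dataset size from the question
tb_per_shard = 3      # Wordnik's reported per-node capacity
replicas = 3          # assumed replica-set members per shard

shards = math.ceil(total_tb / tb_per_shard)   # 34 shards
nodes = shards * replicas                     # 102 data-bearing nodes
daily_gb_per_shard = 240 / shards             # ~7 GB of churn per shard/day

print(shards, nodes, round(daily_gb_per_shard, 1))
```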

answered Sep 22 '22 by Samuel Liew


I highly recommend HBase.

Facebook uses it for its Messages service, which in Nov 2010 was handling 15 billion messages a day.

We tested MongoDB for a large data set but ended up going with HBase and have been happily using it for months now.
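For a sense of the HBase data model this answer recommends, here is a hypothetical sketch using the happybase client; the table, column family, and row-key scheme are made up, the table is assumed to already exist, and a running HBase Thrift server is assumed:

```python
import happybase

connection = happybase.Connection("localhost")  # HBase Thrift server
table = connection.table("messages")

# Row keys like "<user>#<timestamp>" keep one user's messages contiguous;
# HBase sorts rows by key, which is what makes range scans cheap.
table.put(b"user42#20130122070100", {b"d:body": b"hello"})

for key, data in table.scan(row_prefix=b"user42#"):
    print(key, data)
```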

answered Sep 25 '22 by Suman