
Best tech to use for a database that stores large files

I am interested in finding candidate software that can help me build a program that will do this:

  • simple key-value store, with the key being a hash and the value being a potentially large file (10-100 MB each; the total dataset can easily run to 200 GB and up)
  • very low volume of requests: maybe 1000 per hour, probably less
  • between 2x-5x more reads than writes
  • automatically remove data that hasn't been queried for a while, to keep disk space under control
  • it's OK for the system to lose data
  • easy to install / few dependencies / easy to make cross-platform

Software like Redis and MongoDB seems like an interesting candidate, but these tools are very much designed to handle many requests per second efficiently, usually to power websites. That is a requirement I do not have at all.

I am wondering if you know of a tool that would be a better match for the specific problem I am trying to solve.

asked Nov 21 '25 by Lucas Meijer


1 Answer

Based on your requirements, the simplest solution is to use the file system itself to store your data: use the hash key as the file name.

Lookups will be efficient, and the operating system will cache the data in memory for you automatically.
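A minimal sketch of what that could look like in Python, assuming a storage root of `/var/blobstore` and SHA-256 keys (both purely illustrative choices):

```python
import hashlib
from pathlib import Path
from typing import Optional

# Hypothetical storage root; adjust to your environment.
STORE = Path("/var/blobstore")

def put(data: bytes) -> str:
    """Write a blob and return its hash key (SHA-256 here, as an example)."""
    key = hashlib.sha256(data).hexdigest()
    # Shard into subdirectories by key prefix to keep directory listings small.
    path = STORE / key[:2] / key
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_bytes(data)
    return key

def get(key: str) -> Optional[bytes]:
    """Read a blob back by key; returns None if it was evicted or never stored."""
    path = STORE / key[:2] / key
    try:
        return path.read_bytes()
    except FileNotFoundError:
        return None  # losing data is acceptable per the requirements
```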

If your file system records access times, you can run a regular cleanup based on the last access time of each file.
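For example, a periodic sweep like the following (run from cron, say) could keep disk usage under control. The 30-day threshold is an arbitrary assumption, and note that many Linux systems mount with `relatime`, which updates the access time at most once a day; at this request volume that is still good enough:

```python
import time
from pathlib import Path

STORE = Path("/var/blobstore")   # same hypothetical root as above
MAX_AGE = 30 * 24 * 3600         # evict blobs not read for ~30 days (assumption)

def evict_stale() -> None:
    """Delete files whose last access time (atime) is older than MAX_AGE."""
    cutoff = time.time() - MAX_AGE
    for path in STORE.rglob("*"):
        if path.is_file() and path.stat().st_atime < cutoff:
            try:
                path.unlink()
            except OSError:
                pass  # another process may have removed it already
```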

answered Nov 23 '25 by the_ien


