Storing a large number of images

I'm thinking about developing my own PHP-based gallery for storing lots of pictures, maybe in the tens of thousands.

In the database I'll store the URL of each image, but here's the problem: I know it's impractical to have all of them sitting in the same directory on the server, as that would slow access to a crawl. So how would you store them? Some kind of tree based on the name of the jpeg/png?

What rules for partitioning the images would you recommend?

(It's intended for cheap shared hosting, so no tinkering with the server is possible.)

Asked Jan 15 '09 by Saiyine

1 Answer

We had a similar problem in the past and found a nice solution:

  • Give each image a unique GUID.
  • Create a database record for each image containing the name, location, GUID and, where applicable, the locations of sub-images (thumbnails, reduced-size versions, etc.).
  • Use the first one or two characters of the GUID to determine the top-level folder.
  • If a folder ends up with too many files, split it again, update the references and you are ready to go.
  • If the number of files and the access load grow too high, you can spread the folders over different file servers.

In our experience the GUIDs give a more or less uniform distribution of files across the folders, and it worked like a charm.
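For illustration, here is a minimal PHP sketch of that scheme. It is a sketch only: the function names, the two-character shard folders and the base directory are assumptions rather than part of the original answer, and random_bytes() needs PHP 7+.

    <?php
    // Minimal sketch of the GUID-based folder scheme described above.
    // Function names and directory layout are illustrative assumptions.

    // Generate a random 128-bit id as 32 hex characters (a GUID-like value).
    function makeImageId(): string {
        return bin2hex(random_bytes(16));
    }

    // The first two characters of the id pick the top-level folder,
    // e.g. id "3f9a..." ends up under $baseDir/3f/.
    function imagePath(string $baseDir, string $id, string $ext): string {
        return $baseDir . '/' . substr($id, 0, 2) . '/' . $id . '.' . $ext;
    }

    // Move an uploaded file into place and return the id to record in the database.
    function storeImage(string $baseDir, string $tmpFile, string $ext): string {
        $id   = makeImageId();
        $path = imagePath($baseDir, $id, $ext);

        if (!is_dir(dirname($path))) {
            mkdir(dirname($path), 0775, true);   // create the shard folder on demand
        }
        move_uploaded_file($tmpFile, $path);

        // INSERT $id, the original filename and $path into the images table here.
        return $id;
    }

With two hex characters you get 256 top-level folders, so tens of thousands of images average only a few hundred files per folder; splitting on two further characters gives 65,536 folders if that ever becomes too few.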

Links that might help with generating a unique ID:

  • http://en.wikipedia.org/wiki/Universally_Unique_Identifier
  • http://en.wikipedia.org/wiki/Sha1
Answered Oct 16 '22 by Toon Krijthe