There is a project where each user gets to download a zip containing around 2GB of data...
The only problem is that there are a couple of very small files which must change in this zip per user.
Is there an elegant way to solve this, aside from not requiring it all to be in the zip? Ideas I've considered:
1) Pushing pending orders onto a queue, and processing that queue when resources are available... processing would mean creating a new zip for each order and then deleting it after N days (see the sketch after this question)
2) Manipulating the zip live in PHP somehow, before sending it via a raw sort of push (i.e. spitting out the header and then generating the data based on the base files plus the custom per-user files)
Any ideas on the best approach, or memory issues I might encounter? Thanks!
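For idea 1, a queue worker could prebuild each personalised archive with PHP's built-in ZipArchive, roughly as in the sketch below. The buildUserZip() name, the paths, and the cleanup policy are placeholders, not a finished design.

    <?php
    // Rough sketch of idea 1: a queue worker prebuilds one zip per order,
    // and a cron job later deletes archives older than N days.
    // buildUserZip(), the paths, and the JSON payload are illustrative only.
    function buildUserZip(int $orderId, string $userConfigJson): string
    {
        $target = "/srv/zips/order-{$orderId}.zip";

        $zip = new ZipArchive();
        if ($zip->open($target, ZipArchive::CREATE | ZipArchive::OVERWRITE) !== true) {
            throw new RuntimeException("Cannot create {$target}");
        }

        // The ~2GB of shared data is identical for every user.
        $zip->addFile('/srv/shared/big-asset.bin', 'data/big-asset.bin');

        // Only a couple of tiny per-user files differ.
        $zip->addFromString('config/user.json', $userConfigJson);

        $zip->close();   // the expensive part: the whole archive is written here
        return $target;
    }

The obvious cost of this route is that the shared ~2GB is re-read and re-archived for every single order, which is exactly the resource pressure the queue is meant to absorb.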
The ZIP file structure is basically:

    [local file header 1][file data 1]
    [local file header 2][file data 2]
    ...
    [central directory]
    [end of central directory record]

All of the file entries come first, and the central directory, which lists every entry along with its byte offset, sits at the very end of the archive. This means that you should be able to construct and output the ZIP archive on the fly, requiring only the central directory data to be kept in memory until you write it out at the end. The ZIP archive itself will never need to exist on disk.
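As a concrete illustration, here is a minimal PHP sketch of that approach. It is not a drop-in implementation: the streamZip() name, the output headers, and the choice to store every entry uncompressed (compression method 0) are simplifying assumptions, and the plain 32-bit size fields mean it only works for archives under 4 GB (no Zip64). With stored entries, each file's bytes are copied straight to the client while only the central directory records accumulate in memory.

    <?php
    // Minimal sketch: stream a ZIP archive on the fly, keeping only the
    // central directory in memory. Entries are "stored" (no compression),
    // so file data is copied straight through. streamZip() is hypothetical.
    function streamZip(array $files): void  // archive name => path on disk
    {
        header('Content-Type: application/zip');
        header('Content-Disposition: attachment; filename="bundle.zip"');

        $centralDir = '';
        $offset     = 0;   // byte offset of the next local header in the output
        $count      = 0;

        foreach ($files as $name => $path) {
            $size = filesize($path);
            $crc  = hexdec(hash_file('crc32b', $path));  // ZIP uses CRC-32

            // DOS-format modification time/date, as the ZIP headers expect.
            $t = getdate(filemtime($path));
            $dosTime = ($t['hours'] << 11) | ($t['minutes'] << 5) | intdiv($t['seconds'], 2);
            $dosDate = (($t['year'] - 1980) << 9) | ($t['mon'] << 5) | $t['mday'];

            // Local file header, followed immediately by the raw file data.
            $local = "PK\x03\x04"
                . pack('v', 20)                     // version needed to extract
                . pack('v', 0)                      // general purpose flags
                . pack('v', 0)                      // compression method: 0 = stored
                . pack('vv', $dosTime, $dosDate)
                . pack('VVV', $crc, $size, $size)   // CRC, compressed, uncompressed
                . pack('vv', strlen($name), 0)      // name length, extra length
                . $name;
            echo $local;
            readfile($path);                        // copy the file straight to the output

            // Matching central directory record, remembered for the end.
            $centralDir .= "PK\x01\x02"
                . pack('vv', 20, 20)                // version made by / needed
                . pack('vv', 0, 0)                  // flags, compression method
                . pack('vv', $dosTime, $dosDate)
                . pack('VVV', $crc, $size, $size)
                . pack('vvv', strlen($name), 0, 0)  // name, extra, comment lengths
                . pack('vv', 0, 0)                  // disk number start, internal attrs
                . pack('V', 32)                     // external attrs (DOS archive bit)
                . pack('V', $offset)                // offset of the local header
                . $name;

            $offset += strlen($local) + $size;
            $count++;
        }

        // Central directory, then the end-of-central-directory record.
        echo $centralDir;
        echo "PK\x05\x06"
            . pack('vv', 0, 0)                      // disk numbers
            . pack('vv', $count, $count)            // entries on this disk / total
            . pack('V', strlen($centralDir))        // central directory size
            . pack('V', $offset)                    // central directory offset
            . pack('v', 0);                         // comment length
    }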
If you use this approach, there will be no concurrency issues in offering the ZIP file to multiple clients at once, and you won't have to use any disk space when constructing the archive.
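Tying it back to the original question, the per-request usage might look something like this; the paths and $userId are, again, made up for illustration. Only the couple of small files differ per user, while the large shared file is read from the same place on disk every time.

    <?php
    $userId = 42;  // e.g. taken from the authenticated session

    streamZip([
        'data/big-asset.bin' => '/srv/shared/big-asset.bin',       // shared ~2GB payload
        'config/user.json'   => "/srv/users/{$userId}/user.json",  // tiny per-user file
        'README.txt'         => '/srv/shared/README.txt',
    ]);

Because every entry is stored with a known size, the total archive size can be computed up front, so a Content-Length header (and hence a proper download progress bar) is possible as well.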