I need to upload several files to a database and associate them with a given record in a given table. My initial idea was to zip all the necessary files and send the resulting byte array, along with the remaining record data, as a parameter to a stored procedure. That way the whole operation would act as a single transaction, and retrieving all the files associated with a given record would be a piece of cake. However, this plan was rejected for performance reasons, and now I am left with two options:
1) Create a data structure that wraps an array of byte[], binary-serialize it, and send it along with the data; or
2) Send the remaining data first, catch the ID of the newly created record, and send each file separately, associating it with that ID.
Now, 1) looks too much like my first plan and is likely to be rejected as well, even though this time no compression would be involved. Option 2) seems to be the way to go, but I must guarantee that the record's data and the files are uploaded in a single transaction, which will force me to make (albeit slight) changes in my data layer...
If you were facing this problem, which option would you choose? If neither of the above, what would you suggest instead?
Edit: The database is located on a remote server. The files can be arbitrarily large, although I don't expect them to be larger than, say, 1-2MB. Storing the files in a location accessible to both the application and the database server is not an option, so I can't just store file paths in the database; the files really must be stored as BLOBs.
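For what it's worth, option 2) doesn't have to break atomicity: if the record insert, the ID lookup, and the per-file inserts all run on the same connection inside one open transaction, nothing is visible until the final commit. Here is a minimal sketch of that pattern, using an in-memory SQLite database purely for illustration (the table and column names `record`/`record_file` are made up, and your remote RDBMS and data layer will differ, but the shape is the same):

```python
import sqlite3

# In-memory database as a stand-in for the real remote server.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE record (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute(
    "CREATE TABLE record_file (id INTEGER PRIMARY KEY, "
    "record_id INTEGER REFERENCES record(id), data BLOB)"
)

# Stand-ins for the uploaded files' contents.
files = [b"first file bytes", b"second file bytes"]

try:
    cur = conn.cursor()
    # Insert the record first and capture its generated ID...
    cur.execute("INSERT INTO record (name) VALUES (?)", ("example",))
    record_id = cur.lastrowid
    # ...then attach each file to that ID. All inserts share the same
    # open transaction, so either everything commits or nothing does.
    for blob in files:
        cur.execute(
            "INSERT INTO record_file (record_id, data) VALUES (?, ?)",
            (record_id, blob),
        )
    conn.commit()
except Exception:
    conn.rollback()  # any failure undoes the record and its files together
    raise

count = conn.execute(
    "SELECT COUNT(*) FROM record_file WHERE record_id = ?", (record_id,)
).fetchone()[0]
print(count)  # 2
```

The key point is that "send each file separately" refers to separate statements, not separate transactions: as long as your data layer keeps them on one connection between begin and commit, the single-transaction guarantee holds.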
Your first plan was rejected due to performance issues - presumably this means network transfer speed?
Your alternate plan is to send the files individually and then tie them together on your server. That's a fine plan, but you want to ensure all the files are uploaded and committed in a single transaction. It follows that your server can't respond with a success signal until it has received all the files and successfully committed them.
So your second scenario requires the same amount of data to be uploaded within the transaction. How, then, will it perform any better?
In fact, any possible design you can come up with - assuming the files need to be received within a single transaction - will have the same 'performance issue'.
You may have a different understanding of 'transaction' than I do, but typically you can't go and use the originally submitted 'remaining data' until the transaction is complete. I think your problem is around the required 'single transaction' nature of your system, rather than any network bottleneck.