I have a worker role that pulls data down from Blob Storage in its OnStart method. Currently I'm testing this by uploading a test.txt file and then downloading it to a local directory. This is working fine.
I would now like to upload a folder to Blob Storage. This folder contains a batch script as well as several executables that the batch script calls.
What is the recommended way to accomplish this? I think zipping the folder and uploading the *.zip file would be easy... but then, once I download it locally for the worker role to handle, how would I unzip it without any third-party libraries?
If there are better options, I'm open to any suggestions. Thanks for the help here - this community has been a huge help for me as I ramp up :)
You can upload a folder to a blob container with Azure Storage Explorer: on the main pane's toolbar, select Upload, and then select Upload Folder from the drop-down menu.
You can upload files and directories to Blob storage by using the AzCopy v10 command-line utility.
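For example, a minimal sketch assuming AzCopy v10 is installed and you have a SAS token with write permission (the account, container, and path are placeholders):
azcopy copy "/path/to/your/folder" "https://<account>.blob.core.windows.net/<container>?<SAS-token>" --recursive
The --recursive flag makes AzCopy walk the folder and upload every file inside it.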
You can use upload-batch:
az storage blob upload-batch --destination ContainerName --account-name YourAccountName --destination-path DirectoryInBlob --source /path/to/your/data
This copies all files found in the source directory to the target directory in blob storage.
Either a SAS token (via --sas-token) or an account key has to be specified.
It also works smoothly with a service principal.
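For example (the app ID, tenant, and names below are placeholders), assuming the principal has a data-plane role such as Storage Blob Data Contributor on the account:
az login --service-principal --username <app-id> --password <password-or-cert> --tenant <tenant-id>
az storage blob upload-batch --destination ContainerName --account-name YourAccountName --source /path/to/your/data --auth-mode login
The --auth-mode login flag tells the CLI to authorize with the logged-in identity instead of an account key.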
Log in to the Azure CLI using az login.
To upload a single file, use:
az storage blob upload <additional-params>
To upload a folder, use:
az storage blob upload-batch <additional-params>
Refer to the az storage blob documentation for the complete commands.
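For instance, with placeholder account, container, and file names:
az storage blob upload --account-name YourAccountName --container-name ContainerName --name test.txt --file ./test.txt
az storage blob upload-batch --account-name YourAccountName --destination ContainerName --source ./my-folder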