I have an interesting case. I need to upload a tar file, split by the Linux split command,
directly to an S3 bucket. I don't have enough disk space or time to create one big .tar file and then split it, so I'm trying to use a pipeline:
tar -cvf - example-file.log | split -b 10k -d - "tarpart-" | aws s3 cp - s3://mybucket/
Unfortunately, I have no parts of my file in S3, just only:
aws s3 ls s3://mybucket
2019-02-14 13:07:38 0 -
I'm not sure whether there is a way to upload multiple files whose names aren't known in advance, but maybe someone has run into the same issue?
Finally I've found the solution: split's --filter option runs a command once per chunk, piping the chunk to it on stdin and exporting the chunk's would-be file name as $FILE (the single quotes keep the shell from expanding $FILE before split does):
# tar -cvf - install.post.log | split -d -b 4k -a 4 - splitted_ --filter='aws s3 cp - s3://testb/$FILE'
install.post.log
# aws s3 ls s3://testb
2019-02-14 14:49:38 40960 -
2019-02-14 14:55:09 4096 splitted_0000
2019-02-14 14:55:10 4096 splitted_0001
2019-02-14 14:55:11 4096 splitted_0002
2019-02-14 14:55:11 4096 splitted_0003
2019-02-14 14:55:12 4096 splitted_0004
2019-02-14 14:55:13 4096 splitted_0005
2019-02-14 14:55:13 4096 splitted_0006
2019-02-14 14:55:14 4096 splitted_0007
2019-02-14 14:55:14 4096 splitted_0008
2019-02-14 14:55:15 4096 splitted_0009
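The same mechanism can be checked locally without AWS credentials by pointing the filter at local files instead of aws s3 cp — a minimal sketch, with made-up names (sample.log, chunk_, local_):

```shell
# Create a 10 KiB test file, then tar it straight into split.
dd if=/dev/zero of=sample.log bs=1k count=10 2>/dev/null

# --filter runs 'cat > local_$FILE' once per 4k chunk; $FILE is the
# name split would have used (chunk_0000, chunk_0001, ...).
tar -cf - sample.log | split -d -b 4k -a 4 - chunk_ --filter='cat > local_$FILE'

# Reassembling the chunks yields the original archive intact:
cat local_chunk_* | tar -tf -
```

The same concatenation works after downloading the parts back from S3: `cat splitted_* | tar -xf -` restores the original files.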