 

How can I tar files larger than physical memory using Perl's Archive::Tar?

Tags:

archive

tar

perl

I'm using Perl's Archive::Tar module. The problem is that it reads everything into memory, builds the archive there, and only then writes it to the file system, so the maximum size it can archive is limited by available memory; most of the time it dies with an out-of-memory error. GNU tar, by contrast, reads each file in chunks, archives them, and writes them straight to disk, so it can handle files of any size. How can I get the same behaviour using Perl's Archive::Tar module?

asked Mar 17 '09 by Ram


2 Answers

It looks like Archive::Tar::Wrapper is your best bet. I've not tried it myself, but it uses your system's tar executable and doesn't keep files in memory.
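A minimal sketch of that approach, assuming Archive::Tar::Wrapper is installed from CPAN and a system tar is on the PATH (the file paths here are made up for illustration):

```perl
use strict;
use warnings;
use Archive::Tar::Wrapper;

# Archive::Tar::Wrapper drives the system's tar binary, so file data is
# streamed through temporary files on disk rather than held in memory.
my $arch = Archive::Tar::Wrapper->new();

# add(logical_path_inside_archive, physical_path_on_disk)
$arch->add( 'backup/huge.dat', '/data/huge.dat' )
    or die "add failed";

# Write the archive; a true second argument requests compression.
$arch->write( 'backup.tgz', 1 )
    or die "write failed";
```

Because the heavy lifting is delegated to the tar executable, the archive size is bounded by disk space, not by the Perl process's memory.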

Contrary to Chas. Owens's answer, Archive::Tar::Streamed does keep files in memory and does not use your system's tar. It actually uses Archive::Tar internally, but it processes one file at a time (taking advantage of the fact that tar archives can be concatenated). This means that Archive::Tar::Streamed can handle archives bigger than memory, as long as each individual file in the archive fits in memory. But that's not what you asked for.
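The one-file-at-a-time idea looks roughly like this; it assumes Archive::Tar::Streamed is installed, and the method names are from memory, so check them against the module's POD before relying on this:

```perl
use strict;
use warnings;
use Archive::Tar::Streamed;

# Open the output archive once, then stream files into it one at a
# time, so only the file currently being packed needs to fit in memory.
open my $fh, '>', 'backup.tar' or die "open: $!";
my $tar = Archive::Tar::Streamed->new($fh);

for my $file (@ARGV) {
    # Reads, packs, and writes this single file, then frees it.
    $tar->add($file);
}

close $fh or die "close: $!";
```

Peak memory is therefore roughly the size of the largest single file, not the size of the whole archive.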

answered Sep 25 '22 by cjm


It looks like there is a different module that doesn't build an in-memory structure: Archive::Tar::Streamed. The downside is that it requires tar to be available on the system it runs on. Still, that beats puppet-stringing tar yourself.

answered Sep 24 '22 by Chas. Owens