I want to copy my directory structure, excluding the files. Is there any option in tar to ignore all files and copy only the directories recursively?
Tape Archive, or tar, is a file format for collecting files and directories into an archive while preserving filesystem information such as permissions. We can use the tar command to create tar archives, extract archives, view the files and directories stored in an archive, and append files to an existing archive.
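As a quick illustration of those four operations, here is a minimal sketch; the names docs/ and notes.txt are placeholders made up for this example:

$ tar cf archive.tar docs/        # c = create an archive from a directory
$ tar tf archive.tar              # t = list the archive's contents
$ tar rf archive.tar notes.txt    # r = append a file to the existing archive
$ tar xf archive.tar              # x = extract the archive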
One way is to use the -C option of the tar command. We can envisage the operation of tar with -C as two steps: first, change the current directory to the one given by -C; then, archive the specified files relative to that new current directory.
To change the working directory in the middle of a list of file names, either on the command line or in a file specified using '--files-from' ('-T'), use '--directory' ('-C'). This will change the working directory to the specified directory after that point in the list.
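For example, a minimal sketch of -C in action, assuming a directory named project under /home/user (both names are made up for illustration):

$ tar cf project.tar -C /home/user project    # entries are stored as project/..., not /home/user/project/...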
With --no-recursion, tar archives only the names of the entries in the given directory (including subdirectory names!) but doesn't archive any of the files inside them. Conversely, when tar archives files, it also records the names of the subdirectories they live in. (The opposite question also comes up: is there a way to tell tar to archive files only, no directories?)
The Linux tar command is used to save several files into an archive file. We can later extract all of the files, or just the desired ones, from the archive. tar stands for "tape archive".
There are several parts to the tar command when you are creating a tarball from a directory. The pieces of an example tar command break down as follows: [2]
tar - invokes the tar archiving program.
c - signals the "creation" of the .tar file; it should always come first.
v - makes the process "verbose".
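Putting those pieces together, here is a sketch of such a command; mydir/ is a hypothetical directory name, and the f flag (write to the named file) is assumed in addition to the flags described above:

$ tar cvf mydir.tar mydir/    # create (c) a tarball named mydir.tar, verbosely (v), from mydir/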
When you want to use find with tar, the best way is to use cpio instead of tar. cpio can write tar archives and is designed to take the list of files to archive from stdin. Using find and cpio is a more unix-y approach in that you let find do the file selection with all the power that it has, and let cpio do the archiving.
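As a sketch of that approach applied to this question, assuming GNU cpio (whose -H ustar option writes POSIX tar-format archives, and whose --null option reads the null-terminated names that find -print0 emits):

$ find . -type d -print0 | cpio --null -o -H ustar -O dirstructure.tar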
You can use find to get the directories and then tar them:
find . -type d -print0 | xargs -0 tar cf dirstructure.tar --no-recursion
If you have more than about 10000 directories, use the following to work around xargs limits (xargs would otherwise split the list across several tar invocations, each overwriting the archive written by the previous one):
find . -type d -print0 | tar cf dirstructure.tar --no-recursion --null --files-from -
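Either way, once the archive exists you can replay it at a destination to finish the copy; a minimal sketch, assuming an existing destination directory /path/to/dest (-p preserves the recorded permissions):

$ tar xpf dirstructure.tar -C /path/to/dest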
Directory names that contain spaces or other special characters may require extra attention. For example:
$ mkdir -p "backup/My Documents/stuff"
$ find backup/ -type d | xargs tar cf directory-structure.tar --no-recursion
tar: backup/My: Cannot stat: No such file or directory
tar: Documents: Cannot stat: No such file or directory
tar: backup/My: Cannot stat: No such file or directory
tar: Documents/stuff: Cannot stat: No such file or directory
tar: Exiting with failure status due to previous errors
Here are some variations to handle these cases of "unusual" directory names:
$ find backup/ -type d -print0 | xargs -0 tar cf directory-structure.tar --no-recursion
Using -print0 with find emits filenames as null-terminated strings; with -0, xargs interprets its input the same way. Using null as the terminator ensures that even filenames containing spaces or newlines are handled correctly.
It's also possible to pipe results straight from find to tar:
$ find backup/ -type d | tar cf directory-structure.tar -T - --no-recursion
Invoking tar with -T - (or --files-from -) will cause it to read filenames from stdin, expecting each filename to be separated by a line break.
For maximum robustness, this can be combined with the options for null-terminated strings:
$ find . -type d -print0 | tar cf directory-structure.tar --null --files-from - --no-recursion
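To confirm that only directories were captured, list the archive's contents; in a verbose listing every entry should show a leading 'd' in the mode bits and a trailing slash:

$ tar tvf directory-structure.tar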
Of the variations above, I consider this last one (--null --files-from -) to be the most robust, because it supports unusual filenames and (unlike xargs) is not inherently limited by system command-line sizes (see xargs --show-limits).
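If you're curious what those limits are on your system, GNU xargs can report them; piping input from /dev/null (with --no-run-if-empty) makes it print the limits without running anything:

$ xargs --show-limits --no-run-if-empty </dev/null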