Is there any way we can overwrite existing files while copying from HDFS
using:
hadoop fs -copyToLocal <HDFS PATH> <local path>
We can run the command with the `-f` option to overwrite the file if it is already present.
@user007, you can delete the file and create a new one with the same name, which effectively overwrites it. In some cases you can also append data, but only at the end of the file. Append is only available in Hadoop versions that include it; it is required by HBase and other frameworks.
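The delete-then-recreate and append approaches above can be sketched as follows (a sketch, not runnable without a Hadoop cluster; the paths and file names are hypothetical):

```shell
# Delete the existing HDFS file, then copy the new version in its place
# (/data/report.txt and report.txt are hypothetical paths)
hadoop fs -rm /data/report.txt
hadoop fs -copyFromLocal report.txt /data/report.txt

# Append local data to the end of an existing HDFS file
# (only works on Hadoop versions with append support enabled)
hadoop fs -appendToFile more.txt /data/report.txt
</imports>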
rm: Remove a file from HDFS, similar to the Unix rm command. This command does not delete directories. For a recursive delete, use `-rm -r`.
Hadoop file system (fs) shell commands are used to perform various file operations such as copying a file, viewing the contents of a file, changing the ownership of files, changing permissions, creating directories, etc.
fs -copyFromLocal -f $LOCAL_MOUNT_SRC_PATH/yourfilename.txt your_hdfs_file-path
So the `-f` option does the trick for you. It works for `-copyToLocal` as well.
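For the original question, overwriting in both directions looks like this (a sketch, not runnable without a Hadoop cluster; the paths are hypothetical):

```shell
# Overwrite an existing file in HDFS with a local copy
hadoop fs -copyFromLocal -f localfile.txt /user/me/localfile.txt

# Overwrite an existing local file with the HDFS copy
hadoop fs -copyToLocal -f /user/me/localfile.txt ./localfile.txt
</imports>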