 

How to overwrite the existing files using hadoop fs -copyToLocal command

Tags:

hadoop

Is there any way to overwrite existing files while copying from HDFS using:

hadoop fs -copyToLocal <HDFS PATH> <local path> 
hjamali52 asked May 08 '13 09:05

People also ask

How do I overwrite a file in Hadoop?

You can pass the -f option to the command to overwrite the file if it is already present.
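A minimal sketch of the -f flag with -put (the paths here are examples, not from the original question):

```shell
# Overwrite the destination file in HDFS if it already exists
hadoop fs -put -f localfile.txt /user/data/localfile.txt
```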

Can we overwrite a file in HDFS?

@user007: You can delete the existing file and create a new one with the same name, which effectively overwrites it. In some cases you can also append data, but only at the end of the file. Append is only available in Hadoop versions that include it, and it is required for HBase and other frameworks.
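The delete-then-recreate approach described above can be sketched as follows (example paths, assuming a standard hadoop fs shell):

```shell
# Remove the old copy, then upload the replacement under the same name
hadoop fs -rm /user/data/file.txt
hadoop fs -copyFromLocal file.txt /user/data/file.txt
```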

How do I delete old files in HDFS?

rm: removes a file from HDFS, similar to the Unix rm command. This command does not delete directories; for a recursive delete, use -rm -r.
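For example (hypothetical paths):

```shell
hadoop fs -rm /user/data/old.txt       # delete a single file
hadoop fs -rm -r /user/data/old_dir    # delete a directory and its contents
```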

What is fs in Hadoop command?

Hadoop file system (fs) shell commands are used to perform various file operations, such as copying a file, viewing its contents, changing ownership of files, changing permissions, and creating directories.
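A few common fs shell invocations illustrating the operations mentioned above (paths and user/group names are examples):

```shell
hadoop fs -ls /user/data                           # list a directory
hadoop fs -cat /user/data/file.txt                 # view file contents
hadoop fs -chown alice:hadoop /user/data/file.txt  # change ownership
hadoop fs -mkdir -p /user/data/new_dir             # create a directory
```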


1 Answer

hadoop fs -copyFromLocal -f $LOCAL_MOUNT_SRC_PATH/yourfilename.txt your_hdfs_file-path

So the -f option does the trick for you. It works for -copyToLocal as well.
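Applied to the original question, a sketch of -copyToLocal with -f (example paths, not from the question):

```shell
# Overwrite the local copy if it already exists
hadoop fs -copyToLocal -f /user/data/file.txt /tmp/file.txt
```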

Arijit Sen answered Sep 22 '22 03:09