I am using
hdfs dfs -put myfile mypath
and for some files I get
put: 'myfile': File Exists
Thanks!
A file with the same name exists at the location you're trying to write to. You can overwrite by specifying the -f flag.
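For example, reusing the paths from the question:
hdfs dfs -put -f myfile mypath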
hadoop fs: used when we are dealing with different file systems such as the local FS, HDFS, etc.
hdfs dfs: used for operations related to HDFS.
hadoop dfs: this command should not be used, as it is deprecated. Even if you use it, it will forward the command to hdfs dfs.
fs is used for generic file systems and can point to any file system such as the local file system, HDFS, WebHDFS, S3, etc. dfs points to the Distributed File System and is specific to HDFS; you can use it to execute operations on HDFS. hadoop dfs is now deprecated, so you have to use hdfs dfs instead.
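For example, both of these commands list the same HDFS directory (/user/hadoop is just a placeholder path):
hadoop fs -ls /user/hadoop
hdfs dfs -ls /user/hadoop
The fs form can also take explicit URIs for other file systems, e.g. hadoop fs -ls file:///tmp for the local file system.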
You cannot modify data once it is stored in HDFS, because HDFS follows a Write Once Read Many model. You can only append to data that is already stored in HDFS.
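For example, appending a local file to an existing HDFS file (the paths are placeholders, and append must be enabled on the cluster):
hdfs dfs -appendToFile localpart /user/hadoop/existingfile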
put: 'myfile': File Exists
means the file named "myfile" already exists in HDFS. You cannot have multiple files with the same name in the same HDFS directory.
You can overwrite it using hadoop fs -put -f /path_to_local /path_to_hdfs
You can overwrite your file in HDFS using the -f flag. For example:
hadoop fs -put -f <localfile> <hdfsDir>
OR
hadoop fs -copyFromLocal -f <localfile> <hdfsDir>
It worked fine for me. However, the -f flag won't work with the get or copyToLocal commands. Check this question.
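If you need to refresh a local copy, a simple workaround is to delete the local file before the get (both paths below are placeholders):
rm -f localcopy
hdfs dfs -get /path_to_hdfs/myfile localcopy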