
HDFS error: target already exists

Tags:

hadoop

I am very new to Hadoop. When I try to execute this command it says that the target already exists. How do I remove this file from hadoop? Is that the same as removing the target?

me$ hdfs dfs -copyFromLocal myfile.txt input/myfile.txt

copyFromLocal: Target input/myfile.txt already exists
asked Nov 09 '13 by bernie2436

1 Answer

You don't have to remove the existing file first and then copy the new one. You can overwrite it in a single step by passing the -f (force) option to -copyFromLocal:

hadoop fs -copyFromLocal -f myfile.txt input/myfile.txt
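To answer the removal part of the question directly: yes, you can also delete the existing target explicitly with `hadoop fs -rm` and then re-run the copy. A minimal two-step sketch, using the same paths as the question (it assumes a running HDFS cluster and that `input/myfile.txt` exists):

```shell
# Two-step alternative: remove the existing target, then copy the local file in
hadoop fs -rm input/myfile.txt
hadoop fs -copyFromLocal myfile.txt input/myfile.txt
```

The single-step -f form above is simpler for overwrites, but -rm is the command to reach for when you only want to delete a file from HDFS without replacing it.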
answered Nov 10 '22 by Charity Leschinski