I would like to edit a text file directly in HDFS using vi, without having to copy it to the local filesystem, edit it, and then copy it back. Is this possible?
Edit: This used to be possible in Cloudera's Hue UI but is no longer the case.
You cannot modify data once it is stored in HDFS, because HDFS follows a write-once-read-many model. You can only append to data that is already stored in HDFS.
You can read any file with the Hadoop filesystem command line; it supports a cat command for printing file contents.
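For example (the paths below are just placeholders), reading and appending from the shell could look like this:
# Print the contents of a file stored in HDFS
hadoop fs -cat /user/alice/data.txt
# Append a local file to an existing HDFS file (the cluster must allow appends)
hadoop fs -appendToFile local-update.txt /user/alice/data.txt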
In HDFS we cannot edit files that are already stored, but we can append data to them by reopening the files.
There are a couple of options you could try that let you mount HDFS on your local machine, so you can use your usual local commands such as cp, rm, cat, mv, mkdir, rmdir, more, etc. However, neither of them supports random writes; they only support appends.
The NFS Gateway uses NFS v3 and supports appending to files, but it cannot perform random writes (see the sketch below).
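As a rough sketch, mounting the gateway on a Linux client usually looks something like the following; the gateway host and mount point are placeholders, and the exact mount options depend on your environment:
# Create a mount point and mount HDFS through the NFS Gateway over NFSv3
sudo mkdir -p /mnt/hdfs
sudo mount -t nfs -o vers=3,proto=tcp,nolock,sync nfs-gateway-host:/ /mnt/hdfs
# Ordinary tools now work against the mount, but writes are append-only
cat /mnt/hdfs/user/alice/data.txt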
And regarding your comment on Hue: Hue is probably downloading the file to a local buffer and, after you finish editing, replacing the original file in HDFS.
A simple way is to copy the file from HDFS to local, edit it there, and copy it back; the hvim script below automates this.
hvim <filename>
Source code of hvim
#!/bin/bash
# Dump the HDFS file to a local temp file and open it in vim
hadoop fs -text "$1" > hvim.txt
vim hvim.txt
# Replace the original; note the old file is deleted permanently (skips trash)
hadoop fs -rm -skipTrash "$1"
hadoop fs -copyFromLocal hvim.txt "$1"
rm hvim.txt
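To try it, you might save the script as hvim, make it executable, and put it somewhere on your PATH; the install location and file name below are just examples:
chmod +x hvim
sudo mv hvim /usr/local/bin/
hvim /user/alice/notes.txt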