Is it possible to write text from the command line into Hadoop?
I'm trying to do something similar to the Unix write/append-to-file redirection:
echo "hello world" > hello_world.txt
In Hadoop land I would expect this to work, but the commands do not.
hadoop fs -appendToFile "foo bar" /dir/hadoop/hello_world.txt
hadoop fs -put "hello world" /dir/hadoop/hello_world.txt
To write a file to HDFS, a client first contacts the master, i.e. the NameNode. The NameNode returns the addresses of the DataNodes (the workers) to which the client should write. The client then writes data directly to those DataNodes, which replicate it among themselves through a write pipeline.
Step 1: Make a directory in HDFS where you want to copy the file. Step 2: Use the copyFromLocal command to copy the file into that HDFS directory. Step 3: Check whether the file was copied successfully by listing the directory.
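The three steps above can be sketched as follows; the directory name /Hadoop_File and the local file sample.txt are example names, not from the original:

```shell
# Step 1: create the target directory in HDFS
hadoop fs -mkdir /Hadoop_File

# Step 2: copy a local file into the HDFS directory
hadoop fs -copyFromLocal sample.txt /Hadoop_File

# Step 3: list the directory to confirm the copy succeeded
hadoop fs -ls /Hadoop_File
```

These commands assume a running HDFS cluster and a hadoop client configured to reach it.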
To copy a file from the local file system to HDFS, use hadoop fs -put or hdfs dfs -put, giving the local source path you want to copy from followed by the HDFS destination path you want to copy to. If the file already exists on HDFS, the command fails with a "File already exists" error.
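A minimal sketch of the put command, assuming a local file sample.txt and an HDFS directory /dir/hadoop (both example names):

```shell
# Upload a local file to HDFS; fails if the destination already exists
hadoop fs -put sample.txt /dir/hadoop/sample.txt

# The hdfs dfs form is equivalent
hdfs dfs -put sample.txt /dir/hadoop/sample.txt
```

To overwrite an existing destination instead of erroring, put accepts the -f flag.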
The Hadoop documentation states that both appendToFile and put can read from stdin when "-" is given as the source, so you can pipe text straight into HDFS:
echo "hello world" | hadoop fs -appendToFile - /dir/hadoop/hello_world.txt
echo "hello world" | hadoop fs -put - /dir/hadoop/hello_world.txt