Is there any way to list only directories and their subdirectories recursively with the Hadoop command line?
I was wondering if there is some kind of command similar to the Unix command:
find /tmp -type d -print
The following options are available with the hadoop ls command:

Usage: hadoop fs -ls [-d] [-h] [-R] [-t] [-S] [-r] [-u] <args>

Options:
-d: Directories are listed as plain files.
-h: Format file sizes in a human-readable fashion (e.g. 64.0m instead of 67108864).
-R: Recursively list subdirectories encountered.
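In the recursive listing produced by -R, each directory entry starts with a 'd' in the permissions column, which is what makes directories easy to filter out. A sketch of what the output looks like (owner, sizes, timestamps and paths here are purely illustrative):

hadoop fs -ls -R /tmp
drwxr-xr-x   - alice supergroup          0 2023-01-15 10:30 /tmp/logs
-rw-r--r--   3 alice supergroup       1024 2023-01-15 10:31 /tmp/logs/app.log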
To browse the HDFS file system in the HDFS NameNode UI, select Utilities > Browse the file system. On the Browse Directory page, enter the directory path and click Go!.
Use the hdfs dfs -ls command to list files in Hadoop archives: run it with the archive's location as the argument.
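For example, assuming an archive at the hypothetical path /user/alice/data.har, its contents can be listed through the har:// URI scheme:

hdfs dfs -ls har:///user/alice/data.har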
The ls command can also be used to check which directories exist in HDFS. Hadoop HDFS mkdir command: this command creates a directory in HDFS if it does not already exist. Note: if the directory already exists in HDFS, the command fails with a "file already exists" error.
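A minimal mkdir example (the path is hypothetical); the -p flag creates any missing parent directories and does not fail if the directory already exists:

hdfs dfs -mkdir -p /user/alice/reports/2023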
Try this as well (it lists directories recursively, starting from the root):
hadoop fs -ls -R / | grep "^d"
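If you only want the directory paths themselves (and the paths contain no spaces), the metadata columns can be stripped off with awk:

hadoop fs -ls -R / | grep "^d" | awk '{print $NF}'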
Try using this command:
hdfs dfs -ls hdfs:/
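Here hdfs:/ resolves to the root of the default filesystem configured in fs.defaultFS. An explicit NameNode URI can also be given (the host and port below are placeholders):

hdfs dfs -ls -R hdfs://namenode.example.com:8020/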