
Get list of files from hdfs (hadoop) directory using python script

How do I get a list of files from an HDFS (Hadoop) directory using a Python script?

I have tried the following line:

dir = sc.textFile("hdfs://127.0.0.1:1900/directory").collect()

The directory contains the files "file1,file2,file3....fileN". With the line above I only get the contents of those files, but I need the list of file names.

Can anyone please help me solve this problem?

Thanks in advance.

asked Nov 08 '22 by sara


2 Answers

Use subprocess

import subprocess

# List the directory with the hdfs CLI and keep only the path column ($8)
p = subprocess.Popen("hdfs dfs -ls <HDFS Location> | awk '{print $8}'",
    shell=True,
    stdout=subprocess.PIPE,
    stderr=subprocess.STDOUT)

# stdout yields byte strings, so decode before printing
for line in p.stdout.readlines():
    print(line.decode().strip())
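If you prefer to avoid the awk dependency, the -ls output can also be parsed in Python itself. This is only a minimal sketch, not part of the original answer; the hdfs_path value is a placeholder, and it assumes the hdfs client is on the PATH:

import subprocess

# Hypothetical HDFS directory; replace with your own location
hdfs_path = "hdfs://127.0.0.1:1900/directory"

# Capture 'hdfs dfs -ls' output and keep the last column of each entry
# line (the full file path), skipping the "Found N items" header.
out = subprocess.check_output(["hdfs", "dfs", "-ls", hdfs_path])
file_names = [
    line.split()[-1]
    for line in out.decode().splitlines()
    if line and not line.startswith("Found")
]
print(file_names)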

EDIT: An answer without Python. The first command can also be used to recursively list all the sub-directories. The trailing redirection can be omitted or changed to suit your requirements.

hdfs dfs -ls -R <HDFS LOCATION> | awk '{print $8}' > output.txt
hdfs dfs -ls <HDFS LOCATION> | awk '{print $8}' > output.txt

EDIT: Corrected a missing quote in the awk command.

answered Nov 15 '22 by Rahul Kadukar


import subprocess

path = "/data"
# Build the hdfs command and keep only the path column ($8)
args = "hdfs dfs -ls " + path + " | awk '{print $8}'"

# Run the command through a shell and capture stdout/stderr
proc = subprocess.Popen(args, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)

s_output, s_err = proc.communicate()
all_dart_dirs = s_output.split()  # stores list of files and sub-directories in 'path'
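Under Python 3, communicate() returns byte strings, so the entries can be decoded before use. This is just an illustrative continuation of the snippet above:

# Decode each byte-string entry and print the file or directory name
for entry in all_dart_dirs:
    print(entry.decode())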
answered Nov 15 '22 by Sanchari Dan