Read files recursively from subdirectories with Spark from S3 or a local filesystem

I am trying to read files from a directory that contains many subdirectories. The data is in S3, and I am trying to read it like this:

val rdd = sc.newAPIHadoopFile(data_loc,
    classOf[org.apache.hadoop.mapreduce.lib.input.TextInputFormat],
    classOf[org.apache.hadoop.io.LongWritable],
    classOf[org.apache.hadoop.io.Text])

This does not seem to work.

I'd appreciate the help.

asked Jan 13 '15 by venuktan

1 Answer

Yes, it works. It took a while to work out the individual blocks/splits, though. The trick is a glob path that matches a specific directory at every level of nesting, for example: s3n://bucket/root_dir/*/data/*/*/*
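
A minimal sketch of the glob approach, assuming a hypothetical layout like s3n://bucket/root_dir/<batch>/data/<yyyy>/<mm>/<dd>/ (the bucket and directory names below are placeholders, only the glob pattern itself comes from the answer):

import org.apache.hadoop.io.{LongWritable, Text}
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat

// Each * matches exactly one directory level, so the glob has to
// mirror the depth of the tree you want to read.
val data_loc = "s3n://bucket/root_dir/*/data/*/*/*"

val rdd = sc.newAPIHadoopFile(data_loc,
    classOf[TextInputFormat],      // input format
    classOf[LongWritable],         // key: byte offset within the file
    classOf[Text])                 // value: the line itself
  .map { case (_, line) => line.toString }

// The same glob also works with the simpler textFile API, and with a
// local path (file:///...) in place of the S3 URI:
val lines = sc.textFile("s3n://bucket/root_dir/*/data/*/*/*")

Note that a glob only matches the levels it spells out; if the subdirectories vary in depth, the glob needs one * per level (or you need multiple reads unioned together).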

answered Sep 17 '22 by venuktan