
Reading a simple Avro file from HDFS

Tags: java, io, avro

I am trying to do a simple read of an Avro file stored in HDFS. I found out how to read it when it is on the local file system:

DatumReader<GenericRecord> datumReader = new GenericDatumReader<GenericRecord>();
FileReader<GenericRecord> fileReader = DataFileReader.openReader(new File(filename), datumReader);

for (GenericRecord datum : fileReader) {
   String value = datum.get(1).toString();
   System.out.println("value = " + value);
}

fileReader.close();

My file is in HDFS, however. I cannot pass openReader a Path or an FSDataInputStream. How can I simply read an Avro file in HDFS?

EDIT: I got this to work by creating a custom class (SeekableHadoopInput) that implements SeekableInput. I "stole" this from "Ganglion" on github. Still, it seems like there should be a built-in Hadoop/Avro integration path for this.
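For reference, such a wrapper can be sketched as below. This is a hedged reconstruction, not Ganglion's actual code: the class body is an assumption based on Avro's SeekableInput interface (read/seek/tell/length/close) and Hadoop's FSDataInputStream.

```java
import java.io.IOException;

import org.apache.avro.file.SeekableInput;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

// Adapts Hadoop's FSDataInputStream to Avro's SeekableInput so that
// DataFileReader.openReader(...) can consume a file stored in HDFS.
public class SeekableHadoopInput implements SeekableInput {
    private final FSDataInputStream stream;
    private final long length;

    public SeekableHadoopInput(Path path, Configuration conf) throws IOException {
        FileSystem fs = path.getFileSystem(conf);
        this.length = fs.getFileStatus(path).getLen(); // file size, needed by length()
        this.stream = fs.open(path);
    }

    @Override
    public long length() throws IOException {
        return length;
    }

    @Override
    public int read(byte[] b, int off, int len) throws IOException {
        return stream.read(b, off, len);
    }

    @Override
    public void seek(long p) throws IOException {
        stream.seek(p);
    }

    @Override
    public long tell() throws IOException {
        return stream.getPos();
    }

    @Override
    public void close() throws IOException {
        stream.close();
    }
}
```

With this in place, `DataFileReader.openReader(new SeekableHadoopInput(path, conf), new GenericDatumReader<GenericRecord>())` works against HDFS paths. (Compiling this requires the Hadoop and Avro client jars on the classpath.)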

Thanks

Wanderer asked Jul 24 '12

1 Answer

The FsInput class (in the avro-mapred submodule, since it depends on Hadoop) can do this. It provides the seekable input stream that is needed for Avro data files.
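To use it, the avro-mapred artifact (plus the Hadoop client jars) must be on the classpath; a Maven sketch, with an illustrative version number:

```xml
<dependency>
  <groupId>org.apache.avro</groupId>
  <artifactId>avro-mapred</artifactId>
  <!-- illustrative; use the version matching your Avro release -->
  <version>1.11.3</version>
</dependency>
```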

Path path = new Path("/path/on/hdfs");
Configuration config = new Configuration(); // make this your Hadoop env config
SeekableInput input = new FsInput(path, config);
DatumReader<GenericRecord> reader = new GenericDatumReader<GenericRecord>();
FileReader<GenericRecord> fileReader = DataFileReader.openReader(input, reader);

for (GenericRecord datum : fileReader) {
    System.out.println("value = " + datum);
}

fileReader.close(); // also closes underlying FsInput
Martin Kleppmann answered Oct 20 '22