 

Is it possible to run hadoop fs -getmerge in S3?

I have an Elastic MapReduce job which writes some files to S3, and I want to concatenate all of them into a single text file.

Currently I'm manually copying the folder with all the files to our HDFS (hadoop fs -copyFromLocal), then running hadoop fs -getmerge and hadoop fs -copyToLocal to obtain the merged file.
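
Roughly, the manual round-trip looks like this (paths are purely illustrative, and the S3 output is pulled down to local disk first with whatever tool is handy, e.g. s3cmd or the AWS CLI):

    # illustrative paths only
    hadoop fs -copyFromLocal ./job-output /tmp/job-output   # local disk -> HDFS
    hadoop fs -getmerge /tmp/job-output merged.txt          # HDFS parts -> one local file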

Is there any way to use hadoop fs directly on S3?

asked Nov 04 '22 by yeforriak

2 Answers

Actually, this response about getmerge is incorrect: getmerge expects a local destination and will not work with S3. If you try, it throws an IOException and responds with -getmerge: Wrong FS:.

Usage:

hadoop fs [generic options] -getmerge [-nl] <src> <localdst>
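
That said, the restriction is on the destination: if the cluster's S3 credentials are configured, pointing the source at an s3n:// path and the destination at a local path should work. A minimal sketch, with a made-up bucket name and paths:

    # made-up bucket and paths; the destination must be a local path
    hadoop fs -getmerge s3n://my-bucket/job-output/ merged.txt

    # whereas a non-local destination is what triggers "Wrong FS":
    # hadoop fs -getmerge s3n://my-bucket/job-output/ s3n://my-bucket/merged.txt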

answered Nov 08 '22 by Brent Black


An easy way (if you are generating a small file that fits on the master machine) is to do the following:

  1. Merge the file parts into a single file on the local machine (Documentation)

    hadoop fs -getmerge hdfs://[FILE] [LOCAL FILE]
    
  2. Copy the result file to S3, and then delete the local file (Documentation)

    hadoop fs -moveFromLocal [LOCAL FILE] s3n://bucket/key/of/file
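
Put together, with placeholder paths and bucket name, the whole round-trip on the master node is just:

    # placeholders only: adjust the HDFS output dir and the S3 bucket/key
    hadoop fs -getmerge /user/hadoop/job-output merged.txt
    hadoop fs -moveFromLocal merged.txt s3n://my-bucket/results/merged.txt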
    
answered Nov 08 '22 by justderb