 

Hadoop distcp to S3 behind HTTP proxy

I'm trying to use distcp to copy some files from HDFS to Amazon S3. My Hadoop cluster connects to the internet through an HTTP proxy, but I can't figure out how to tell distcp about it when connecting to S3. I'm currently getting this error:

httpclient.HttpMethodDirector: I/O exception (org.apache.commons.httpclient.ConnectTimeoutException) caught when processing request: The host did not accept the connection within timeout of 60000 ms

This suggests it's trying to connect directly to Amazon. How do I get distcp to use the proxy host?

asked Nov 11 '22 by growse

1 Answer

I'm posting another answer here because this is the first Stack Overflow question that comes up on Google when searching for "hdfs s3 proxy", and in my opinion the existing answer is not the best approach.

Configuring S3 access for HDFS is best done in the hdfs-site.xml file on each node. That way it works not only for distcp (copying from HDFS to S3 and the other way around) but also for Impala and potentially other Hadoop components that can use S3.

So, add the following properties to your hdfs-site.xml:

<!-- AWS credentials for the S3A filesystem -->
<property>
  <name>fs.s3a.access.key</name>
  <value>your_access_key</value>
</property>
<property>
  <name>fs.s3a.secret.key</name>
  <value>your_secret_key</value>
</property>
<!-- HTTP proxy the S3A connector should use for outbound connections -->
<property>
  <name>fs.s3a.proxy.host</name>
  <value>your_proxy_host</value>
</property>
<property>
  <name>fs.s3a.proxy.port</name>
  <value>your_proxy_port</value>
</property>
answered Nov 15 '22 by loicmathieu