I would like to run spark-shell with an external package behind a corporate proxy. Unfortunately, external packages passed via the --packages option are not resolved.
E.g., when running
bin/spark-shell --packages datastax:spark-cassandra-connector:1.5.0-s_2.10
the Cassandra connector package is not resolved (the output gets stuck at the last line):
Ivy Default Cache set to: /root/.ivy2/cache
The jars for the packages stored in: /root/.ivy2/jars
:: loading settings :: url = jar:file:/opt/spark/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
datastax#spark-cassandra-connector added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0
confs: [default]
After some time the connection times out and errors like the following appear:
:::: ERRORS
Server access error at url https://repo1.maven.org/maven2/datastax/spark-cassandra-connector/1.5.0-s_2.10/spark-cassandra-connector-1.5.0-s_2.10.pom (java.net.ConnectException: Connection timed out)
When I deactivate the VPN with the corporate proxy, the package is resolved and downloaded immediately.
What I have tried so far:
Exposing the proxies as environment variables:
export http_proxy=<proxyHost>:<proxyPort>
export https_proxy=<proxyHost>:<proxyPort>
export JAVA_OPTS="-Dhttp.proxyHost=<proxyHost> -Dhttp.proxyPort=<proxyPort>"
export ANT_OPTS="-Dhttp.proxyHost=<proxyHost> -Dhttp.proxyPort=<proxyPort>"
Running spark-shell with extra Java options:
bin/spark-shell --conf "spark.driver.extraJavaOptions=-Dhttp.proxyHost=<proxyHost> -Dhttp.proxyPort=<proxyPort>" --conf "spark.executor.extraJavaOptions=-Dhttp.proxyHost=<proxyHost> -Dhttp.proxyPort=<proxyPort>" --packages datastax:spark-cassandra-connector:1.6.0-M1-s_2.10
Is there some other configuration possibility I am missing?
I found the correct settings:
bin/spark-shell --conf "spark.driver.extraJavaOptions=-Dhttp.proxyHost=<proxyHost> -Dhttp.proxyPort=<proxyPort> -Dhttps.proxyHost=<proxyHost> -Dhttps.proxyPort=<proxyPort>" --packages <somePackage>
Both the HTTP and the HTTPS proxy have to be set as extra driver Java options. JAVA_OPTS does not seem to have any effect.
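If you do not want to pass these options on every invocation, the same property can be set persistently in conf/spark-defaults.conf (a minimal sketch; <proxyHost> and <proxyPort> are placeholders for your proxy):
spark.driver.extraJavaOptions  -Dhttp.proxyHost=<proxyHost> -Dhttp.proxyPort=<proxyPort> -Dhttps.proxyHost=<proxyHost> -Dhttps.proxyPort=<proxyPort>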
If the proxy is correctly configured at the OS level, you can use the Java property java.net.useSystemProxies:
--conf "spark.driver.extraJavaOptions=-Djava.net.useSystemProxies=true"
so that the proxy host/port and the no-proxy hosts are picked up from the system configuration.
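For example, combined with --packages (<somePackage> is just a placeholder):
bin/spark-shell --conf "spark.driver.extraJavaOptions=-Djava.net.useSystemProxies=true" --packages <somePackage>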
This worked for me in Spark 1.6.1:
bin\spark-shell --driver-java-options "-Dhttp.proxyHost=<proxyHost> -Dhttp.proxyPort=<proxyPort> -Dhttps.proxyHost=<proxyHost> -Dhttps.proxyPort=<proxyPort>" --packages <package>
I was struggling with pyspark until I found this. Adding on to @Tao Huang's answer:
bin/pyspark --driver-java-options="-Dhttp.proxyUser=user -Dhttp.proxyPassword=password -Dhttps.proxyUser=user -Dhttps.proxyPassword=password -Dhttp.proxyHost=proxy -Dhttp.proxyPort=port -Dhttps.proxyHost=proxy -Dhttps.proxyPort=port" --packages [groupId:artifactId]
That is, the properties should be -Dhttp(s).proxyUser instead of ...proxyUsername.
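If you start PySpark from a notebook instead of bin/pyspark, the same options can, as far as I know, be passed through the PYSPARK_SUBMIT_ARGS environment variable (a sketch with the same placeholder values as above; the value has to end with pyspark-shell):
export PYSPARK_SUBMIT_ARGS='--driver-java-options="-Dhttp.proxyHost=proxy -Dhttp.proxyPort=port -Dhttps.proxyHost=proxy -Dhttps.proxyPort=port -Dhttp.proxyUser=user -Dhttp.proxyPassword=password -Dhttps.proxyUser=user -Dhttps.proxyPassword=password" --packages [groupId:artifactId] pyspark-shell'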