"$brew install apache-spark' gets me version 2.3.x. '$brew search apache-spark' and '$brew info apache-spark' do not provide a an option to install a different version. is it possible to get a different version with homebrew?
Run these commands (assuming you have apache-spark already installed via Homebrew):
cd "$(brew --repo homebrew/core)"
git log Formula/apache-spark.rb
E.g., the 2.2.0 version:
...
commit bdf68bd79ebd16a70b7a747e027afbe5831f9cc3
Author: ilovezfs
Date: Tue Jul 11 22:19:12 2017 -0700
apache-spark 2.2.0 (#15507)
....
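If you already know which version you're after, filtering the log can save scrolling (plain git usage; the grep pattern is just an example):

git log --oneline -- Formula/apache-spark.rb | grep "2.2.0"

Then check out that commit on a temporary branch: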
git checkout -b apache-spark-2.2.0 bdf68bd79ebd16a70b7a747e027afbe5831f9cc3
brew unlink apache-spark
HOMEBREW_NO_AUTO_UPDATE=1 brew install apache-spark
(HOMEBREW_NO_AUTO_UPDATE=1 stops brew from auto-updating the tap first, which would otherwise move the formula back to the latest version before installing.)
Cleanup:
git checkout master
git branch -d apache-spark-2.2.0
Check / switch:
brew list apache-spark --versions
brew switch apache-spark 2.2.0
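With both versions installed, the listing should look roughly like this (illustrative output, not copied from a real run):

brew list apache-spark --versions
# -> apache-spark 2.2.0 2.3.1

Note that brew switch has been removed from recent Homebrew releases, so the switch step only works on older Homebrew installations.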
I had the same problem: when installing through Homebrew it could by default only find the apache-spark 2.3.0 formula, and 2.2.0 was not available even in deleted repos.
So I backed up the existing apache-spark.rb (version 2.3.0) from /usr/local/Homebrew/Library/Taps/homebrew/homebrew-core/Formula and then overwrote it with the formula below.
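For the backup itself a plain copy is enough (the destination filename is just an example):

cp /usr/local/Homebrew/Library/Taps/homebrew/homebrew-core/Formula/apache-spark.rb ~/apache-spark-2.3.0.rb.bak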
class ApacheSpark < Formula
  desc "Engine for large-scale data processing"
  homepage "https://spark.apache.org/"
  url "https://www.apache.org/dyn/closer.lua?path=spark/spark-2.2.0/spark-2.2.0-bin-hadoop2.7.tgz"
  version "2.2.0"
  sha256 "97fd2cc58e08975d9c4e4ffa8d7f8012c0ac2792bcd9945ce2a561cf937aebcc"
  head "https://github.com/apache/spark.git"

  bottle :unneeded

  def install
    # Rename beeline to distinguish it from hive's beeline
    mv "bin/beeline", "bin/spark-beeline"
    rm_f Dir["bin/*.cmd"]
    libexec.install Dir["*"]
    bin.write_exec_script Dir["#{libexec}/bin/*"]
  end

  test do
    assert_match "Long = 1000", pipe_output(bin/"spark-shell", "sc.parallelize(1 to 1000).count()")
  end
end
Then I followed the process above to re-install, which left me with both 2.2.0 and 2.3.0 and the ability to switch between them.
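Concretely, that re-install is the same sequence as in the first answer (a sketch, assuming the edited formula is now in place):

brew unlink apache-spark
HOMEBREW_NO_AUTO_UPDATE=1 brew install apache-spark
brew switch apache-spark 2.2.0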
Hope it helps.
I needed to install Apache Spark version 2.4.0 specifically on my MacBook. It is no longer available in the Brew listing, but you can still make it work.
Install the latest Spark with brew install apache-spark. Let's say it installed apache-spark 3.0.1.
Once that completes, run brew edit apache-spark and edit the apache-spark.rb as follows:
class ApacheSpark < Formula
  desc "Engine for large-scale data processing"
  homepage "https://spark.apache.org/"
  url "https://archive.apache.org/dist/spark/spark-2.4.0/spark-2.4.0-bin-hadoop2.7.tgz"
  mirror "https://archive.apache.org/dist/spark/spark-2.4.0/spark-2.4.0-bin-hadoop2.7.tgz"
  version "2.4.0"
  sha256 "c93c096c8d64062345b26b34c85127a6848cff95a4bb829333a06b83222a5cfa"
  license "Apache-2.0"
  head "https://github.com/apache/spark.git"

  bottle :unneeded

  depends_on "openjdk@8"

  def install
    # Rename beeline to distinguish it from hive's beeline
    mv "bin/beeline", "bin/spark-beeline"
    rm_f Dir["bin/*.cmd"]
    libexec.install Dir["*"]
    bin.install Dir[libexec/"bin/*"]
    bin.env_script_all_files(libexec/"bin", JAVA_HOME: Formula["openjdk@8"].opt_prefix)
  end

  test do
    assert_match "Long = 1000",
      pipe_output(bin/"spark-shell --conf spark.driver.bindAddress=127.0.0.1",
                  "sc.parallelize(1 to 1000).count()")
  end
end
Now uninstall Spark again using brew uninstall apache-spark
Install it again using brew install apache-spark
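To stop a later brew upgrade from replacing the edited formula's install with the current version, pinning the formula may help (standard Homebrew command):

brew pin apache-spark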
Result:
% spark-shell
2021-02-09 19:27:11 WARN NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Spark context Web UI available at http://192.168.0.17:4040
Spark context available as 'sc' (master = local[*], app id = local-1612927640472).
Spark session available as 'spark'.
Welcome to
____ __
/ __/__ ___ _____/ /__
_\ \/ _ \/ _ `/ __/ '_/
/___/ .__/\_,_/_/ /_/\_\ version 2.4.0
/_/
Using Scala version 2.11.12 (OpenJDK 64-Bit Server VM, Java 1.8.0_282)
Type in expressions to have them evaluated.
Type :help for more information.
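To confirm which version is on the PATH without starting a full shell, Spark's submit script can also report it (standard Spark CLI flag):

spark-submit --version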