What's the difference between the Spark download package types: 1) pre-built for Hadoop 2.6.0 and later, and 2) source code (can be built against several Hadoop versions)? Can I install the pre-built for Hadoop 2.6.0 and later package but work without using Hadoop, HDFS, or HBase?
PS: Hadoop 2.6.0 is already installed on my machine.
The previous answer only addressed Q1, so I'm writing this one. The answer to your Q2 is yes: you can work with Spark without any Hadoop components installed, even if you use a Spark build prebuilt for a specific Hadoop version. Spark will throw a bunch of errors while starting up the master/workers, which you (and Spark) can blissfully ignore as long as you see them up and running. For applications, it's never a problem.
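For example, here is a minimal sketch of a job that runs entirely in local mode with no HDFS, YARN, or HBase involved (the object and app names are just placeholders; this assumes the prebuilt Spark jars are on your classpath):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Sketch: Spark prebuilt for Hadoop 2.6.0, but no Hadoop daemons running.
// "local[*]" runs everything in-process using all available cores,
// so no cluster services are contacted at all.
object NoHadoopDemo {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("no-hadoop-demo") // hypothetical app name
      .setMaster("local[*]")
    val sc = new SparkContext(conf)

    // Plain in-memory data; reads/writes would use the local
    // filesystem (file:// paths) instead of hdfs:// URIs.
    val data = sc.parallelize(1 to 100)
    println(s"Sum: ${data.sum()}")

    sc.stop()
  }
}
```

The bundled Hadoop client libraries are still on the classpath (Spark uses them for things like filesystem abstraction), but nothing in this program requires a running Hadoop cluster.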