I am trying to run the following simple code using Spark within Eclipse:
import org.apache.spark.sql.SQLContext
import org.apache.spark.SparkConf
import org.apache.spark.SparkContext

object jsonreader {
  def main(args: Array[String]): Unit = {
    println("Hello, world!")
    val conf = new SparkConf()
      .setAppName("TestJsonReader")
      .setMaster("local")
      .set("spark.driver.memory", "3g")
    val sc = new SparkContext(conf)
    val sqlContext = new SQLContext(sc)
    val df = sqlContext.read.format("json").load("text.json")
    df.printSchema()
    df.show()
  }
}
However, I get the following error:
16/08/18 18:05:28 ERROR SparkContext: Error initializing SparkContext.
java.lang.IllegalArgumentException: System memory 259522560 must be at least 471859200. Please increase heap size using the --driver-memory option or spark.driver.memory in Spark configuration.
I followed different tutorials, such as How to set Apache Spark Executor memory. Most of the time the suggested fix is either the --driver-memory option (which is not possible within Eclipse) or modifying the Spark configuration file, but no such file exists in my setup.
Does anyone have any idea how to solve this issue within the Eclipse environment?
On the Eclipse menu, click Run -> Run Configurations..., select the Java application you want to run, click on the Arguments tab, and in the VM arguments section set a larger Java maximum heap size.
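For example, given the 471859200-byte minimum mentioned in the error, a VM argument along these lines should be enough (the exact value is just an assumption; tune it to your workload):

-Xmx2g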
You can also resolve memory pressure by reducing the partition size: increase the value of spark.sql.shuffle.partitions so that each shuffle partition holds less data.
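For instance, with the sqlContext from the question, this could be set like so (the value 300 is purely illustrative; Spark's default is 200):

sqlContext.setConf("spark.sql.shuffle.partitions", "300")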
Now the maximum Java heap space for Eclipse is set to 512 megabytes. If you still see an OutOfMemoryError in Eclipse, you probably need to investigate further which plugin, feature, or project is creating the problem. Disabling some plugins or projects can help free some memory in Eclipse.
To measure the memory usage of Eclipse, go to Window > Preferences > General, enable Show heap status, and click OK.
In Eclipse, go to Run > Run Configurations... > Arguments > VM arguments and set a maximum heap size such as -Xmx512m.
I had this issue as well and this is how I solved it. Thought it might be helpful.
val conf: SparkConf = new SparkConf()
  .setMaster("local[4]")
  .setAppName("TestJsonReader")
  .set("spark.driver.host", "localhost")
// spark.testing.memory (in bytes) overrides the system memory Spark sees,
// bypassing the 471859200-byte minimum check at SparkContext startup.
conf.set("spark.testing.memory", "2147480000")