I installed Spark and, when trying to run it, I get this error: WARN ProcfsMetricsGetter: Exception when trying to compute pagesize, as a result reporting of ProcessTree metrics is stopped
Can someone help me with that?
I received this same message running Spark 3.0.1 on Windows 10 with Scala 2.12.10. It's not actually an error, in the sense that it ends your program's execution; it's a warning related to /proc file systems on Linux machines.
If you are also on a Windows machine, the answer may be, to quote Wing Yew Poon @ Apache: "The warning happened because the command "getconf PAGESIZE" was run and it is not a valid command on Windows, so an exception was caught." (From the Spark JIRA issue here.)
If your program fails right after this warning appears, it is failing for some other reason. In my case, Spark was crashing right after this warning with:
20/11/13 12:41:51 ERROR MicroBatchExecution: Query [id = 32320bc7-d7ba-49b4-8a56-1166a4f2d6db, runId = d7cc93c2-41ef-4765-aecd-9cd453c25905] terminated with error
org.apache.spark.SparkException: Job 1 cancelled because SparkContext was shut down
This warning can be hidden by setting spark.executor.processTreeMetrics.enabled to false. To quote Mr. Poon again, "it is a minor bug that you see this warning. But it can be safely ignored."
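If you build your session programmatically, the config can be set when constructing the SparkSession. Here is a minimal sketch; the app name is a placeholder, but the config key is the one named above and SparkSession.builder() is standard Spark SQL API:

    import org.apache.spark.sql.SparkSession

    // Disable process tree metrics so the ProcfsMetricsGetter warning is not logged
    val spark = SparkSession.builder()
      .appName("MyApp") // placeholder name for illustration
      .config("spark.executor.processTreeMetrics.enabled", "false")
      .getOrCreate()

The same setting can also be passed on the command line, e.g. spark-submit --conf spark.executor.processTreeMetrics.enabled=false, so you don't have to change application code.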