Whenever I find some Scala / Spark code online, I want to paste it directly into spark-shell to try it out. (I am using spark-shell with Spark 1.6 on both CentOS and Mac OS.)
Generally this works well, but I always run into problems when a line starts with a dot / period (indicating a chained method call). If I move the dot to the end of the previous line, it works.
Example: Here is some code I found online:
val paramMap = ParamMap(lr.maxIter -> 20)
.put(lr.maxIter, 30)
.put(lr.regParam -> 0.1, lr.threshold -> 0.55)
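(For reference, this snippet appears to come from the Spark ML pipeline example, where lr is a LogisticRegression estimator. A minimal self-contained version, with the imports and the lr setup being my assumptions based on that example, would be:
import org.apache.spark.ml.classification.LogisticRegression
import org.apache.spark.ml.param.ParamMap

// assumed setup: an estimator whose params the ParamMap refers to
val lr = new LogisticRegression()

val paramMap = ParamMap(lr.maxIter -> 20)
  .put(lr.maxIter, 30)                            // overwrites the previous maxIter
  .put(lr.regParam -> 0.1, lr.threshold -> 0.55)  // adds two more params
This compiles fine as a script; the problem below only appears when feeding it to the REPL line by line.)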
So when I paste this directly into spark-shell, I see these errors:
scala> val paramMap = ParamMap(lr.maxIter -> 20)
paramMap: org.apache.spark.ml.param.ParamMap =
{
logreg_d63b85553548-maxIter: 20
}
scala> .put(lr.maxIter, 30)
<console>:1: error: illegal start of definition
.put(lr.maxIter, 30)
^
scala> .put(lr.regParam -> 0.1, lr.threshold -> 0.55)
<console>:1: error: illegal start of definition
.put(lr.regParam -> 0.1, lr.threshold -> 0.55)
^
However, when I instead move each dot to the end of the previous line, everything works:
scala> val paramMap = ParamMap(lr.maxIter -> 20).
| put(lr.maxIter, 30).
| put(lr.regParam -> 0.1, lr.threshold -> 0.55)
paramMap: org.apache.spark.ml.param.ParamMap =
{
logreg_d63b85553548-maxIter: 30,
logreg_d63b85553548-regParam: 0.1,
logreg_d63b85553548-threshold: 0.55
}
Is there a way to configure spark-shell so that it will accept lines that start with a dot (or equivalently, so that it will continue lines even if they don't end in a dot)?
The REPL does accept a line that starts with a dot (it applies the call to the previous result), but there must be no leading whitespace:
scala> "3"
res0: String = 3
scala> .toInt
res1: Int = 3
scala> "3"
res2: String = 3
scala> .toInt
<console>:1: error: illegal start of definition
.toInt
^
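As an aside, not a configuration option but a common workaround: the REPL's :paste mode reads the whole block before compiling it, so lines that start with a dot parse fine regardless of whitespace. Pasting the original snippet that way (again assuming an lr estimator is in scope) would look like:
scala> :paste
// Entering paste mode (ctrl-D to finish)

val paramMap = ParamMap(lr.maxIter -> 20)
  .put(lr.maxIter, 30)
  .put(lr.regParam -> 0.1, lr.threshold -> 0.55)

// Exiting paste mode, now interpreting.

paramMap: org.apache.spark.ml.param.ParamMap =
{
logreg_d63b85553548-maxIter: 30,
logreg_d63b85553548-regParam: 0.1,
logreg_d63b85553548-threshold: 0.55
}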
PS: Arguably the REPL should ignore leading whitespace when a dot is detected. A JIRA ticket has been filed on that concern.