Is it possible to pass a variable from the Spark interpreter (pyspark or sql) to Markdown? The requirement is to display nicely formatted text (i.e. Markdown) such as "20 events occurred between 2017-01-01 and 2017-01-08", where 20, 2017-01-01 and 2017-01-08 are populated dynamically from the output of other paragraphs.
Posting this for the benefit of other users; this is what I have been able to find:
(First paragraph)
%spark
// create data frame
val eventLogDF = ...
// register temp table for SQL access
eventLogDF.registerTempTable( "eventlog" )
// run the aggregation and take the single result Row
val row = sqlContext.sql( "select max(Date), min(Date), count(*) from eventlog" ).take(1)(0)
val maxDate = row(0).toString()
val minDate = row(1).toString()
val evCount = row(2).toString()
// bind the values so they can be read from the %angular interpreter
z.angularBind( "maxDate", maxDate )
z.angularBind( "minDate", minDate )
z.angularBind( "evCount", evCount )
(Second paragraph)
%angular
<div>There were <b>{{evCount}} events</b> between <b>{{minDate}}</b> and <b>{{maxDate}}</b>.</div>
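Since the question also mentions pyspark, the same binding can be done from a %pyspark paragraph, and the %angular paragraph above then works unchanged. This is only a sketch under a few assumptions: Spark 2.x (so the spark session and createOrReplaceTempView are available) and that Zeppelin's z object exposes angularBind in pyspark as it does in Scala; the DataFrame name event_log_df is a placeholder, not part of the original answer.
%pyspark
# create the data frame (placeholder, as in the Scala version)
event_log_df = ...
# register a temp view for SQL access
event_log_df.createOrReplaceTempView( "eventlog" )
# run the aggregation and take the single result Row
row = spark.sql( "select max(Date), min(Date), count(*) from eventlog" ).take(1)[0]
# bind the values so they can be read from the %angular paragraph
z.angularBind( "maxDate", str(row[0]) )
z.angularBind( "minDate", str(row[1]) )
z.angularBind( "evCount", str(row[2]) )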