 

Livy Server: return a dataframe as JSON?

I am executing a statement on a Livy server via an HTTP POST call to localhost:8998/sessions/0/statements, with the following body:

{
  "code": "spark.sql(\"select * from test_table limit 10\")"
}
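
For reference, this is roughly how I submit the statement from Python (a minimal sketch; it assumes the Livy server runs on localhost:8998 and that the interactive session with id 0 already exists):

import json
import requests

headers = {"Content-Type": "application/json"}
payload = {"code": 'spark.sql("select * from test_table limit 10")'}

# Submit the statement to the existing interactive session with id 0
r = requests.post("http://localhost:8998/sessions/0/statements",
                  data=json.dumps(payload), headers=headers)
print(r.json())  # contains the statement id and, later, its output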

I would like a response in the following format:

(...)
"data": {
  "application/json": "[
    {"id": "123", "init_date": 1481649345, ...},
    {"id": "133", "init_date": 1481649333, ...},
    {"id": "155", "init_date": 1481642153, ...},
  ]"
}
(...)

but what I'm getting is

(...)
"data": {
  "text/plain": "res0: org.apache.spark.sql.DataFrame = [id: string, init_date: timestamp ... 64 more fields]"
}
(...)

which is just the toString() representation of the DataFrame.

Is there some way to return a dataframe as JSON using the Livy Server?

EDIT

Found a JIRA issue that addresses the problem: https://issues.cloudera.org/browse/LIVY-72

Judging by the comments, it seems that Livy does not and will not support this feature?

asked Dec 13 '16 at 17:12 by matheusr


2 Answers

I recommend using the built-in (albeit sparsely documented) %json and %table magics:

%json

import json
import textwrap

import requests

host = "http://localhost:8998"  # Livy server, as in the question
headers = {"Content-Type": "application/json"}

session_url = host + "/sessions/1"
statements_url = session_url + '/statements'
data = {
        'code': textwrap.dedent("""\
        val d = spark.sql("SELECT COUNT(DISTINCT food_item) FROM food_item_tbl")
        val e = d.collect
        %json e
        """)}
r = requests.post(statements_url, data=json.dumps(data), headers=headers)
print(r.json())
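
The POST only returns the statement metadata; to actually get the rows you still have to poll the statement until it finishes. A rough continuation of the snippet above (the polling loop and field names follow Livy's statements API as I understand it):

import time

# Poll the statement created above until Livy has finished executing it
statement_url = statements_url + "/" + str(r.json()["id"])
while True:
    result = requests.get(statement_url, headers=headers).json()
    if result["state"] in ("available", "error"):
        break
    time.sleep(1)

# With the %json magic, the collected rows should appear under "application/json"
print(result["output"]["data"]["application/json"])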

%table

session_url = host + "/sessions/21"
statements_url = session_url + '/statements'
data = {
        'code': textwrap.dedent("""\
        val x = List((1, "a", 0.12), (3, "b", 0.63))
        %table x
        """)}
r = requests.post(statements_url, data=json.dumps(data), headers=headers)
print(r.json())
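
In both cases the statement's output has to be fetched the same way once it finishes. As far as I recall, %json puts the serialized value under the application/json key (the format asked for in the question), while %table returns a headers-plus-rows structure under a Livy-specific table mimetype.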

Related: Apache Livy: query Spark SQL via REST: possible?

answered Nov 01 '22 at 14:11 by Garren S


I don't have a lot of experience with Livy, but as far as I know this endpoint is used as an interactive shell, so the output is a string containing whatever the shell would display. With that in mind, I can think of a way to emulate the result you want, though it may not be the best way to do it:

{
  "code": "println(spark.sql(\"select * from test_table limit 10\").toJSON.collect.mkString(\"[\", \",\", \"]\"))"
}

You will then have the JSON wrapped in a string, which your client can parse.
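
For example, the client side could look roughly like this (a sketch only; the session id and statement id 0 are placeholders, and it assumes the printed JSON array is the entire text/plain output):

import json
import requests

# Fetch the finished statement; session id 0 and statement id 0 are placeholders
statement = requests.get(
    "http://localhost:8998/sessions/0/statements/0",
    headers={"Content-Type": "application/json"},
).json()

# The printed JSON array arrives as plain text, so parse it on the client
rows = json.loads(statement["output"]["data"]["text/plain"].strip())
print(rows[0]["id"])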

answered Nov 01 '22 at 16:11 by Daniel de Paula