 

How to write a Spark DataFrame to a Neo4j database

I'd like to build this workflow:

  • preprocess some data with Spark, ending with a DataFrame
  • write that DataFrame to Neo4j as a set of nodes

My idea is really basic: write each row of the DataFrame as a node, where each column value becomes a property of that node.

I have seen many articles, including neo4j-spark-connector and "Introducing the Neo4j 3.0 Apache Spark Connector", but they all focus on importing data from a Neo4j database into Spark. So far I haven't been able to find a clear example of writing a Spark DataFrame to a Neo4j database.

Any pointers to documentation or very basic examples would be much appreciated.

asked Oct 27 '25 by user299791

2 Answers

Reading this issue answered my own question.

Long story short: neo4j-spark-connector can write Spark data to a Neo4j database, and yes, the documentation of the new release is lacking.
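
For reference, here is a minimal sketch of what such a write can look like. It uses the DataFrame data source of the current Neo4j Connector for Apache Spark, so the format name and option names are assumptions that may differ in older connector releases; the bolt URL, credentials and the :Person label are placeholders.

    import org.apache.spark.sql.{SaveMode, SparkSession}

    val spark = SparkSession.builder().appName("df-to-neo4j").getOrCreate()

    // Toy DataFrame: one row per node, one column per node property
    val df = spark.createDataFrame(Seq(("Alice", 42), ("Bob", 37))).toDF("name", "age")

    // Each row becomes a :Person node with `name` and `age` properties.
    df.write
      .format("org.neo4j.spark.DataSource")
      .mode(SaveMode.Append)
      .option("url", "bolt://localhost:7687")              // placeholder connection details
      .option("authentication.basic.username", "neo4j")
      .option("authentication.basic.password", "password")
      .option("labels", ":Person")
      .save()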

answered Oct 30 '25 by user299791


You can write your own routine and use the open-source Neo4j Java driver (https://github.com/neo4j/neo4j-java-driver), for example.

Simply serialise the rows of the DataFrame/RDD to JSON (e.g. with toJSON), then use the above driver to create your Neo4j nodes and push them into your Neo4j instance.
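
A rough sketch of that approach, assuming the Neo4j Java driver 4.x package names (the bolt URL and credentials are placeholders, and df is the DataFrame produced by the Spark job):

    import org.neo4j.driver.{AuthTokens, GraphDatabase, Values}

    val uri = "bolt://localhost:7687"   // placeholder connection details
    val user = "neo4j"
    val password = "password"

    // Serialise each row to a JSON string and push it from the executors.
    // One driver/session is opened per partition, not per row.
    df.toJSON.rdd.foreachPartition { rows =>
      val driver = GraphDatabase.driver(uri, AuthTokens.basic(user, password))
      val session = driver.session()
      try {
        rows.foreach { json =>
          // Here the whole row is stored as a single JSON property;
          // in practice you would map columns to individual node properties.
          session.run(
            "CREATE (n:Row) SET n.payload = $json",
            Values.parameters("json", json)
          )
        }
      } finally {
        session.close()
        driver.close()
      }
    }

Creating the driver inside foreachPartition keeps the connection handling on the executors and avoids trying to serialise the driver object from the Spark driver.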

answered Oct 29 '25 by andrew.butkus


