I'm planning to use PostgreSQL as my task meta-info provider, so I want to run a few queries, get some data, and pass it as a filled variable to another task. The problem is that when I use PostgresHook I get the data, but it's inside a Python method that I can't access; in fact I see the line below:
[2021-08-23 13:00:12,628] {python.py:151} INFO - Done. Returned value was: [[1, "inf_account",....]]
Here is part of my code:
import json

from airflow.providers.postgres.hooks.postgres import PostgresHook
from bson import json_util


def _query_postgres(**context):
    """
    Queries Postgres and pushes the results to XCom as a JSON string.
    """
    postgres = PostgresHook(postgres_conn_id="aramis_postgres_connection")
    conn = postgres.get_conn()
    cursor = conn.cursor()
    cursor.execute("SELECT * FROM public.aramis_meta_task;")
    # iterate over the cursor to get a list of rows
    rows = [row for row in cursor]
    # serialize to a JSON string
    details_json_string = json.dumps(rows, default=json_util.default)
    task_instance = context["task_instance"]
    task_instance.xcom_push(key="my_value", value=details_json_string)
    return details_json_string
but I don't know which variable I should use to access it, or how to push it to XCom so that I can use the returned value as a parameter to another BashOperator task (Spark, for example). PostgresOperator, on the other hand, only returns None as a result.
The PostgresOperator does not return any values, so unfortunately you can't use that to pass around data. You'll have to implement your own operator, for which you can indeed use the PostgresHook.
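For illustration, here is a minimal sketch of such an operator, reusing the connection id from your code; the class name PostgresToXComOperator and its defaults are made up for the example, not an existing Airflow class:

import json

from airflow.models import BaseOperator
from airflow.providers.postgres.hooks.postgres import PostgresHook


class PostgresToXComOperator(BaseOperator):
    """Run a SQL statement and return the rows, so Airflow pushes them to XCom."""

    def __init__(self, sql, postgres_conn_id="aramis_postgres_connection", **kwargs):
        super().__init__(**kwargs)
        self.sql = sql
        self.postgres_conn_id = postgres_conn_id

    def execute(self, context):
        hook = PostgresHook(postgres_conn_id=self.postgres_conn_id)
        conn = hook.get_conn()
        cursor = conn.cursor()
        cursor.execute(self.sql)
        rows = cursor.fetchall()
        cursor.close()
        conn.close()
        # The returned value is pushed to XCom automatically under the
        # "return_value" key.
        return json.dumps(rows, default=str)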
There are a few things to note about your code:
- You call xcom_push(), but returning a value also automatically pushes to XCom, so you'll have your output stored in XCom twice.
- To fetch the value in another task, use xcom_pull(), more details here: https://airflow.apache.org/docs/apache-airflow/stable/concepts/xcoms.html
- Instead of iterating over the cursor yourself, you could use cursor.fetchall(), I have a similar example here (it writes output to local disk): https://github.com/godatadriven/airflow-testing-examples/blob/master/src/testing_examples/operators/postgres_to_local_operator.py#L35-L41
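To make the xcom_pull() side concrete, here is a hedged sketch of a downstream BashOperator that reads the pushed value through Jinja templating; the task id "query_postgres" and the spark-submit command are assumptions for the example:

from airflow.operators.bash import BashOperator

spark_task = BashOperator(
    task_id="run_spark_job",
    # ti.xcom_pull() is available in the template context; the pulled JSON
    # string is handed to spark-submit as a single shell argument.
    bash_command=(
        "spark-submit my_job.py "
        "'{{ ti.xcom_pull(task_ids='query_postgres', key='my_value') }}'"
    ),
)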