I am trying to insert large binary data into PostgreSQL using psycopg2. I understand the bytea datatype is more commonly used, but I am testing large objects (BLOBs) for possible future use cases.
The versions of PostgreSQL and psycopg2 are below.
pip list | grep psycopg2
psycopg2 (2.5.1)
rpm -qa | grep postgres
postgresql-server-9.2.15-1.el7_2.x86_64
I use Python 2.7.5.
python -V
Python 2.7.5
Below is my code snippet:
import psycopg2

file = "/home/test/jefferson_love_memorial_514993.jpg"
with open(file, "rb") as fd:
    try:
        # First connect to the PostgreSQL server
        conn = psycopg2.connect("dbname='sample' user='sample' host='10.1.0.19' password='sample'")
        # Initiate a session with PostgreSQL to write a large object instance
        lobj = conn.lobject(0, 'r', 0)
        # Write the data to the database
        lobj.write(fd.read())
    except (psycopg2.Warning, psycopg2.Error) as e:
        print "Exception: {}".format(e)
However, the code runs without raising any error, yet nothing is inserted into the database:
-bash-4.2$ psql -d sample
psql (9.2.15)
Type "help" for help.
sample=# SELECT * FROM pg_largeobject_metadata;
lomowner | lomacl
----------+--------
(0 rows)
sample=# SELECT * FROM pg_largeobject;
loid | pageno | data
------+--------+------
(0 rows)
What is missing from my code?
I found the reason.
I had forgotten to call conn.commit() after lobj.write(). After adding the commit, it works perfectly.
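For reference, the full flow with the missing commit added can be sketched as a small helper. This is a minimal sketch only: the store_large_object name is mine, the connection string in the usage comment is the sample one from the question, and error handling is omitted for brevity.

```python
def store_large_object(conn, path):
    """Read a file and store it as a new PostgreSQL large object.

    `conn` is an open psycopg2 connection. Returns the OID of the new
    large object so it can be saved in a regular table column later.
    """
    with open(path, "rb") as fd:        # binary mode, since this is image data
        lobj = conn.lobject(0, "w", 0)  # oid=0 lets the server choose the OID
        lobj.write(fd.read())
        oid = lobj.oid
        lobj.close()
    conn.commit()  # without this, the large object is rolled back on close
    return oid

# Hypothetical usage:
# import psycopg2
# conn = psycopg2.connect("dbname='sample' user='sample' host='10.1.0.19' password='sample'")
# oid = store_large_object(conn, "/home/test/jefferson_love_memorial_514993.jpg")
```

After a successful commit, the new object should show up in pg_largeobject_metadata and pg_largeobject when queried from psql.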