I'm having a problem with libpq's PQexec function hanging on intermittent connections. After looking around the mailing list, the solution is to use the asynchronous functions PQsendQuery/PQgetResult and implement your own timeout.
Now the issue I'm facing is that PQgetResult needs to be called multiple times until it returns NULL, and only then do you know it's done. However, the rest of my application expects a single PQresult object per query.
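For reference, the non-blocking pattern described above looks roughly like this minimal sketch (wait_for_results is an illustrative helper name, not a libpq function, and the timeout here applies per select() call rather than as an overall deadline):

#include <libpq-fe.h>
#include <stdio.h>
#include <sys/select.h>

/* Wait until the results of a previously sent query are ready to be fetched.
   Returns 0 when PQgetResult will not block, -1 on error or timeout. */
static int wait_for_results(PGconn *conn, int timeout_secs)
{
    int sock = PQsocket(conn);
    if (sock < 0)
        return -1;

    while (PQisBusy(conn))
    {
        fd_set readable;
        FD_ZERO(&readable);
        FD_SET(sock, &readable);

        struct timeval tv = { .tv_sec = timeout_secs, .tv_usec = 0 };
        int rc = select(sock + 1, &readable, NULL, NULL, &tv);

        if (rc <= 0)                  /* 0 = timed out, < 0 = select() error */
            return -1;

        if (!PQconsumeInput(conn))    /* read whatever the server has sent */
        {
            fprintf(stderr, "PQconsumeInput failed: %s", PQerrorMessage(conn));
            return -1;
        }
    }
    return 0;                         /* PQgetResult can now be called */
}

With something like this, the flow is PQsendQuery(conn, sql), then wait_for_results(conn, timeout), then the usual PQgetResult loop.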
So my question is: can a single query actually produce multiple PQresults? And should I use PQisBusy and PQconsumeInput to wait until all the results are ready before calling PQgetResult?
Credit to Laurenz Albe, who answered this over on the postgresql mailing list.
If you have a single SQL statement, you will get only one PQresult. You get more than one if you send a query string with more than one statement, e.g.
PQsendQuery(conn, "SELECT 42; SELECT 'Hello'");
would result in two PQresults.
You can get multiple PQresults only when using asynchronous command processing; the corresponding PQexec would return only the PQresult of the last statement executed. So you can get the same behaviour as PQexec by discarding all PQresults except for the last one.
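A minimal sketch of that drain loop might look like this (get_last_result is an illustrative helper name, not part of libpq; the real PQexec also has some extra behaviour around errors and COPY that this sketch ignores):

#include <libpq-fe.h>
#include <stddef.h>

/* Drain all pending results and keep only the last one, mirroring what
   PQexec would have returned for the same query string. */
static PGresult *get_last_result(PGconn *conn)
{
    PGresult *last = NULL;
    PGresult *res;

    /* PQgetResult returns NULL once the current query has no more results;
       every non-NULL result we drop must be freed with PQclear. */
    while ((res = PQgetResult(conn)) != NULL)
    {
        if (last != NULL)
            PQclear(last);
        last = res;
    }
    return last;    /* caller owns this and must PQclear() it */
}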