We're developing a website which requires database access. Accessing such a page works fine, and so does accessing several pages in a row. However, if you wait a long time (15 minutes seems to be enough), accessing another page will hang for a long time (10-20 minutes has been observed). Afterwards, this error is printed.
Here's the relevant code:
import sys

import tornado.ioloop
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker

# load_conf and make_app are our own helpers, defined elsewhere in the project
if __name__ == "__main__":
    conf = load_conf(sys.argv[1])
    engine = create_engine('postgresql://%s:%s@%s:%s/%s' %
                           (conf['db']['user'], conf['db']['pw'], conf['db']['address'],
                            conf['db']['port'], conf['db']['database']), echo=False)
    Session = sessionmaker(bind=engine)
    session = Session()
    app = make_app(session, conf)
    app.listen(8888)
    tornado.ioloop.IOLoop.current().start()
The database is on a different server. My personal machine is in the Netherlands, while the database is in a server in Germany. My partner's personal machine is in Italy.
Most notably, this issue is only present on my machine, which runs Arch Linux. We've tested this on two other machines, running Windows and some other Linux distribution (I assume Ubuntu; I can check if needed). At this point, we have no clue how to continue debugging.
Of course, I will provide any extra needed information on request.
It's unclear where this 15-minute timeout is coming from, although as other commenters have indicated it's likely coming from something in the network between your computer and the server. However, wherever it is coming from, there are a couple of options to work around it in SQLAlchemy.
The pool_pre_ping=True option will issue a simple test query before any attempt to reuse a connection, allowing it to detect this problem and reconnect transparently (at a small performance cost).

The pool_recycle=600 option tells SQLAlchemy not to reuse a connection that has been idle for longer than 10 minutes. This is a more efficient solution to the problem, since it doesn't add any new queries, but it requires you to work out the best recycle timeout to use.
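As a rough sketch (with a placeholder connection URL standing in for your conf-built one), either option is passed directly to create_engine:

from sqlalchemy import create_engine

# Option 1: test each pooled connection before reusing it; a dead
# connection is detected and replaced transparently.
engine = create_engine(
    'postgresql://user:password@db.example.com:5432/mydb',  # placeholder URL
    pool_pre_ping=True,
)

# Option 2: discard any connection that has been idle for more than
# 600 seconds, so it is never reused after the network has dropped it.
engine = create_engine(
    'postgresql://user:password@db.example.com:5432/mydb',  # placeholder URL
    pool_recycle=600,
)

In your code this would just mean adding one of these keyword arguments to your existing create_engine(...) call; nothing else needs to change.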