 

Python MySQL queries time out where MySQL workbench works fine

I recently had my website moved to a new server. I have some basic Python scripts that access data in a MySQL database. On the old server we had no problems. On the new server:

  • MySQLWorkbench can connect without trouble and perform all queries
  • The same (SELECT) queries in Python work about 5% of the time; the other 95% of the time they time out, or the connection is lost during the query
  • For example, the table has 100,000 rows; selecting the entire thing in MySQLWorkbench works fine and returns in 3 seconds
  • In Python the same query never works; with LIMIT 2999 the query works, but LIMIT 3010 causes it to time out
  • The same effect is observed whether the script is run locally or remotely

I've been digging around for a few days now trying to figure out whether there are settings in the database, the database server, or the server itself that prevent Python (but not MySQLWorkbench) from doing its job correctly.

The query and code, in case they are of interest:

from mysql.connector import MySQLConnection

query = "SELECT * FROM wordpress.table;"

conn = MySQLConnection(**mysqlconfig)
cursor = conn.cursor()
cursor.execute(query)
rows = cursor.fetchall()

I don't have the details on the server, but it has enough power for MySQLWorkbench to work fine; I just can't seem to get Python to work.
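One way to test whether buffering the full result set with `fetchall()` is the problem is to stream rows in fixed-size batches with `cursor.fetchmany()`. A minimal sketch of the pattern, using the stdlib `sqlite3` module as a stand-in so it runs anywhere (with mysql-connector-python the cursor calls are the same; the table and connection here are made up for illustration):

```python
import sqlite3

# Stand-in in-memory database; with MySQL this would be
# MySQLConnection(**mysqlconfig) and the real wordpress table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, val TEXT)")
conn.executemany("INSERT INTO t (val) VALUES (?)", [("row",)] * 100000)

cursor = conn.cursor()
cursor.execute("SELECT * FROM t")

rows = []
while True:
    batch = cursor.fetchmany(1000)  # pull 1000 rows per round trip
    if not batch:                   # empty list means the result set is done
        break
    rows.extend(batch)

print(len(rows))  # 100000
```

Pulling batches this way keeps memory bounded and, with a server-side cursor, avoids materializing the whole result at once.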

**** EDIT ****

To see whether this problem was caused by queries returning too much data for Python to handle, I tried using OFFSET and LIMIT to loop through the bigger query in pieces, with say 10 rows per query.

total_rows = 100000
interval = 10
data = []

for n in range(0, total_rows // interval):
    q = "SELECT * FROM wordpress.table LIMIT %s OFFSET %s" % (interval, n * interval)
    cursor.execute(q)
    returned = cursor.fetchall()
    rows = [[i for i in j] for j in returned]

    for row in rows:
        data.append(row)

    print(n, len(data))

Expected: this would quickly work through the bigger query in smaller pieces.

Happens: it gets further than the 3,000 rows it got stuck on before, but hits a wall after some iterations, and not consistently: running the script 10 times results in n reaching a different point each time.
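A side note on the chunking approach itself: OFFSET pagination makes the server re-scan and discard all skipped rows on every query, and without an ORDER BY the pages are not even guaranteed to be stable between queries. Keyset (seek) pagination, where each query resumes after the last key seen, avoids both issues. A sketch of the idea, again using stdlib `sqlite3` as a stand-in (the table, column names, and batch size are illustrative assumptions, not the asker's schema):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, val TEXT)")
conn.executemany("INSERT INTO t (val) VALUES (?)", [("x",)] * 10000)

cursor = conn.cursor()
data = []
last_id = 0
while True:
    # Seek past the last row seen instead of using OFFSET, so each
    # query is a range scan on the primary key and pages never shift.
    cursor.execute(
        "SELECT id, val FROM t WHERE id > ? ORDER BY id LIMIT 500",
        (last_id,),
    )
    batch = cursor.fetchall()
    if not batch:
        break
    last_id = batch[-1][0]  # resume after the highest id fetched
    data.extend(batch)

print(len(data))  # 10000
```

Each page costs the same regardless of how deep into the table the loop is, whereas OFFSET pages get progressively more expensive.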

asked Oct 30 '22 22:10 by CharlieSmith

1 Answer

You might get better performance using a server-side cursor:

import MySQLdb.cursors

con = MySQLdb.connect(host=host,
                      user=user,
                      passwd=pwd,
                      charset=charset,
                      port=port,
                      cursorclass=MySQLdb.cursors.SSDictCursor)
cur = con.cursor()
cur.execute("SELECT * FROM wordpress.table")
for row in cur:
    print(row)

You might also want to check the replies to these questions:

How to get a row-by-row MySQL ResultSet in python

How to efficiently use MySQLDB SScursor?

answered Nov 15 '22 06:11 by Thorsten