I have a table that looks like this:
part  min  max  unitPrice
A     1    9    10
A     10   99   5
B     1    9    11
B     10   99   6
...
I also have a production table into which I need to insert this data. When I do a select statement on one table and fetch the records, I have a hard time inserting them into the other table.
Say
cursor_table1.execute('select part, min, max, unitPrice, now() from table1')
for row in cursor_table1.fetchall():
    part, min, max, unitPrice, now = row
    print part, min, max, unitPrice, now
The result turns out to be
'416570S39677N1043', 1L, 24L, 48.5, datetime.datetime(2018, 10, 8, 16, 33, 42)
I know Python smartly figured out the type of every column, but I actually just want the raw content, so that I can do something like this:
cursor_table1.execute('select part, min, max, unitPrice, now() from table1')
for row in cursor_table1.fetchall():
    cursor_table2.execute('insert into table2 values ' + str(tuple(row)))
The question is: how can I simply do a select statement from one table and insert the results into another?
Let me know if I did not describe my question clearly; I can add extra info if you want.
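A note on the snippet above: building the insert with str(tuple(row)) relies on Python's repr happening to match SQL literal syntax, which breaks on strings containing quotes and on datetime values. If both tables live in the same database, the whole copy can also be pushed into a single server-side statement. This is only a sketch; table2's column list (including created_at) is an assumption, not something given in the question:

# Server-side copy; no rows travel through Python at all.
# table2's columns (part, min, max, unitPrice, created_at) are assumed here.
cursor_table1.execute(
    'insert into table2 (part, min, max, unitPrice, created_at) '
    'select part, min, max, unitPrice, now() from table1'
)
connection.commit()  # 'connection' stands for the underlying MySQLdb/pymysql connection object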
It might be a bit late to answer this question, but I also had the same problem and landed on this page. I happened to find a different answer and figured it might be helpful to share it with others who have the same problem.
I have two MySQL servers, one on a Raspberry Pi and another on a VPS, and I had to sync data between the two by reading data on the RPi and inserting it into the VPS. I did it the usual way, writing a loop, fetching the records one by one and inserting them, and it was really slow: it took about 2 minutes for 2000 records.
I solved this problem by using the executemany function. For the data, I passed all the tuples returned by the select, obtained with the fetchall function.
# x is a cursor on the source connection, y is a cursor on the destination connection
rows = x.fetchall()
y.executemany("insert into table2 (f1, f2, f3) values (%s,%s,%s);", rows)
And it was super fast 😀; it took about 2 seconds for 5000 records.
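For anyone piecing the whole thing together, here is a minimal end-to-end sketch of the same approach; the driver (pymysql), host names and credentials below are placeholders, not part of the original setup:

import pymysql

# Source (Raspberry Pi) and destination (VPS) connections; the values here are made up.
src = pymysql.connect(host='raspberrypi.local', user='user', password='secret', database='mydb')
dst = pymysql.connect(host='vps.example.com', user='user', password='secret', database='mydb')

x = src.cursor()
y = dst.cursor()

# Read everything from the source in one go.
x.execute('select f1, f2, f3 from table1')
rows = x.fetchall()

# Send all rows to the destination as one batched, parameterized insert.
y.executemany('insert into table2 (f1, f2, f3) values (%s, %s, %s)', rows)
dst.commit()

src.close()
dst.close()

The speed-up comes from batching: executemany lets the driver send the rows together instead of doing one round trip per row, and the %s placeholders let the driver handle quoting instead of building SQL strings by hand.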