Copy tables from one database to another in SQL Server, using Python

Does anybody know of a good Python approach for copying a large number of tables (around 100) from one database to another in SQL Server?

I ask if there is a way to do it in Python, because due to restrictions at my place of employment, I cannot copy tables across databases inside SQL Server alone.

Here is a simple Python code that copies one table from one database to another. I am wondering if there is a better way to write it if I want to copy 100 tables.

print('Initializing...')

import pandas as pd
import sqlalchemy
import pyodbc

db1 = sqlalchemy.create_engine("mssql+pyodbc://user:password@db_one")
db2 = sqlalchemy.create_engine("mssql+pyodbc://user:password@db_two")

print('Writing...')

query = '''SELECT * FROM [dbo].[test_table]'''
df = pd.read_sql(query, db1)
df.to_sql('test_table', db2, schema='dbo', index=False, if_exists='replace')

print('(1) [test_table] copied.')
asked Oct 25 '17 by pynewbee

People also ask

How do I copy a table from one database to another in python?

Steps using Python: establish a connection with the database server and create a cursor object. Use the cursor object to execute CREATE-SELECT or CREATE-LIKE-INSERT statements to copy a table. Check that the table definition has been copied.
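Those steps can be sketched as follows. This is a minimal example using sqlite3 so it runs without a SQL Server instance; on SQL Server the same cursor pattern would use pyodbc, with SELECT ... INTO playing the role of CREATE TABLE ... AS SELECT.

```python
import sqlite3

# establish a connection and create a cursor object
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# toy source table
cur.execute("CREATE TABLE src (id INTEGER, val TEXT)")
cur.execute("INSERT INTO src VALUES (1, 'a'), (2, 'b')")

# CREATE-SELECT: define and fill the copy in a single statement
cur.execute("CREATE TABLE dst AS SELECT * FROM src")
conn.commit()

# check that the table (and its rows) were copied
cur.execute("SELECT COUNT(*) FROM dst")
n_copied = cur.fetchone()[0]
print(n_copied)  # 2
```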

How do I copy a table from one SQL Server database to another?

Right-click on the database name, then select "Tasks" > "Export data..." from the object explorer. The SQL Server Import/Export wizard opens; click on "Next". Provide authentication and select the source from which you want to copy the data; click "Next". Specify where to copy the data to; click on "Next".


2 Answers

SQLAlchemy is actually a good tool to use to create identical tables in the second db:

from sqlalchemy import MetaData, Table

metadata = MetaData()
table = Table('test_table', metadata, autoload_with=db1)  # reflect from db1
table.create(bind=db2)                                    # recreate in db2

This method will also reproduce keys, indexes, and foreign keys correctly. Once the needed tables are created, you can move the data either with select/insert, if the tables are relatively small, or with the bcp utility to dump each table to disk and then load it into the second database (much faster, but more work to get working correctly).

If using select/insert, it is better to insert in batches of around 500 records.
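The batched select/insert can be sketched with pandas' chunksize parameter. This is a hedged, runnable sketch: it uses two throwaway SQLite engines so it executes anywhere; in practice you would substitute the mssql+pyodbc engine URLs from the question.

```python
import pandas as pd
import sqlalchemy

# placeholder engines; replace with the mssql+pyodbc URLs from the question
src = sqlalchemy.create_engine("sqlite://")
dst = sqlalchemy.create_engine("sqlite://")

# toy source table with 1000 rows
pd.DataFrame({"id": range(1000), "val": range(1000)}).to_sql(
    "test_table", src, index=False)

# stream the source table in batches of 500 rows, appending each batch
for chunk in pd.read_sql("SELECT * FROM test_table", src, chunksize=500):
    chunk.to_sql("test_table", dst, index=False, if_exists="append")

n_copied = int(pd.read_sql("SELECT COUNT(*) AS n FROM test_table", dst)["n"][0])
print(n_copied)  # 1000
```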

answered Oct 20 '22 by Muposat


You can do something like this:

tabs = pd.read_sql(
    "SELECT TABLE_NAME FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_TYPE = 'BASE TABLE'",
    db1)

for tab in tabs['TABLE_NAME']:
    pd.read_sql("SELECT * FROM [{}]".format(tab), db1).to_sql(
        tab, db2, index=False, if_exists='replace')

But it might be awfully slow. Prefer SQL Server tools for this job.

Consider using the sp_addlinkedserver procedure to link one SQL Server to the other. After that, connected to the destination server, you can execute:

SELECT * INTO table_name FROM linked_server_name.db_one.dbo.table_name

for each table in the db1 database. (Note that SELECT ... INTO cannot create a table on a remote server, so run it on the destination side and pull the data through the four-part name.)

PS this might be done in Python + SQLAlchemy as well...
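As a sketch of that PS, SQLAlchemy's MetaData.reflect can load every table definition from the source database and recreate it in the target before the rows are moved. The engine URLs below are SQLite placeholders so the sketch runs anywhere; swap in the SQL Server connection strings from the question.

```python
import pandas as pd
import sqlalchemy
from sqlalchemy import MetaData

# placeholder engines; replace with the mssql+pyodbc URLs from the question
src = sqlalchemy.create_engine("sqlite://")
dst = sqlalchemy.create_engine("sqlite://")

# toy source tables
pd.DataFrame({"a": [1, 2, 3]}).to_sql("t1", src, index=False)
pd.DataFrame({"b": ["x", "y"]}).to_sql("t2", src, index=False)

meta = MetaData()
meta.reflect(bind=src)     # load every table definition from db1
meta.create_all(bind=dst)  # recreate the tables in db2

# then stream the rows across, table by table
for name in meta.tables:
    df = pd.read_sql("SELECT * FROM {}".format(name), src)
    df.to_sql(name, dst, index=False, if_exists="append")

copied_tables = sorted(meta.tables)
print(copied_tables)  # ['t1', 't2']
```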

answered Oct 20 '22 by MaxU - stop WAR against UA