I'm running MS SQL Server and am trying to perform a JOIN
between two tables located in different databases (on the same server). If I connect to the server using pyodbc (without specifying a database), then the following raw SQL works fine.
SELECT * FROM DatabaseA.dbo.tableA tblA
INNER JOIN DatabaseB.dbo.tableB tblB
ON tblA.id = tblB.id
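For reference, a minimal pyodbc sketch of that working baseline might look like the following (driver name, server, and credentials are placeholders, not my actual connection details):

import pyodbc

# Connect to the server without selecting a database (placeholder connection string)
conn = pyodbc.connect(
    'DRIVER={ODBC Driver 17 for SQL Server};SERVER=myserver;UID=user;PWD=password'
)
cursor = conn.cursor()
# Fully-qualified three-part names work fine in raw SQL
cursor.execute("""
    SELECT * FROM DatabaseA.dbo.tableA tblA
    INNER JOIN DatabaseB.dbo.tableB tblB
    ON tblA.id = tblB.id
""")
rows = cursor.fetchall()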
Unfortunately, I just can't seem to get the analogous query to work using SQLAlchemy. I've seen this topic touched on in a few places. Most recommend using different engines/sessions, but I crucially need to perform joins between the databases, so I don't think that approach will help. Another typical suggestion is to use the schema parameter, but that does not work for me either. For example, the following does not work.
engine = create_engine('mssql+pyodbc://...') #Does not specify database
metadataA = MetaData(bind=engine, schema='DatabaseA.dbo', reflect=True)
tableA = Table('tableA', metadataA, autoload=True)
metadataB = MetaData(bind=engine, schema='DatabaseB.dbo', reflect=True)
tableB = Table('tableB', metadataB, autoload=True)
I've also tried variants where schema='DatabaseA' and schema='dbo'. In all cases SQLAlchemy throws a NoSuchTableError for both tables A and B. Any ideas?
If you can create a synonym in one of the databases, you can keep your query local to that single database.
USE DatabaseB;
GO
CREATE SYNONYM dbo.DbA_TblA FOR DatabaseA.dbo.tableA;
GO
Your query then becomes:
SELECT * FROM dbo.DbA_TblA tblA
INNER JOIN dbo.tableB tblB
ON tblA.id = tblB.id
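Once the synonym exists, the SQLAlchemy side only needs a single engine bound to DatabaseB. A minimal sketch (with placeholder connection details, and assuming both tables expose an id column) could look like this:

from sqlalchemy import create_engine, text

# Engine bound to DatabaseB only; the synonym hides the cross-database reference
engine = create_engine(
    'mssql+pyodbc://user:password@myserver/DatabaseB'
    '?driver=ODBC+Driver+17+for+SQL+Server'
)

with engine.connect() as conn:
    # The join is now entirely local to DatabaseB from SQLAlchemy's point of view
    result = conn.execute(text(
        "SELECT * FROM dbo.DbA_TblA tblA "
        "INNER JOIN dbo.tableB tblB ON tblA.id = tblB.id"
    ))
    rows = result.fetchall()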