I am connecting to a SQL Server database using SQLAlchemy (with the pymssql
driver).
import sqlalchemy
conn_string = f'mssql+pymssql://{uid}:{pwd}@{instance}/?database={db};charset=utf8'
sql = 'SELECT * FROM FAKETABLE;'
engine = sqlalchemy.create_engine(conn_string)
connection = engine.connect()
result = connection.execute(sql)
result.cursor.description
which results in:
(('col_1', 1, None, None, None, None, None),
('col_2', 1, None, None, None, None, None),
('col_3', 4, None, None, None, None, None),
('col_4', 3, None, None, None, None, None),
('col_5', 3, None, None, None, None, None))
As per PEP 249 (cursor's .description
attribute):
The first two items (name and type_code) are mandatory, the other five are optional and are set to None if no meaningful values can be provided.
I am assuming the integers (1, 1, 4, 3, 3)
are column types.
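For reference, each entry in .description is a PEP 249 7-tuple of (name, type_code, display_size, internal_size, precision, scale, null_ok), so the names and type codes can be pulled apart like this (using the tuples shown above as a stand-in for a live cursor):

```python
# Each .description entry is a PEP 249 7-tuple:
# (name, type_code, display_size, internal_size, precision, scale, null_ok)
description = (
    ('col_1', 1, None, None, None, None, None),
    ('col_2', 1, None, None, None, None, None),
    ('col_3', 4, None, None, None, None, None),
    ('col_4', 3, None, None, None, None, None),
    ('col_5', 3, None, None, None, None, None),
)

# Pull out just the column names and their type codes
names = [col[0] for col in description]
type_codes = [col[1] for col in description]

print(names)       # ['col_1', 'col_2', 'col_3', 'col_4', 'col_5']
print(type_codes)  # [1, 1, 4, 3, 3]
```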
My two questions:

1. How to map these integers to data types (like char, integer, etc.)?
2. Are these SQL data types? If no, is it possible to get the SQL data types?

FWIW, I get the same result when using raw_connection() instead of connect(). I came across three questions along similar lines, but none of them answers this specific question, and I need to use the connect() + execute() approach.
The SQL Server function sys.dm_exec_describe_first_result_set can be used to get each column's SQL data type directly for a given query:
SELECT column_ordinal, name, system_type_name, *
FROM sys.dm_exec_describe_first_result_set('here goes query', NULL, 0) ;
In your example:
sql = """SELECT column_ordinal, name, system_type_name
FROM sys.dm_exec_describe_first_result_set('SELECT * FROM FAKETABLE', NULL, 0) ;"""
For:
CREATE TABLE FAKETABLE(id INT, d DATE, country NVARCHAR(10));
SELECT column_ordinal, name, system_type_name
FROM sys.dm_exec_describe_first_result_set('SELECT * FROM FAKETABLE', NULL, 0) ;
+-----------------+----------+------------------+
| column_ordinal | name | system_type_name |
+-----------------+----------+------------------+
| 1 | id | int |
| 2 | d | date |
| 3 | country | nvarchar(10) |
+-----------------+----------+------------------+
db<>fiddle demo
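Tying this back to the connect() + execute() approach from the question, a small helper can wrap an arbitrary query in the dm_exec_describe_first_result_set call. This is only a sketch: describe_query_sql is a hypothetical helper name, the only extra work is doubling embedded single quotes (the query is embedded as a T-SQL string literal), and actually executing the result still requires a live SQL Server connection.

```python
def describe_query_sql(query: str) -> str:
    """Build the metadata query for sys.dm_exec_describe_first_result_set.

    The query text is passed as a T-SQL string literal, so any embedded
    single quotes must be doubled.
    """
    escaped = query.replace("'", "''")
    return (
        "SELECT column_ordinal, name, system_type_name "
        f"FROM sys.dm_exec_describe_first_result_set('{escaped}', NULL, 0);"
    )

sql = describe_query_sql("SELECT * FROM FAKETABLE")
print(sql)
# result = connection.execute(sqlalchemy.text(sql))  # needs a live connection
```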
Looking at PEP 249: type_code is not guaranteed to be the same across different database drivers, so this answer focuses on MS SQL Server (pymssql).
- How to map these integers to data types (like char, integer, etc.)?
You can create a dict mapping type_code to type object using the following code:
import inspect
import pymssql
code_map = {
    type_obj.value: (type_name, type_obj)
    for type_name, type_obj in inspect.getmembers(
        pymssql,
        predicate=lambda x: isinstance(x, pymssql.DBAPIType),
    )
}
This will produce the following dict:
{2: ('BINARY', <DBAPIType 2>),
4: ('DATETIME', <DBAPIType 4>),
5: ('DECIMAL', <DBAPIType 5>),
3: ('NUMBER', <DBAPIType 3>),
1: ('STRING', <DBAPIType 1>)}
Unfortunately, I do not have access to a running MS SQL Server instance, so I cannot check whether these type codes match your example.
- Are these SQL data types? If no, is it possible to get the SQL data types?
Looking at the PEP and this result: these values are not SQL data types; they are DB-API "type objects".
The DB API does not provide a way to inspect the SQL-level metadata of query results. It only provides type objects for mapping SQL data types to Python types.
If you need the exact SQL data type, you must run a server-specific SQL query.
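As a sketch of how the mapping could be used, the snippet below pairs each column in a .description tuple with the name of its DB-API type object. The TYPE_NAMES dict and label_columns helper are illustrative names, with the codes hard-coded from the dict printed above rather than imported from pymssql:

```python
# type_code -> type-object name, taken from the code_map output shown above
TYPE_NAMES = {1: 'STRING', 2: 'BINARY', 3: 'NUMBER', 4: 'DATETIME', 5: 'DECIMAL'}

def label_columns(description):
    """Pair each column name with the DB-API type name for its type_code."""
    return [(col[0], TYPE_NAMES.get(col[1], 'UNKNOWN')) for col in description]

description = (
    ('col_1', 1, None, None, None, None, None),
    ('col_3', 4, None, None, None, None, None),
    ('col_4', 3, None, None, None, None, None),
)
print(label_columns(description))
# [('col_1', 'STRING'), ('col_3', 'DATETIME'), ('col_4', 'NUMBER')]
```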