from pandas import DataFrame
import pyodbc
cnxn = pyodbc.connect(databasez)
cursor = cnxn.cursor()
cursor.execute("""SELECT ID, NAME AS Nickname, ADDRESS AS Residence FROM tablez""")
DF = DataFrame(cursor.fetchall())
This works fine for populating my pandas DataFrame. But how do I get
DF.columns = ['ID', 'Nickname', 'Residence']
straight from cursor? Is that information stored in cursor at all?
Approach: Connect to the database using the connect() method. Create a cursor object and use that cursor to execute the query. Then read the description attribute of the cursor object to get the column names.
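A quick sketch of that approach (assuming an open pyodbc connection cnxn and the tablez table from the question):

cursor = cnxn.cursor()
cursor.execute("SELECT ID, NAME AS Nickname FROM tablez")
# Each entry of cursor.description is a 7-tuple; the column name is element 0.
for column in cursor.description:
    print(column[0], column[1])  # name and type code, e.g. "Nickname" <class 'str'>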
From the pyodbc wiki: cursors represent a database cursor (and map to ODBC HSTMTs), which is used to manage the context of a fetch operation. Cursors created from the same connection are not isolated, i.e., any changes done to the database by one cursor are immediately visible to the other cursors.
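A minimal sketch of that shared visibility, assuming a table my_table exists on the connection cnxn:

writer = cnxn.cursor()
reader = cnxn.cursor()
writer.execute("INSERT INTO my_table (ID, NAME) VALUES (?, ?)", 1, "Alice")
# The second cursor sees the uncommitted insert, because both cursors
# share the same connection and therefore the same transaction.
reader.execute("SELECT COUNT(*) FROM my_table")
print(reader.fetchone()[0])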
Pyodbc is an open source Python module that makes accessing ODBC databases simple. It implements the DB API 2.0 specification. Using pyodbc, you can easily connect Python applications to data sources with an ODBC driver.
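For example, a connection can be opened with a DSN-less connection string (the driver, server, and database names below are placeholders, not from the question):

import pyodbc

cnxn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver;DATABASE=mydb;"
    "Trusted_Connection=yes;"
)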
You can get the columns from the cursor description:
columns = [column[0] for column in cursor.description]
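Putting that together with the question's code, a minimal sketch (assuming the cnxn connection and tablez table from the question):

cursor = cnxn.cursor()
cursor.execute("SELECT ID, NAME AS Nickname, ADDRESS AS Residence FROM tablez")
columns = [column[0] for column in cursor.description]
# pyodbc returns Row objects; converting them to plain tuples keeps DataFrame happy.
DF = DataFrame([tuple(row) for row in cursor.fetchall()], columns=columns)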
Recent versions of pandas have higher-level read_sql functions that can do this for you:
import pyodbc
import pandas as pd
cnxn = pyodbc.connect(databasez)
DF = pd.read_sql_query("SELECT ID, NAME AS Nickname, ADDRESS AS Residence FROM tablez", cnxn)
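read_sql_query also accepts query parameters; here is a sketch using pyodbc-style ? placeholders (the WHERE clause and value are illustrative, not from the question):

DF = pd.read_sql_query(
    "SELECT ID, NAME AS Nickname FROM tablez WHERE ID = ?",
    cnxn,
    params=[42],
)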
In case you are experiencing a NoneType error with the code provided by Matti John, make sure you access cursor.description only after a query has executed against the database. An example:
cursor = cnxn.cursor()
cursor.execute("SELECT * FROM my_table")
# description is only populated once a query has executed
columns = [column[0] for column in cursor.description]
This fixed it for me.
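For context, cursor.description is None until a query has executed on that cursor, which is where the NoneType error comes from. A minimal sketch on the same connection:

cursor = cnxn.cursor()
print(cursor.description)  # None -- iterating over it raises the TypeError

cursor.execute("SELECT * FROM my_table")
print(cursor.description)  # now a sequence of 7-tuples, one per column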
Improving on the previous answer, in the context of pandas, I found this does exactly what I expect (note that it also needs numpy):

import numpy as np

DF.columns = DataFrame(np.matrix(cursor.description))[0]