
SQLAlchemy: inspection of column type size gives odd result

Is it possible to query a column for the maximum amount of data (in bytes) that can be stored in it? For example, say I declare a column using

content = Column(LargeBinary)

then how can I query information about content? Following the inspection approach suggested in this question:

table = File.__table__
field = table.c["content"]
print("max=" + field.type.length)

I get max=None, whereas I would have expected max=65535, considering that field.type is BLOB. What am I doing wrong?
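A minimal sketch of the setup (the table name and declarative boilerplate are placeholders, assuming SQLAlchemy 1.4+):

from sqlalchemy import Column, Integer, LargeBinary
from sqlalchemy.orm import declarative_base

Base = declarative_base()

class File(Base):
    __tablename__ = "files"  # placeholder name
    id = Column(Integer, primary_key=True)
    content = Column(LargeBinary)

field = File.__table__.c["content"]
print(f"max={field.type.length}")  # prints max=None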


asked Sep 29 '22 by Jens

1 Answer

As far as I understand, the maximum length of a column is dialect-specific, and this information is not hard-coded in the SQLAlchemy sources. However, depending on the backend, you can retrieve it dynamically. For example, for MySQL you can read it from the INFORMATION_SCHEMA.COLUMNS table:

from sqlalchemy import and_, select, text

q = select(text("CHARACTER_MAXIMUM_LENGTH")).select_from(text("INFORMATION_SCHEMA.COLUMNS")).where(
    and_(text("table_name = :table_name"),
         text("table_schema = :table_schema"),
         text("column_name = :column_name")))
result = session.execute(q, {"table_name": File.__tablename__,
                             "table_schema": schema,
                             "column_name": column_name})
print(result.first()[0])  # prints 65535 for a BLOB column

There is probably a more elegant way to write the query.

See also:

  • Determine max length allowed in a column in mysql
  • How to check if PostgreSQL schema exists using SQLAlchemy? (information_schema query example)

field.type.length refers to the user-defined maximum length, which is initialized to None if it is not provided explicitly:

def __init__(self, length=None):
    self.length = length

And since you are not providing the length argument, you are getting None.
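If you want field.type.length to report an actual limit, one option is to pass the length explicitly when declaring the column type; the inspection then returns it. A minimal sketch (table name and declarative setup are placeholders, assuming SQLAlchemy 1.4+):

from sqlalchemy import Column, Integer, LargeBinary
from sqlalchemy.orm import declarative_base

Base = declarative_base()

class File(Base):
    __tablename__ = "files"  # placeholder name
    id = Column(Integer, primary_key=True)
    # declare the maximum length explicitly so it is available via type.length
    content = Column(LargeBinary(65535))

print(File.__table__.c["content"].type.length)  # 65535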


answered Oct 03 '22 by alecxe