I am missing the table name in Django's IntegrityError:
Traceback (most recent call last):
...
return self.cursor.execute(sql, params)
File ".../django/db/utils.py", line 94, in __exit__
six.reraise(dj_exc_type, dj_exc_value, traceback)
File ".../django/db/backends/utils.py", line 64, in execute
return self.cursor.execute(sql, params)
IntegrityError: null value in column "manager_slug" violates not-null constraint
DETAIL: Failing row contains (17485, null, 2017-10-10 09:32:19, , 306).
Is there a way to see which table the INSERT/UPDATE is accessing?
We use PostgreSQL 9.6.
This is a generic question: How to get a better error message?
This is not a question about this particular column; I found the relevant table and column quickly. I want to improve the error message, which comes from our CI system, so that next time I can see the table name immediately.
I know that I could easily reveal the missing information with a debugger if I saw this error during development. But in my case it happens in production, and I only have a stack trace like the one above.
The exception message in this traceback is the original message from the database driver. It is useful to know this message and the traceback when the error is searched for or reported somewhere.
The exception class is the same django.db.utils.IntegrityError for all backends, but the message, or rather the exception arguments, depend on the backend:
PostgreSQL:
    null value in column "manager_slug" violates not-null constraint\n
    DETAILS...\n
MySQL:
    (1048, "Column 'manager_slug' cannot be null")
sqlite3:
    NOT NULL constraint failed: appname_modelname.manager_slug
The table name is visible only with the sqlite3 backend. Some backends pass only a string argument to the exception, but MySQL passes two arguments: a numeric error code and a message. (I gladly accept that this is a general question, not only about PostgreSQL.) The authors of some backends expect that the application author will know the table name directly or from the SQL, but that is not true with a general ORM package. There is no preferable, generally accepted way to extend the message, even where it could be done technically perfectly.
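Because the arguments differ per backend, any generic handler has to probe them. Here is a hedged, pure-Python sketch of extracting whatever table information is available; the helper name describe_integrity_error is my own invention, and the fake exception classes only stand in for the real driver exceptions (psycopg2 exposes a .diag object with table_name, MySQLdb passes an (errno, message) pair, sqlite3 embeds the table in the message itself):

```python
def describe_integrity_error(exc):
    """Best-effort, backend-agnostic description of an IntegrityError."""
    diag = getattr(exc, 'diag', None)               # psycopg2 (PostgreSQL)
    if diag is not None and getattr(diag, 'table_name', None):
        return 'table {}'.format(diag.table_name)
    args = getattr(exc, 'args', ())
    if len(args) == 2 and isinstance(args[0], int):  # MySQLdb-style (errno, msg)
        return 'mysql error {}: {}'.format(*args)
    if args:                                         # sqlite3 style, or anything else
        return str(args[0])
    return 'unknown integrity error'

# Demonstration with stand-in exceptions (no database required):
class FakeDiag:
    table_name = 'appname_modelname'

class FakeError(Exception):
    diag = FakeDiag()

print(describe_integrity_error(FakeError()))  # -> table appname_modelname
print(describe_integrity_error(Exception(1048, "Column 'manager_slug' cannot be null")))
```

With the real driver exceptions the same probes apply, but note that Django re-wraps them, so you may need to look at the wrapped exception's __cause__.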
Development and debugging are easy:
python manage.py test --debug-sql
(it "prints logged SQL queries on failure"), but you are probably asking about a run-time error in production.
Since the question is so general, I can only guess at your intention and which directions could be interesting for you.
A) The most important information in the traceback is usually in the few lines just above the many lines containing ".../django/db/...". Reading it is easy for an expert, and it works as long as the code is not as dynamic and generic as, say, the Django admin site, where no code near the my_obj.save() call (nor in the parent frames) contains an explicit model name. Example:
# skip some initial universal code in ".../django/..."
...
# our apps start to be interesting... (maybe other installed app)
...
# START HERE: Open this line in the editor. If the function is universal, jump to the previous.
File ".../me/app/...py", line 47, in my...
my_obj.save()
# skip many stack frames .../django/db/... below
File ".../django/db/models/base.py", line 734, in save
# self.save_base(... # this line 733 is not visible
force_update=force_update, update_fields=update_fields)
...
# only sql and params are interesting here, but they are not visible in production
File ".../django/db/backends/utils.py", line 64, in execute
return self.cursor.execute(sql, params)
IntegrityError (or DataError similarly)...
B) Catch the information in a common ancestor of your models:

import django.db.utils
from django.db import models

class CommonModel(models.Model):  # any name; make it the base of your models
    class Meta:
        abstract = True

    def save(self, *args, **kwargs):
        try:
            super(CommonModel, self).save(*args, **kwargs)
        except django.db.utils.IntegrityError as exc:
            new_message = 'table {}'.format(self._meta.db_table)
            exc.extra_info = new_message
            # less compatible, but requires no additional reading support:
            # exc.args = exc.args + (new_message,)
            raise

This could complicate debugging in the presence of multiple inheritance.
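For readers who want to see the annotate-and-reraise mechanics without a database, here is a plain-Python sketch of the same pattern; the class names and the db_table stand-in are invented for the example:

```python
class IntegrityError(Exception):
    pass

class Base:
    db_table = "appname_modelname"  # stands in for self._meta.db_table

    def save(self):
        try:
            self._do_save()
        except IntegrityError as exc:
            exc.extra_info = 'table {}'.format(self.db_table)
            raise  # re-raises the same exception object, now annotated

class Broken(Base):
    def _do_save(self):
        raise IntegrityError('null value in column "x" violates not-null constraint')

try:
    Broken().save()
except IntegrityError as exc:
    print(exc.extra_info)  # -> table appname_modelname
```

The key point is that bare `raise` preserves the original exception and traceback while the extra attribute rides along with it.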
C) An implementation in Django's db layer would be better, but I cannot imagine that it would be accepted and not reverted again after some issue.
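One route that does not require patching Django itself, available since Django 2.0, is connection.execute_wrapper(), which lets you wrap every query. A minimal sketch under that assumption; the failing_sql attribute name is my own invention, and the wrapper signature (execute, sql, params, many, context) is the one Django documents:

```python
def annotate_db_errors(execute, sql, params, many, context):
    """Wrapper for Django's connection.execute_wrapper() (Django 2.0+):
    attach the failing SQL to any exception so the table name is recoverable."""
    try:
        return execute(sql, params, many, context)
    except Exception as exc:
        exc.failing_sql = sql  # made-up attribute; read it in your error handler
        raise

# In Django you would use it like:
#   from django.db import connection
#   with connection.execute_wrapper(annotate_db_errors):
#       my_obj.save()

# The wrapper itself is plain Python, demonstrated here with a fake execute:
def fake_execute(sql, params, many, context):
    raise ValueError("not-null constraint")

try:
    annotate_db_errors(fake_execute, "INSERT INTO app_model ...", None, False, {})
except ValueError as exc:
    print(exc.failing_sql)  # -> INSERT INTO app_model ...
```

Since the SQL contains the table name, this gives you the missing context without touching Django's source tree.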
If you can create a SQL function, you can try the following. First, create a function that returns the last value of a sequence, get_sequence_last_value (original post):
CREATE FUNCTION public.get_sequence_last_value(name) RETURNS int4 AS '
DECLARE
ls_sequence ALIAS FOR $1;
lr_record RECORD;
li_return INT4;
BEGIN
FOR lr_record IN EXECUTE ''SELECT last_value FROM '' || ls_sequence LOOP
li_return := lr_record.last_value;
END LOOP;
RETURN li_return;
END;' LANGUAGE 'plpgsql' VOLATILE;
After that, find the tables whose sequence has advanced at least as far as the id in the error (17485 in the failing row above) and which have a manager_slug column:
SELECT table_name, column_name
FROM information_schema.columns
WHERE table_name in (
SELECT table_name
FROM (
SELECT table_name,
get_sequence_last_value(
substr(column_default, 10, strpos(column_default, '::regclass') - 11)
) as lv
FROM information_schema.columns
WHERE column_default LIKE 'nextval%'
) as t_seq_lv
WHERE lv >= 17485
)
AND column_name = 'manager_slug';
I understand that the solution is not complete, but I hope it can help you anyway.
I would suggest using Sentry (https://sentry.io/welcome/). In a Sentry issue you can inspect all local variables for every frame of the stack trace.
The best solution that I have found for your problem is overriding the DatabaseErrorWrapper behavior. To do that, go to django/db/utils.py and at line 86 replace dj_exc_value = dj_exc_type(*exc_value.args) with:

if getattr(exc_value, 'diag', None):  # .diag is psycopg2-specific
    a, b = exc_value.args + ("In the table '%s'" % exc_value.diag.table_name,)
    dj_exc_value = dj_exc_type(a + b)
else:
    dj_exc_value = dj_exc_type(*exc_value.args)
and when the IntegrityError appears, the message looks like this:
django.db.utils.IntegrityError: null value in column "signature" violates not-null constraint
DETAIL: Failing row contains (89, null).
In the table 'teachers_signature'
I am not sure, but this should also work with these exceptions:
DataError
OperationalError
IntegrityError
InternalError
ProgrammingError
NotSupportedError
It works for me; tell me if it works for you. Remember to edit the file in the Django installation that you are actually running.