When creating a model with an indexed models.CharField(...) field, a varchar_pattern_ops index is created as well.
This is the table generated in PostgreSQL:
        Table "public.logger_btilog"
  Column  |   Type   | Modifiers
----------+----------+-----------
 md5hash  | text     |
 id       | integer  | not null
Indexes:
    "logger_btilog_pkey" PRIMARY KEY, btree (id)
    "logger_btilog_md5hash_6454d7bb20588b61_like" btree (md5hash varchar_pattern_ops)
I want to remove that varchar_pattern_ops index in a migration and add a hash index on that field.
I tried doing this:
# models.py
class Btilog(models.Model):
    md5hash = models.TextField(db_index=False)
    [...]
And in the migration I also tried forcing db_index=False:
# 0013_migration.py
# -*- coding: utf-8 -*-
from __future__ import unicode_literals

from django.db import models, migrations


class Migration(migrations.Migration):

    dependencies = [
        ('logger', '0014_btilog_id'),
    ]

    operations = [
        # This should remove all indexes on md5hash, but it does not work
        migrations.AlterField(
            model_name='btilog',
            name='md5hash',
            field=models.TextField(null=True, blank=True, db_index=False),
        ),
        migrations.RunSQL(
            "create index logger_btilog_md5hash_hashindex on logger_btilog using hash(md5hash);",
            "drop index logger_btilog_md5hash_hashindex;"
        ),
    ]
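(A more direct alternative, not part of the original attempt and only a sketch: drop the auto-generated index by name with an extra RunSQL operation. The 6454d7bb20588b61 suffix is database-specific, so verify the actual index name with \di in psql before hard-coding it.)

```python
# Sketch: forward SQL drops the auto-created "_like" index by name.
# ASSUMPTION: the index name below matches the one in *your* database;
# the hash suffix varies per database, so check it with \di in psql first.
drop_like_index = (
    'DROP INDEX IF EXISTS "logger_btilog_md5hash_6454d7bb20588b61_like";'
)
# Reverse SQL recreates the index so the migration stays reversible.
create_like_index = (
    'CREATE INDEX "logger_btilog_md5hash_6454d7bb20588b61_like" '
    'ON logger_btilog (md5hash varchar_pattern_ops);'
)
# These strings would be passed to
# migrations.RunSQL(drop_like_index, create_like_index) as one more operation.
```

Note that Django may recreate the index on a later AlterField of that column, which is why the accepted answer below goes through the SchemaEditor instead.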
After running the migrations, these are the indexes in the database:
relation | size
--------------------------------------------------------------------+---------
public.logger_btilog | 7185 MB
public.logger_btilog_md5hash_6454d7bb20588b61_like | 1442 MB
public.logger_btilog_md5hash_hashindex | 1024 MB
public.logger_btilog_pkey | 548 MB
Note that public.logger_btilog_md5hash_6454d7bb20588b61_like is the index I want to delete. This index is added automatically by Django.
More info on that index
vtfx=# \d logger_btilog_md5hash_6454d7bb20588b61_like
Index "public.logger_btilog_md5hash_6454d7bb20588b61_like"
Column | Type | Definition
---------+------+------------
md5hash | text | md5hash
btree, for table "public.logger_btilog"
Footnote: I'm not confused about the usage of a hash index. I only want to do = (strictly equal) WHERE searches on the md5hash field, so a hash index should be the fastest and will occupy less space than a btree index (Django's default).
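(To illustrate the footnote: MD5 values are fixed-length hex strings that are only ever compared for equality, which is exactly the access pattern a hash index serves.)

```python
import hashlib

# MD5 digests are always 32 hexadecimal characters, and the only query here is
# an equality test (WHERE md5hash = '...'), so a hash index fits; a btree's
# ordering and prefix-match (LIKE 'abc%') support would go unused.
digest = hashlib.md5(b'example row content').hexdigest()
assert len(digest) == 32
assert set(digest) <= set('0123456789abcdef')
```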
Answer update notice:
Django < 1.11: use this answer.
Django >= 1.11: use @Cesar Canassa's answer.
OK, I found some info here: https://docs.djangoproject.com/en/1.8/_modules/django/db/backends/base/schema/#BaseDatabaseSchemaEditor.alter_field and made a manual RunPython migration to delete the varchar_pattern_ops index using the SchemaEditor:
# -*- coding: utf-8 -*-
from __future__ import unicode_literals

import re

from django.db import models, migrations


def drop_md5hash_varchar_pattern_ops_index(apps, schema_editor):
    # Code based on https://docs.djangoproject.com/en/1.8/_modules/django/db/backends/base/schema/#BaseDatabaseSchemaEditor.alter_field
    model = apps.get_model("logger", "Btilog")
    index_names = schema_editor._constraint_names(model, index=True)
    for index_name in index_names:
        if re.search('logger_btilog_md5hash_.+_like', index_name):
            print('dropping index {}'.format(index_name))
            schema_editor.execute(
                schema_editor._delete_constraint_sql(schema_editor.sql_delete_index, model, index_name)
            )
class Migration(migrations.Migration):

    dependencies = [
        ('logger', '0012_auto_20150529_1745'),
    ]

    operations = [
        # Remove the annoying index using a hack
        migrations.RunPython(
            drop_md5hash_varchar_pattern_ops_index
        ),
    ]
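(The regex in the RunPython function matches the auto-generated name without hard-coding the per-database hash suffix. A quick standalone sanity check, with an illustrative list of index names as _constraint_names() might return them:)

```python
import re

# Same pattern as in the migration: table + column + arbitrary hash + "_like".
pattern = re.compile(r'logger_btilog_md5hash_.+_like')

# Hypothetical index names for this table; only the "_like" one should match.
names = [
    'logger_btilog_pkey',
    'logger_btilog_md5hash_6454d7bb20588b61_like',
    'logger_btilog_md5hash_hashindex',
]
matches = [n for n in names if pattern.search(n)]
assert matches == ['logger_btilog_md5hash_6454d7bb20588b61_like']
```

Also note that RunPython with no reverse function makes the migration irreversible; passing migrations.RunPython.noop as the second argument keeps it reversible as a no-op.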
You can avoid the creation of the varchar_pattern_ops
"LIKE" index altogether by using the new Model Meta indexes option that was added in Django 1.11. For example, instead of writing your Model like this:
class MyModel(models.Model):
    my_field = models.CharField(max_length=64, db_index=True)
You need to set the index with the Model Meta option:
class MyModel(models.Model):
    my_field = models.CharField(max_length=64)

    class Meta:
        indexes = [
            models.Index(fields=['my_field']),
        ]
By doing so, Django will not create the duplicated index.
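(Going one step further, a sketch not from the original answers: on Django 2.2+ with PostgreSQL, the hash index the question asked for can itself be declared in Meta via django.contrib.postgres.indexes.HashIndex, avoiding raw RunSQL entirely. The index name here is our own choice, not an auto-generated one; this is a model declaration fragment, not runnable outside a configured Django project.)

```python
from django.contrib.postgres.indexes import HashIndex
from django.db import models


class Btilog(models.Model):
    md5hash = models.TextField(null=True, blank=True)

    class Meta:
        indexes = [
            # Hash index for equality-only lookups on md5hash.
            HashIndex(fields=['md5hash'], name='logger_btilog_md5hash_hash'),
        ]
```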