We have a table with an auto-increment int primary key whose max value is now at the limit of the T-SQL int type. When we try to re-seed the table (because there are large gaps in the keys, the row count is nowhere near the max int value), the seed somehow keeps getting reset to the max int value.
Obviously, this causes serious problems. The PK never should have gotten to this value and changing the data type would be a big task. Re-seeding should be sufficient, but it's not working!
How could it keep getting reset?
Edit: to clarify the situation, this query:

    SELECT MIN(CategoryID), MAX(CategoryID) FROM dbo.tblCategories

returns -2147483647, 2147483647... meaning that there are actual PK values at the min and max of the int type.
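For reference, "re-seeding" here means running something along these lines (the new seed value of 30000000 is just an illustration):

    DBCC CHECKIDENT ('dbo.tblCategories', RESEED, 30000000);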
A primary key will not accept NULL values, whereas a unique key can. A table can have only one primary key, whereas there can be multiple unique keys on a table. Defining a primary key automatically creates a clustered index by default, whereas a unique key creates a non-clustered index.
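A quick illustration of those differences (table and constraint names are hypothetical):

    CREATE TABLE dbo.Demo
    (
        ID   int         NOT NULL CONSTRAINT PK_Demo PRIMARY KEY,  -- clustered by default, NULLs not allowed
        Code varchar(10) NULL     CONSTRAINT UQ_Demo_Code UNIQUE   -- non-clustered by default, one NULL allowed in SQL Server
    );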
When an AUTO_INCREMENT (identity) column reaches the upper limit of its data type, the next attempt to generate a sequence number fails. That is why it is advisable to choose an integer data type large enough for the column to hold the maximum sequence value you will ever need.
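In T-SQL you can see the failure with a throwaway table seeded at the int maximum (names are hypothetical):

    CREATE TABLE dbo.tblOverflowDemo
    (
        ID  int IDENTITY(2147483647, 1) PRIMARY KEY,
        Val varchar(10)
    );

    INSERT INTO dbo.tblOverflowDemo (Val) VALUES ('first');   -- succeeds, uses 2147483647
    INSERT INTO dbo.tblOverflowDemo (Val) VALUES ('second');  -- fails with an arithmetic overflow error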
A PRIMARY KEY constraint cannot be altered in place; you can only drop it and create it again.
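So any change to the key ends up looking like this (the constraint name is an assumption, and any foreign keys referencing the key must be dropped first):

    ALTER TABLE dbo.tblCategories DROP CONSTRAINT PK_tblCategories;
    ALTER TABLE dbo.tblCategories ADD CONSTRAINT PK_tblCategories PRIMARY KEY CLUSTERED (CategoryID);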
You can "reseed" a table so that the next-assigned identity column will be less than the current max value in the table. However, any subsquent DBCC CHECKIDENT
will reset the internal counter back to the max value currently in the column. Perhaps that's where the reset is coming from? And of course an insert will eventually hit a duplicate value, resulting in Interesting Times for the production support crew.
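A minimal sketch of that failure mode, assuming the table from the question:

    -- Reseed below the current max (the value is illustrative):
    DBCC CHECKIDENT ('dbo.tblCategories', RESEED, 30000000);

    -- Any later check run without an explicit seed...
    DBCC CHECKIDENT ('dbo.tblCategories');
    -- ...sees rows above the new seed and snaps the counter back up to MAX(CategoryID).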
By and large, you're in trouble. I recommend working up a one-time script to remove/reset the uber-high ID values. Updating the rows in place (and all related foreign key values) is one option, though it would involve disabling the foreign key constraints and who knows what else, so I wouldn't recommend it. The other is to create exact copies of all the "high-id" rows using more pragmatic ID values, and then delete the original entries. This is a hack, but it would result in a much more maintainable database.
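A rough sketch of that copy-and-delete option, assuming a CategoryName column and treating everything above 2000000000 as "uber-high" (both are assumptions):

    SET IDENTITY_INSERT dbo.tblCategories ON;

    -- Copy the high-id rows to pragmatic, explicitly chosen IDs:
    INSERT INTO dbo.tblCategories (CategoryID, CategoryName)
    SELECT 30000000 + ROW_NUMBER() OVER (ORDER BY CategoryID), CategoryName
    FROM dbo.tblCategories
    WHERE CategoryID > 2000000000;

    SET IDENTITY_INSERT dbo.tblCategories OFF;

    -- After repointing any foreign keys at the new rows:
    DELETE FROM dbo.tblCategories
    WHERE CategoryID > 2000000000;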
Oh, and track down the folks who put those high-id values in and, at the very least, revoke their access rights to the database.
I was involved with a project that had a similar issue - though in our case, someone had chosen a field with a 4-digit number as the primary key, which kinda didn't work as we approached 9999 records...
In our case, we created a new column on the table and populated that, changed the primary key to that column, and repointed all our foreign key relationships to the new key.
Very messy, but it did work.
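In T-SQL, that migration looks roughly like the following script, with hypothetical names; every foreign key pointing at the old column has to be dropped and re-created along the way:

    -- Add and populate the replacement column:
    ALTER TABLE dbo.tblCategories ADD NewCategoryID bigint NULL;
    GO
    UPDATE dbo.tblCategories SET NewCategoryID = CategoryID;  -- or assign fresh values here
    GO
    ALTER TABLE dbo.tblCategories ALTER COLUMN NewCategoryID bigint NOT NULL;
    GO
    -- Swap the primary key (referencing foreign keys must be dropped first):
    ALTER TABLE dbo.tblCategories DROP CONSTRAINT PK_tblCategories;
    ALTER TABLE dbo.tblCategories ADD CONSTRAINT PK_tblCategories PRIMARY KEY (NewCategoryID);
    -- ...then re-create each foreign key against NewCategoryID.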