I have a table that contains millions of records.
I'm altering one column's data type from money to decimal.
I'm noticing that the ALTER statement takes a long time to execute, probably because of the sheer amount of data.
Is there a way to improve performance in this scenario?
All of the millions of rows have to be changed at the same time, in one transaction.
Another option is to create a new table, insert the rows in batches, drop the old table, and rename the new one. However, this approach is liable to take longer.
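A batched copy might be sketched like this, assuming the target table already exists with the new column type and that the source table has an ascending INT key named ID (the table and column names here mirror the example below and are assumptions):

```sql
-- Copy rows in batches of 50,000, tracking the highest ID copied so far
-- so each INSERT commits in its own small transaction and the log stays small.
DECLARE @BatchSize INT = 50000;
DECLARE @LastID INT = 0;
DECLARE @Rows INT = 1;

WHILE @Rows > 0
BEGIN
    INSERT INTO dbo.tmp_MyTable (ID, Name, MyChangedField)
    SELECT TOP (@BatchSize) ID, Name, CAST(MyChangedField AS decimal(8,2))
    FROM dbo.MyTable
    WHERE ID > @LastID
    ORDER BY ID;

    SET @Rows = @@ROWCOUNT;

    IF @Rows > 0
        SELECT @LastID = MAX(ID) FROM dbo.tmp_MyTable;
END
```

Note that batching like this deliberately spreads the work across many small transactions, so it only applies if you can relax the one-transaction requirement stated above.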
One way is to create a temporary table with the new column type, copy the data from the original table to the temp, drop the original, and then rename the temp.
CREATE TABLE dbo.tmp_MyTable
(ID INT, Name varchar(100), MyChangedField decimal(8,2));

INSERT INTO dbo.tmp_MyTable (ID, Name, MyChangedField)
SELECT ID, Name, CAST(MyChangedField AS decimal(8,2))
FROM dbo.MyTable;

DROP TABLE dbo.MyTable;

EXECUTE sp_rename N'dbo.tmp_MyTable', N'MyTable', 'OBJECT';
Keep in mind, this is an overly simple example. If your original table has indexes, keys, and/or default values, you'll have to handle those as well.
One easy trick is to make the change in the SQL Server Management Studio table designer and generate the change script, which shows everything that needs to be done.
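For instance, if the original table had a primary key and a default constraint, you would recreate them on the renamed table; the constraint names and the default value below are purely illustrative:

```sql
-- Illustrative only: recreate a clustered primary key and a default
-- constraint after the rename. Constraint names are assumptions.
ALTER TABLE dbo.MyTable
ADD CONSTRAINT PK_MyTable PRIMARY KEY CLUSTERED (ID);

ALTER TABLE dbo.MyTable
ADD CONSTRAINT DF_MyTable_MyChangedField DEFAULT (0) FOR MyChangedField;
```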