What's the correct way to rescue an exception and simply continue processing? I have an app that has Folders and Items, with a habtm relationship through a join table called folders_items. That table has a unique constraint ensuring that there are no duplicate item/folder combinations. If the user tries to add an item to the same folder several times, I obviously don't want the additional rows added; but I don't want to stop processing, either.
Postgres automatically throws an exception when the unique constraint is violated, so I tried to ignore it in the controller as follows:
rescue_from PG::Error, :with => :do_nothing

def do_nothing
  head :ok   # swallow the error and respond with 200
end
This works fine on single insertions. The controller executes the render with a status code of 200. However, I have another method that does bulk inserts in a loop. In that method, the controller exits the loop when it encounters the first duplicate row, which is not what I want. At first, I thought that the loop must be getting wrapped in a transaction that's getting rolled back, but it isn't -- all the rows prior to the duplicate get inserted. I want it to simply ignore the constraint exception and move to the next item. How do I prevent the PG::Error exception from interrupting this?
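For reference, the bulk method is essentially a loop of inserts; roughly like this (the names are illustrative, not the real code):

def add_items
  folder = Folder.find(params[:folder_id])
  params[:item_ids].each do |item_id|
    # each << inserts a row into folders_items; a duplicate violates the unique constraint
    folder.items << Item.find(item_id)
  end
  render :nothing => true
end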
If you put a Rails validator on your model, then you can control your flow without throwing an exception.
class FolderItem < ActiveRecord::Base
  self.table_name = 'folders_items'  # the habtm join table from the question

  belongs_to :item
  belongs_to :folder

  validates_uniqueness_of :item_id, scope: :folder_id, on: :create
end
Then you can use
FolderItem.create(folder: folder, item: item)
create returns the record whether or not it was actually saved; check persisted? on the result (or build the record and call save, which returns true or false) to find out if the row was inserted. It will not throw an exception on a duplicate. Using FolderItem.create! instead would raise ActiveRecord::RecordInvalid if the association is not created.
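For example, a bulk loop can then skip duplicates without rescuing anything. A minimal sketch, assuming folder is already loaded and item_ids comes from the request (both names are illustrative):

item_ids.each do |item_id|
  record = FolderItem.create(folder: folder, item_id: item_id)
  logger.info "skipped duplicate item #{item_id}" unless record.persisted?
end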
The reason you are seeing PG errors is that Rails itself thinks the model is valid on save, because the model class has no uniqueness validation of its own. Of course, you do have a unique constraint in the DB, which surprises Rails and causes it to blow up at the last minute.
If performance is critical then perhaps ignore this advice. Having a uniqueness validation on a Rails model causes it to perform a SELECT before every INSERT in order to do the uniqueness check at the Rails level, potentially doubling the number of queries your loop performs. Just catching the errors at the database level, as you are doing, might be a reasonable trade of elegance for performance.
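To see where the extra query comes from, this is roughly what each insert does once the validation is in place (the SQL is paraphrased, not copied from a real log):

FolderItem.create(folder: folder, item: item)
# the uniqueness validation first runs something like:
#   SELECT 1 FROM folders_items WHERE item_id = $1 AND folder_id = $2 LIMIT 1
# and only if no row is found does ActiveRecord issue:
#   INSERT INTO folders_items (folder_id, item_id) VALUES ($1, $2)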
(edit) TL;DR: Always have the unique constraint in the DB; a Rails-level validation alone can be raced by concurrent requests. Also having the model validation lets ActiveRecord/ActiveModel report the error before the DB throws one.
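For reference, the DB-side constraint can be added with a migration along these lines (assuming the folders_items join table from the question, which the OP already has constrained):

class AddUniqueIndexToFoldersItems < ActiveRecord::Migration
  def change
    # one row per folder/item pair, enforced by Postgres
    add_index :folders_items, [:folder_id, :item_id], unique: true
  end
end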
In general, your exception handling should be at the closest point to the error that you can do something sensible with the exception. In your case, you'd want your rescue inside your loop, for example:
stuff.each do |h|
  begin
    Model.create(h)
  rescue ActiveRecord::RecordNotUnique => e
    next if(e.message =~ /unique.*constraint.*INDEX_NAME_GOES_HERE/)
    raise
  end
end
A couple points of interest:
- You'll get an ActiveRecord::RecordNotUnique error rather than the underlying PG::Error. AFAIK, you'd only see a PG::Error if you were talking directly to the database rather than going through ActiveRecord.
- Replace INDEX_NAME_GOES_HERE with the real name of the unique index.
- Note the next if(...) bit followed by the argumentless raise (i.e. re-raise the exception if it isn't what you're expecting to see).