I have a Project model which accepts nested attributes for Task.
class Project < ActiveRecord::Base
  has_many :tasks
  accepts_nested_attributes_for :tasks, :allow_destroy => true
end
class Task < ActiveRecord::Base
  validates_uniqueness_of :name
end
The uniqueness validation in the Task model causes a problem when updating a Project.
When editing a project, I delete a task named T1 and then add a new task with the same name T1; the uniqueness validation prevents the project from saving.
The params hash looks something like this:
"tasks_attributes" => {
  "0" => { "id" => "1", "name" => "T1", "_destroy" => "1" },
  "1" => { "name" => "T1" }
}
Validation on the tasks runs before the old task is destroyed, so validation fails. Any idea how to validate such that it doesn't consider tasks marked for destruction?
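To make the failure concrete, here is a minimal sketch of the failing update (the project and task ids are illustrative):

project = Project.find(1)
project.update_attributes(
  "tasks_attributes" => {
    "0" => { "id" => "1", "name" => "T1", "_destroy" => "1" },
    "1" => { "name" => "T1" }
  }
)
# => false: the new task's validates_uniqueness_of still sees the old
#    "T1" row in the database, because the destroy only happens during
#    save, after validations have already run.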
Andrew France created a patch in this thread, where the validation is done in memory.
class Author
  has_many :books

  # Could easily be made a validation-style class method of course
  validate :validate_unique_books

  def validate_unique_books
    validate_uniqueness_of_in_memory(
      books, [:title, :isbn], 'Duplicate book.')
  end
end
module ActiveRecord
  class Base
    # Validate that the objects in +collection+ are unique
    # when compared against all their non-blank +attrs+. If not,
    # add +message+ to the base errors.
    def validate_uniqueness_of_in_memory(collection, attrs, message)
      hashes = collection.inject({}) do |hash, record|
        key = attrs.map { |a| record.send(a).to_s }.join
        if key.blank? || record.marked_for_destruction?
          key = record.object_id
        end
        hash[key] = record unless hash[key]
        hash
      end
      if collection.length > hashes.length
        self.errors.add_to_base(message)
      end
    end
  end
end
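A quick usage sketch under these definitions, assuming Author is an ActiveRecord model with title and isbn columns on books (the values are made up):

author = Author.new
author.books.build(:title => 'Dup', :isbn => '123')
author.books.build(:title => 'Dup', :isbn => '123')
author.valid?  # => false; the base errors now include 'Duplicate book.'

Note that records marked for destruction are keyed by their object_id rather than their attribute values, so a task being destroyed and a new task with the same name no longer collide.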
As I understand it, Rainer's approach of validating in memory would not be practical in my case, as I have a lot of "books", 500K and growing. Bringing them all into memory would be a big hit.
The solution I came up with is to:
Place the uniqueness constraint in the database (which I've found is always a good idea, as in my experience Rails does not always do a good job here) by adding the following to your migration file in db/migrate/:
add_index :tasks, [:project_id, :name], :unique => true
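For completeness, a sketch of a full migration around that call (the class name, file name, and timestamp are hypothetical):

# db/migrate/20130101000000_add_unique_index_to_tasks.rb
class AddUniqueIndexToTasks < ActiveRecord::Migration
  def self.up
    add_index :tasks, [:project_id, :name], :unique => true
  end

  def self.down
    remove_index :tasks, :column => [:project_id, :name]
  end
end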
In the controller, place the save or update_attributes inside a transaction, and rescue the database exception. E.g.,
def update
  @project = Project.find(params[:id])
  begin
    Project.transaction do
      if @project.update_attributes(params[:project])
        redirect_to(project_path(@project))
      else
        render(:action => :edit)
      end
    end
  rescue
    # ... we have an exception; make sure it is a DB uniqueness violation
    # ... go down params[:project] to see which item is the problem
    # ... and add an error to base
    render(:action => :edit)
  end
end
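As a sketch of one way to fill in that rescue clause, assuming a Rails version that wraps the adapter error in ActiveRecord::RecordNotUnique (the error message is my own choosing):

def update
  @project = Project.find(params[:id])
  if @project.update_attributes(params[:project])
    redirect_to(project_path(@project))
  else
    render(:action => :edit)
  end
rescue ActiveRecord::RecordNotUnique
  # the unique index on [:project_id, :name] was violated
  @project.errors.add(:base, 'Task names must be unique within a project')
  render(:action => :edit)
end

Note that update_attributes already runs its writes inside a transaction of its own, so the explicit transaction block above is optional.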
For Rails 4.0.1, this issue is marked as fixed by this pull request: https://github.com/rails/rails/pull/10417
If you have a table with a unique index on a field, and you mark a record for destruction and build a new record with the same value in that field, then when you call save a database-level unique index error will be raised.
Personally this still doesn't work for me, so I don't think it's completely fixed yet.
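The scenario described, as a sketch against the Project/Task models from the question (with the unique index in place):

project = Project.find(1)
project.tasks.first.mark_for_destruction
project.tasks.build(:name => 'T1')  # same name as the task being destroyed
project.save
# Depending on the Rails version this either succeeds (destroys run first)
# or raises ActiveRecord::RecordNotUnique, which matches the mixed results
# reported here.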
Rainer Blessing's answer is good, but it's better when we can mark which tasks are duplicated.
class Project < ActiveRecord::Base
  has_many :tasks, inverse_of: :project
  accepts_nested_attributes_for :tasks, allow_destroy: true
end

class Task < ActiveRecord::Base
  belongs_to :project

  validates_each :name do |record, attr, value|
    record.errors.add(attr, :taken) if record.project.tasks.map(&:name).count(value) > 1
  end
end
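A quick sketch of what this buys you (names illustrative):

project = Project.new
project.tasks.build(name: 'T1')
project.tasks.build(name: 'T1')
project.valid?
# => false; both duplicate tasks get a :taken error on :name,
#    so the form can highlight each offending row

The inverse_of on has_many is what makes record.project, and thus the in-memory sibling tasks, visible from inside the validation before anything is saved.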