I am following the Import CSV RailsCast, and it is straightforward.
The issue is that it only deals with a CSV file containing data for a single model.
Say I have a CSV file that I am trying to import into my Listing
model. Each row/listing has a column called Building
whose value is actually the name of the building associated with that listing (i.e. @listing.building.name).
How do I handle those cases in the import?
This is the closest Ryan gets in that RailsCast, and it is for the Product
model in his case:
def self.import(file)
  CSV.foreach(file.path, headers: true) do |row|
    product = find_by_id(row["id"]) || new
    product.attributes = row.to_hash.slice(*accessible_attributes)
    product.save!
  end
end
All that happens there is a check for whether the product exists: if it does, its attributes are updated; if it doesn't, a new one is created.
I'm not quite sure how to handle the associations in this case, especially given that if an associated record doesn't exist, it needs to be created during the import.
So, going back to my building.name
example earlier: if Building.find_by_name(name)
returns nothing, a new building record should be created.
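The find-or-create behavior I'm after can be sketched without Rails at all, using a hash as a stand-in for the buildings table (the CSV columns and names here are hypothetical, just to illustrate the deduplication):

```ruby
require "csv"

# Hypothetical in-memory stand-in for the Building model's table,
# keyed by building name.
buildings = {}

csv_data = <<~CSV
  id,address,building_name
  1,12 Main St,The Plaza
  2,99 Side Ave,The Plaza
CSV

listings = CSV.parse(csv_data, headers: true).map do |row|
  name = row["building_name"]
  # Find-or-create: reuse the building if we've already seen this
  # name, otherwise create a new entry for it.
  building = buildings[name] ||= { name: name }
  { id: row["id"].to_i, address: row["address"], building: building }
end
```

Both listings end up pointing at the same building object, which is exactly what I want the import to do against the real Building model.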
Thoughts?
Try this:
def self.import(file)
  CSV.foreach(file.path, headers: true) do |row|
    product = find_by_id(row["id"]) || new
    product.attributes = row.to_hash.slice(*accessible_attributes)
    product.save!

    building = product.buildings.find_by_name(row['building_name'])
    building ||= product.buildings.build
    building.attributes = row.to_hash.slice(*building_accessible_attributes)
    building.save!
  end
end
UPDATE: updated answer using newer Rails 3 methods
def self.import(file)
  CSV.foreach(file.path, headers: true) do |row|
    product = where(id: row["id"])
              .first_or_create!(row.to_hash.slice(*accessible_attributes))
    product.buildings.where(name: row['building_name'])
           .first_or_create!(row.to_hash.slice(*building_accessible_attributes))
  end
end
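For reference, first_or_create! returns the first record matching the relation's conditions, or creates (and validates) one with those conditions merged with the passed attributes. A minimal stdlib-only emulation of that lookup logic, with an array of hashes playing the role of a table (all names here are hypothetical stand-ins, not ActiveRecord internals):

```ruby
# Hypothetical stand-in for a database table backed by an array of hashes.
Table = Struct.new(:rows) do
  # Mimics relation.where(conds).first_or_create!(attrs):
  # return the first row matching all conditions, or append a new row
  # built from the conditions merged with the extra attributes.
  def first_or_create!(conds, attrs = {})
    rows.find { |r| conds.all? { |k, v| r[k] == v } } ||
      conds.merge(attrs).tap { |r| rows << r }
  end
end

products = Table.new([])
a = products.first_or_create!({ id: 1 }, name: "Widget")   # creates the row
b = products.first_or_create!({ id: 1 }, name: "Ignored")  # finds it instead
```

The second call finds the existing row, so its extra attributes are never applied; that is why in the import above each CSV row updates or reuses a record rather than duplicating it.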