How would you go about deep cloning a document in MongoDB (Mongoid)?
I've tried something like this:
original = Car.find(old_id)
@car = original.clone
@car._id = BSON::ObjectId.new
But I get problems with deserialization of the values afterwards.
How can I make a deep clone with all of the document's attributes except the _id?
Edit: After following Zachary's example I ran into problems with a custom serialization class on the duplicated documents.
class OptionHash
  include Mongoid::Fields::Serializable

  # Convert the keys from Strings to Symbols
  def deserialize(object)
    object.symbolize_keys!
  end

  # Convert values into Booleans
  def serialize(object)
    object.each do |key, value|
      object[key] = Boolean::MAPPINGS[value]
    end
  end
end
The object argument is nil for duplicated documents. Car.find(old_id).attributes indeed doesn't include the field with the custom serialization. Why is that, and how can I include it?
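For context, a custom serializable field like this is attached to a model through a field declaration. A minimal sketch of the wiring, assuming a hypothetical :options field on Car (the field name is an assumption for illustration, not from the original code):

class Car
  include Mongoid::Document

  # Hypothetical field using the OptionHash serializer above;
  # the name :options is assumed for illustration.
  field :options, type: OptionHash
end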
You don't need to call .clone for this; you can use the raw data from attributes. For example, the method below will assign a new id throughout the entire document wherever it finds one.
# Recursively walk an attributes hash (or an array of embedded
# documents) and replace every _id with a fresh BSON::ObjectId.
def reset_ids(attributes)
  if attributes.is_a?(Array)
    attributes.each { |element| reset_ids(element) }
  elsif attributes.is_a?(Hash)
    attributes.each do |key, value|
      if key == "_id" && value.is_a?(BSON::ObjectId)
        attributes[key] = BSON::ObjectId.new
      elsif value.is_a?(Hash) || value.is_a?(Array)
        reset_ids(value)
      end
    end
  end
  attributes
end
original = Car.find(old_id)
car_copy = Car.new(reset_ids(original.attributes))
And you now have a copy of Car. This is inefficient, though, as it has to walk the entire attributes hash to find any embedded documents nested inside other embedded documents. You are better off resetting the structure yourself if you know what it will look like. For example, if you have parts embedded in Car, you can just do:
original = Car.find(old_id)
car_copy = Car.new(original.attributes)
car_copy._id = BSON::ObjectId.new
car_copy.parts.each {|p| p._id = BSON::ObjectId.new}
This is a lot more efficient than doing a generic reset.
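In either case the copy is a brand-new, unsaved document, so persisting it is just a save call. A quick sketch, assuming the model passes validation:

car_copy.save!  # raises Mongoid::Errors::Validations if the copy is invalid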