Right now, I have a server call kicking back the following Ruby hash:
{
"id"=>"-ct",
"factualId"=>"",
"outOfBusiness"=>false,
"publishedAt"=>"2012-03-09 11:02:01",
"general"=>{
"name"=>"A Cote",
"timeZone"=>"EST",
"desc"=>"À Côté is a small-plates restaurant in Oakland's charming
Rockridge district. Cozy tables surround large communal tables in both
the main dining room and on the sunny patio to create a festive atmosphere.
Small plates reflecting the best of seasonal Mediterranean cuisine are served
family-style by a friendly and knowledgeable staff.\nMenu items are paired with
a carefully chosen selection of over 40 wines by the glass as well as a highly
diverse bottled wine menu. Specialty drinks featuring fresh fruits, rare
botaniques and fine liqueurs are featured at the bar.",
"website"=>"http://acoterestaurant.com/"
},
"location"=>{
"address1"=>"5478 College Ave",
"address2"=>"",
"city"=>"Oakland",
"region"=>"CA",
"country"=>"US",
"postcode"=>"94618",
"longitude"=>37.84235,
"latitude"=>-122.25222
},
"phones"=>{
"main"=>"510-655-6469",
"fax"=>nil
},
"hours"=>{
"mon"=>{"start"=>"", "end"=>""},
"tue"=>{"start"=>"", "end"=>""},
"wed"=>{"start"=>"", "end"=>""},
"thu"=>{"start"=>"", "end"=>""},
"fri"=>{"start"=>"", "end"=>""},
"sat"=>{"start"=>"", "end"=>""},
"sun"=>{"start"=>"", "end"=>""},
"holidaySchedule"=>""
},
"businessType"=>"Restaurant"
}
It's got several attributes which are nested, such as:
"wed"=>{"start"=>"", "end"=>""}
I need to convert this object into an unnested hash in Ruby. Ideally, I'd like to detect whether an attribute is nested and respond accordingly, e.g. when it determines that the attribute 'wed' is nested, it pulls out its data and stores it in the fields 'wed-start' and 'wed-end', or something similar.
Anyone have any tips on how to get started?
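A minimal sketch of the kind of recursive flattening I have in mind (the `flatten_hash` name and the `-` separator are just placeholders):

```ruby
# Recursively flattens nested hashes, joining parent and child keys
# with a separator. Purely illustrative; names are placeholders.
def flatten_hash(hash, prefix = nil, separator = '-')
  hash.each_with_object({}) do |(key, value), result|
    full_key = prefix ? "#{prefix}#{separator}#{key}" : key.to_s
    if value.is_a?(Hash)
      result.merge!(flatten_hash(value, full_key, separator))
    else
      result[full_key] = value
    end
  end
end

flatten_hash("wed" => { "start" => "17:30", "end" => "22:00" })
# => {"wed-start"=>"17:30", "wed-end"=>"22:00"}
```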
EDIT: the sparsify gem was released as a general solution to this problem.
Here's an implementation I worked up a couple of months ago. You'll need to parse the JSON into a hash, then use Sparsify to flatten it.
# Extend into a hash to provide sparse and unsparse methods.
#
# {'foo'=>{'bar'=>'bingo'}}.sparse #=> {'foo.bar'=>'bingo'}
# {'foo.bar'=>'bingo'}.unsparse #=> {'foo'=>{'bar'=>'bingo'}}
#
module Sparsify
def sparse(options={})
self.map do |k,v|
prefix = (options.fetch(:prefix,[])+[k])
next Sparsify::sparse( v, options.merge(:prefix => prefix ) ) if v.is_a? Hash
{ prefix.join(options.fetch( :separator, '.') ) => v}
end.reduce(:merge) || Hash.new
end
def sparse!(options={})
self.replace(sparse(options))
end
def unsparse(options={})
ret = Hash.new
sparse(options).each do |k,v|
current = ret
key = k.to_s.split( options.fetch( :separator, '.') )
current = (current[key.shift] ||= Hash.new) until (key.size<=1)
current[key.first] = v
end
return ret
end
def unsparse!(options={})
self.replace(unsparse(options))
end
def self.sparse(hsh,options={})
hsh.dup.extend(self).sparse(options)
end
def self.unsparse(hsh,options={})
hsh.dup.extend(self).unsparse(options)
end
def self.extended(base)
raise ArgumentError, "<#{base.inspect}> must be a Hash" unless base.is_a? Hash
end
end
usage:
external_data = JSON.parse( external_json )
flattened = Sparsify.sparse( external_data, :separator => '-' )
This was originally created because we were storing a set of things in Mongo, which allowed us to use sparse (dot-separated) keys on updates to modify part of a nested hash without overwriting unrelated keys.
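As a sketch of that Mongo use case (the driver and collection calls are omitted; only the shape of the "$set" document matters, and the helper below is a standalone stand-in for Sparsify), flattening a partial nested update into dot-separated keys lets the update touch only those leaf fields:

```ruby
# Flatten a nested partial update into dot-separated keys, so that a
# MongoDB-style "$set" overwrites only the targeted leaf fields rather
# than replacing the whole embedded document.
def to_sparse(hash, prefix = [])
  hash.flat_map do |key, value|
    path = prefix + [key]
    if value.is_a?(Hash)
      to_sparse(value, path).to_a
    else
      [[path.join('.'), value]]
    end
  end.to_h
end

partial_update = { 'hours' => { 'wed' => { 'start' => '17:30' } } }
set_doc = { '$set' => to_sparse(partial_update) }
# => {"$set"=>{"hours.wed.start"=>"17:30"}}
```

With a nested replacement instead, `{'hours' => {'wed' => ...}}` would clobber the other days under 'hours'; the dotted form leaves them alone.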
Here's a first cut at a complete solution. I'm sure you can write it more elegantly, but this seems fairly clear. If you save this in a Ruby file and run it, you'll get the output I show below.
class Hash
def unnest
new_hash = {}
each do |key,val|
if val.is_a?(Hash)
new_hash.merge!(val.prefix_keys("#{key}-"))
else
new_hash[key] = val
end
end
new_hash
end
def prefix_keys(prefix)
Hash[map{|key,val| [prefix + key, val]}].unnest
end
end
p ({"a" => 2, "f" => 5}).unnest
p ({"a" => {"b" => 3}, "f" => 5}).unnest
p ({"a" => {"b" => {"c" => 4}, "f" => 5}}).unnest
Output:
{"a"=>2, "f"=>5}
{"a-b"=>3, "f"=>5}
{"a-b-c"=>4, "a-f"=>5}
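Applied to the hours fragment from the question (re-declaring the same monkey-patch so the snippet runs standalone), the keys come out in the requested 'wed-start' style:

```ruby
class Hash
  # Flatten nested hashes, joining parent and child keys with "-".
  def unnest
    new_hash = {}
    each do |key, val|
      if val.is_a?(Hash)
        new_hash.merge!(val.prefix_keys("#{key}-"))
      else
        new_hash[key] = val
      end
    end
    new_hash
  end

  def prefix_keys(prefix)
    Hash[map { |key, val| [prefix + key, val] }].unnest
  end
end

hours = { "mon" => { "start" => "", "end" => "" }, "holidaySchedule" => "" }
p({ "hours" => hours }.unnest)
# => {"hours-mon-start"=>"", "hours-mon-end"=>"", "hours-holidaySchedule"=>""}
```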