I've got ~15k rows in MSSQL 2005 that I want to migrate into CouchDB, where one row becomes one document. I have a CLR UDF that writes n rows to a schema-bound XML file, and an XSLT transform that converts that XML to JSON.
With these existing tools, I'm thinking I can go MSSQL to XML to JSON. If I batch n rows per JSON file, I can script cURL to loop through the files and POST each one to CouchDB's bulk API, _bulk_docs.
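Roughly, I'm imagining the load step as a loop over the batch files, POSTing each one to _bulk_docs. A minimal sketch in Python (the database URL and file names are just placeholders, and each file would have to be shaped as {"docs": [...]}):

import glob
import json
import urllib.request

COUCH_URL = "http://localhost:5984/mydb/_bulk_docs"  # placeholder server/database

# POST every pre-built batch file to the bulk endpoint; equivalent to
#   curl -X POST -H "Content-Type: application/json" -d @batch_001.json <COUCH_URL>
for path in sorted(glob.glob("batch_*.json")):        # hypothetical file naming
    with open(path, "rb") as f:
        payload = f.read()                            # expected shape: {"docs": [ ... ]}
    req = urllib.request.Request(COUCH_URL, data=payload,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        result = json.loads(resp.read())              # one status entry per document
        print(path, resp.status, len(result), "docs")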
Will this work? Has anybody done a migration like this before? Can you recommend a better way?
I've done a few conversions from legacy SQL databases to CouchDB so far, and I've always taken a somewhat different approach: rather than going through intermediate files, I read straight from the source database and write directly to CouchDB.

My import code usually looks like this:
import couchdb.client

def main():
    # parse_commandline(), get_kundennummers() and get_kunde() are my own helpers (not shown)
    options = parse_commandline()
    server = couchdb.client.Server(options.couch)
    db = server[options.db]
    for kdnnr in get_kundennummers():          # iterate over customer numbers from the legacy DB
        data = vars(get_kunde(kdnnr))          # fetch one customer record as a dict of attributes
        doc = {'name1': data.get('name1', ''),
               'strasse': data.get('strasse', ''),
               'plz': data.get('plz', ''),
               'ort': data.get('ort', ''),
               'tel': data.get('tel', ''),
               'kundennr': data.get('kundennr', '')}
        # update existing doc or insert a new one
        newdoc = db.get(kdnnr, {})
        newdoc.update(doc)
        if newdoc != db.get(kdnnr, {}):        # write only if something actually changed
            db[kdnnr] = newdoc
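If 15k single-document writes turn out to be too slow, couchdb-python can also batch them: Database.update() takes a list of documents and sends them through _bulk_docs in a single request. A rough sketch for an initial load (the batch size is arbitrary, and re-running it against existing documents would additionally need each doc's current _rev):

batch = []
for kdnnr in get_kundennummers():
    doc = dict(vars(get_kunde(kdnnr)), _id=kdnnr)   # use the customer number as the doc id
    batch.append(doc)
    if len(batch) >= 500:                           # arbitrary batch size
        db.update(batch)                            # one POST to /<db>/_bulk_docs
        batch = []
if batch:
    db.update(batch)                                # flush the final partial batch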