I'm looking for ways to validate that data being inserted into MongoDB matches a schema, e.g. has all the required fields and correct data types. I know Mongo itself is schemaless, but if I could validate the data at the application level before passing it to the Mongo driver, that'd be good.
I've looked at JSON-Schema. My biggest hesitation there is that I can only find one Java library for schema validation, and I don't know whether I should trust it; I'd prefer a library backed by Apache or Google.
I've also looked at Apache Thrift, Avro, and Protocol Buffers, which aren't specifically validation frameworks, but they do each have a concept of a schema. I'd be interested if there's a way to piggy-back off of one of those to perform validation.
Any suggestions? Or should I embrace the schemalessness of Mongo and not even bother trying to validate the data?
The upcoming MongoDB 3.2 release adds document validation (slides).
You can specify validation rules for each collection via the validator option, using almost all MongoDB query operators (except $geoNear, $near, $nearSphere, $text, and $where). You can read more about it in one of my answers.
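As a rough illustration, here is how a validator might be attached when creating a collection with the 3.2-era MongoDB Java driver. This is a minimal sketch, not code from the slides; the database, collection, and field names are made up for the example.

```java
import com.mongodb.MongoClient;
import com.mongodb.client.MongoDatabase;
import com.mongodb.client.model.CreateCollectionOptions;
import com.mongodb.client.model.ValidationOptions;
import org.bson.Document;

public class ValidatorExample {
    public static void main(String[] args) {
        // Connects to localhost:27017; "test" and "users" are illustrative names.
        MongoClient client = new MongoClient();
        MongoDatabase db = client.getDatabase("test");

        // Validation rules are written with ordinary query operators:
        // "name" must exist and be a string, "age" must be >= 0.
        Document validator = Document.parse(
            "{ \"name\": { \"$type\": \"string\" }, \"age\": { \"$gte\": 0 } }");

        db.createCollection("users",
            new CreateCollectionOptions()
                .validationOptions(new ValidationOptions().validator(validator)));

        client.close();
    }
}
```

Once the collection is created this way, inserts and updates that violate the rules are rejected by the server, so the check happens even if some application code bypasses your own validation layer.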
Not sure if you're still looking, but an object document mapper (ODM) would do the trick.
I have checked out both Morphia and Spring Data MongoDB, and either would take care of your type-safety concerns, since they map your documents directly to Java classes.
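To give a feel for the ODM approach, here is a minimal sketch of a Morphia-mapped entity, assuming Morphia 1.x (the org.mongodb.morphia packages); the class and field names are purely illustrative.

```java
import org.bson.types.ObjectId;
import org.mongodb.morphia.annotations.Entity;
import org.mongodb.morphia.annotations.Id;

// The Java class defines the shape of the stored document, so field names
// and types are checked by the compiler before anything reaches the driver.
@Entity("users")
public class User {
    @Id
    private ObjectId id;

    private String name;
    private int age;

    // getters/setters omitted for brevity
}
```

You then persist and load instances of this class through Morphia's Datastore (or the equivalent repository abstraction in Spring Data MongoDB), rather than building raw documents by hand.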