I have been working with base64 encoding. I have successfully encoded images to NSString from NSData, and I have also decoded them back to NSData.
Now I want to store images in Core Data. Would it be best to store an NSString, NSData, or the third option, a transformable attribute?
The reason I convert images to NSString is that I want to store them in XML too.
Thanks in advance.
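For reference, the base64 round-trip described in the question can be sketched like this, assuming iOS 7+ for the built-in base64 API and a `UIImage` named `image` that already exists:

```objc
#import <UIKit/UIKit.h>

// Image -> NSData -> NSString (suitable for embedding in XML).
NSData *imageData = UIImagePNGRepresentation(image);
NSString *encoded = [imageData base64EncodedStringWithOptions:0];

// NSString -> NSData (and back to an image if needed).
NSData *decoded = [[NSData alloc] initWithBase64EncodedString:encoded options:0];
UIImage *restored = [UIImage imageWithData:decoded];
```

On targets older than iOS 7 you would need a third-party base64 category instead, since `base64EncodedStringWithOptions:` is not available there.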
Transformable (as per @timthetoolman) is the easiest way. If your images are large, though, as of iOS 5 the right way to do this is to use the Binary Data attribute type and check "Allows External Storage", so that Core Data can store the large blob of data outside of the database. This can be much more efficient.
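In code, an externally stored Binary Data attribute is used exactly like any other attribute; "Allows External Storage" is purely a checkbox in the model editor, and Core Data decides where the bytes actually live. A minimal sketch, assuming a hypothetical entity `Photo` with a Binary Data attribute `imageData` and an existing `NSManagedObjectContext` named `context`:

```objc
#import <UIKit/UIKit.h>
#import <CoreData/CoreData.h>

// Insert a new Photo and hand its binary attribute the image bytes.
NSManagedObject *photo =
    [NSEntityDescription insertNewObjectForEntityForName:@"Photo"
                                  inManagedObjectContext:context];
[photo setValue:UIImageJPEGRepresentation(image, 0.8) forKey:@"imageData"];

// Save; Core Data writes large blobs to external files transparently.
NSError *error = nil;
if (![context save:&error]) {
    NSLog(@"Save failed: %@", error);
}
```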
Use an NSValueTransformer subclass to convert your image to an NSData object, then store the data blob in Core Data. You can register your value transformer subclass in the modeling tool. Apple's PhotoLocations example shows how, as does this tutorial.
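The transformer itself is small. A sketch of such a subclass (the class name `ImageToDataTransformer` is an assumption; set it as the transformer name on the transformable attribute in the model editor):

```objc
#import <UIKit/UIKit.h>

@interface ImageToDataTransformer : NSValueTransformer
@end

@implementation ImageToDataTransformer

// The store-side representation is NSData.
+ (Class)transformedValueClass {
    return [NSData class];
}

// We can convert in both directions.
+ (BOOL)allowsReverseTransformation {
    return YES;
}

// UIImage -> NSData: what gets written to the persistent store.
- (id)transformedValue:(id)value {
    return UIImagePNGRepresentation(value);
}

// NSData -> UIImage: what comes back out when the attribute is read.
- (id)reverseTransformedValue:(id)value {
    return [UIImage imageWithData:value];
}

@end
```

With this in place you assign a `UIImage` directly to the transformable attribute and Core Data invokes the transformer for you.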
Edit for completeness: as others have pointed out, too large a data blob will cause performance issues. As noted by @Jesse, iOS 5 has an optimization whereby, if the blob is too large, Core Data will store it outside of the persistent store. If you have to target pre-iOS 5 and the image is too large, then you should save the file somewhere in the sandbox and store its URL in the Core Data store. A good discussion in the Apple Dev Forums is here and covers the limits of storing data blobs in Core Data.
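The sandbox fallback can be sketched as follows. This is an illustrative sketch only: the attribute name `imagePath`, the managed object `photo`, and the `UIImage` named `image` are assumptions, and on genuinely pre-iOS 5 targets you would generate the file name with `CFUUIDCreate` rather than `NSUUID` (iOS 6+):

```objc
#import <UIKit/UIKit.h>

// Write the image into the app's Documents directory.
NSString *documentsDir =
    [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                                         NSUserDomainMask, YES) objectAtIndex:0];
NSString *fileName = [[NSUUID UUID] UUIDString];
NSString *path = [documentsDir stringByAppendingPathComponent:fileName];

if ([UIImagePNGRepresentation(image) writeToFile:path atomically:YES]) {
    // Store only the file name, not the absolute path — the sandbox
    // container path can change between app updates.
    [photo setValue:fileName forKey:@"imagePath"];
}
```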
Good Luck