I am trying to get images from my test device's camera roll to render as thumbnails. I have successfully fetched the images from the camera roll and displayed them in a series of Image elements inside a ListView, but they take a really long time to load. I also read in the React Native docs that the Image element will pick the correct image size for the space it will render into.
This is from the docs.
iOS saves multiple sizes for the same image in your Camera Roll, it is very important to pick the one that's as close as possible for performance reasons. You wouldn't want to use the full quality 3264x2448 image as source when displaying a 200x200 thumbnail. If there's an exact match, React Native will pick it, otherwise it's going to use the first one that's at least 50% bigger in order to avoid blur when resizing from a close size. All of this is done by default so you don't have to worry about writing the tedious (and error prone) code to do it yourself. https://facebook.github.io/react-native/docs/image.html#best-camera-roll-image
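That selection rule can be sketched in plain JavaScript. This is only an illustration of the quoted behaviour, not React Native's actual implementation; the function name and the per-dimension reading of "at least 50% bigger" are assumptions:

```javascript
// Illustrative sketch of the quoted selection rule -- not RN's real code.
// Pick an exact match if one exists; otherwise take the first candidate
// that is at least 50% bigger (assumed here to mean in each dimension),
// so downscaling stays sharp.
function pickBestSize(candidates, target) {
  var exact = candidates.find(function (c) {
    return c.width === target.width && c.height === target.height;
  });
  if (exact) return exact;
  return candidates.find(function (c) {
    return c.width >= target.width * 1.5 && c.height >= target.height * 1.5;
  });
}

var sizes = [
  { width: 150, height: 150 },
  { width: 320, height: 320 },
  { width: 3264, height: 2448 }
];
console.log(pickBestSize(sizes, { width: 200, height: 200 }));
// -> { width: 320, height: 320 }, not the full 3264x2448 asset
```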
The code I'm using to read the images is super simple.
CameraRoll.getPhotos({
  first: 21,
  assetType: 'Photos'
}, (data) => {
  console.log(data);
  var images = data.edges.map((asset) => {
    return {
      uri: asset.node.image.uri
    };
  });
  this.setState({
    images: this.state.images.cloneWithRows(images)
  });
}, () => {
  this.setState({
    retrievePhotoError: messages.errors.retrievePhotos
  });
});
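For reference, the data.edges result has roughly this shape. This is a minimal mock containing only the fields the snippet reads; the asset URIs are made-up examples:

```javascript
// Minimal mock of the getPhotos result, with only the fields used above.
// The asset-library URIs are hypothetical examples.
var data = {
  edges: [
    { node: { image: { uri: 'assets-library://asset/asset.JPG?id=AAA&ext=JPG' } } },
    { node: { image: { uri: 'assets-library://asset/asset.JPG?id=BBB&ext=JPG' } } }
  ]
};

// Same mapping as in the snippet above: keep just the uri per asset.
var images = data.edges.map(function (asset) {
  return { uri: asset.node.image.uri };
});
console.log(images.length); // -> 2
```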
And then to render it I have these functions.
renderImage(image) {
  return <Image resizeMode="cover" source={{uri: image.uri}} style={[{
    height: imageDimensions, // imageDimensions == 93.5
    width: imageDimensions
  }, componentStyles.thumbnails]}/>;
},

render() {
  return (
    <ListView
      automaticallyAdjustContentInsets={false}
      contentContainerStyle={componentStyles.row}
      dataSource={this.state.images}
      renderRow={this.renderImage}
    />
  );
}
What am I missing here? I'm going crazy!!!
OK, it is possible, but you'll have to get your hands into the Objective-C part of React Native.
You can check this.
You'll have to modify the RCTCameraRollManager.m file.
You'll have to add these lines:
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
NSURL *url = [[NSURL alloc] initWithString:uri];
[library assetForURL:url resultBlock:^(ALAsset *asset) {
  // Grab the asset's built-in thumbnail as a CGImageRef bitmap.
  CGImageRef imageRef = [asset thumbnail];
  CGSize dimensions = [UIImage imageWithCGImage:imageRef].size;
  // Create a JPEG representation (low quality) from the CGImageRef.
  NSData *imageData = UIImageJPEGRepresentation([UIImage imageWithCGImage:imageRef], 0.1);

before the [assets addObject:@{...}] call, and add:

} failureBlock:^(NSError *error) {
  NSLog(@"that didn't work %@", error);
}];

after the [assets addObject:@{...}] call.
You can also add a @"base64": base64Encoded entry to the dictionary passed to [assets addObject:@{...}].
Check the link: it has the "new file" that gives you the thumbnail (125 x 125).
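On the JavaScript side, a base64 field returned by the patched native module could then be displayed through a data URI. A small sketch (the base64 field name and this helper are assumptions based on the patch above, not part of the stock CameraRoll API):

```javascript
// Hypothetical helper: turn a base64-encoded JPEG returned by the
// patched native module into a data URI usable as an Image source.
function toJpegDataUri(base64) {
  return 'data:image/jpeg;base64,' + base64;
}

// Usage in a component would look something like:
// <Image source={{uri: toJpegDataUri(asset.node.image.base64)}}/>
console.log(toJpegDataUri('abc123'));
```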