
NSURLSession and stream upload in background

I'm having some problems using NSURLSession to upload photos from the Asset Library to a server.

First of all, NSURLSession doesn't support streamed uploads in a background session. I got an exception when I tried to use one:

@property (nonatomic, strong) NSURLSession *uploadSession;

...

_uploadSession = [NSURLSession sessionWithConfiguration:[NSURLSessionConfiguration
                backgroundSessionConfiguration:kUploadBackgroundURLSessionIdentifier] delegate:self delegateQueue:nil];

...

NSURLSessionUploadTask *task = [self.uploadSession uploadTaskWithStreamedRequest:URLRequest];

This is the exception:

Terminating app due to uncaught exception 'NSGenericException', reason: 'Upload tasks in background sessions must be from a file'

That's really strange, because Apple's documentation doesn't mention anywhere that background sessions only accept uploadTaskWithRequest:fromFile:. What if I want to upload a really huge video file from the Asset Library? Should I save it to my tmp directory first?

It looks like the only option is to use uploadTaskWithRequest:fromFile: anyway, right? But then how does the server know which part of the file is currently being uploaded if the upload was interrupted and then resumed in the background?

Do I have to manage that myself? Previously I set a Content-Range header on the URL request when I wanted to resume an upload of a file that had already been partially sent. Now I can't do that, because the URL request has to be created before the upload task, so it seems NSURLSession is supposed to handle that automatically for me?
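For context, this is roughly how I built the resumable request before; the URL, offsets and sizes below are just placeholder values to illustrate the approach:

// example values only: bytes the server already has, the chunk size, and the total file size
unsigned long long resumeOffset = 1048576;
unsigned long long chunkLength  = 524288;
unsigned long long fileTotal    = 4194304;

NSMutableURLRequest *URLRequest =
    [NSMutableURLRequest requestWithURL:[NSURL URLWithString:@"https://example.com/upload"]];
URLRequest.HTTPMethod = @"PUT";

// tell the server which byte range of the file this request body contains
[URLRequest setValue:[NSString stringWithFormat:@"bytes %llu-%llu/%llu",
                      resumeOffset,
                      resumeOffset + chunkLength - 1,
                      fileTotal]
  forHTTPHeaderField:@"Content-Range"];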

Has anyone done something like this already? Thanks.

asked Nov 14 '13 by gN0Me

1 Answer

Convert the asset to NSData, then write it to a file in your app's folder.

#import <AssetsLibrary/AssetsLibrary.h>

ALAsset *asset = [cameraRollUploadImages objectAtIndex:startCount];
ALAssetRepresentation *representation = [asset defaultRepresentation];

// create a buffer large enough to hold the asset's raw data
uint8_t *buffer = (uint8_t *)malloc((size_t)representation.size);

// copy the data from the asset into the buffer
NSUInteger length = [representation getBytes:buffer
                                  fromOffset:0
                                      length:(NSUInteger)representation.size
                                       error:nil];

// wrap the buffer in an NSData object; the buffer is freed when the data is released
NSData *image = [[NSData alloc] initWithBytesNoCopy:buffer
                                             length:length
                                       freeWhenDone:YES];
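Then write the data to a file in your app's sandbox and create the upload task from that file, since background sessions only accept file-based uploads. A minimal sketch, reusing the uploadSession and URLRequest from the question; the temporary file name is just an example:

// write the image data to a temporary file so the background session can upload it
NSString *filePath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"upload.tmp"];
[image writeToFile:filePath atomically:YES];

// background sessions require uploadTaskWithRequest:fromFile:
NSURLSessionUploadTask *task =
    [self.uploadSession uploadTaskWithRequest:URLRequest
                                     fromFile:[NSURL fileURLWithPath:filePath]];
[task resume];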
answered Oct 21 '22 by Mustaque Ahmed