I am trying to coax AVFoundation into reading from a custom URL. The custom URL stuff works. The code below creates an NSData object containing a movie file:
NSData* movieData = [NSData dataWithContentsOfURL:[NSURL URLWithString:@"memory://video"]];
I've set up an AVAssetResourceLoader object using the following code:
NSURL* url = [NSURL URLWithString:@"memory://video"];
AVURLAsset* asset = [[AVURLAsset alloc] initWithURL:url options:nil];
AVAssetResourceLoader* loader = [asset resourceLoader];
[loader setDelegate:self queue:mDispatchQueue];
The dispatch queue is concurrent.
I then try to extract the first frame from the movie:
AVAssetImageGenerator* imageGen = [AVAssetImageGenerator assetImageGeneratorWithAsset:asset];
CMTime time = CMTimeMakeWithSeconds(0, 600);
NSError* error = nil;
CMTime actualTime;
CGImageRef image = [imageGen copyCGImageAtTime:time
actualTime:&actualTime
error:&error];
if (error) NSLog(@"%@", error);
But when I run this bit of code I get:
2013-02-21 10:02:22.197 VideoPlayer[501:907] Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo=0x1f863090 {NSLocalizedDescription=The operation could not be completed, NSUnderlyingError=0x1e575a90 "The operation couldn’t be completed. (OSStatus error 268451843.)", NSLocalizedFailureReason=An unknown error occurred (268451843)}
The implementation of the delegate method is:
- (BOOL)resourceLoader:(AVAssetResourceLoader *)resourceLoader shouldWaitForLoadingOfRequestedResource:(AVAssetResourceLoadingRequest *)loadingRequest
{
    NSData* data = [NSData dataWithContentsOfURL:loadingRequest.request.URL];
    [loadingRequest finishLoadingWithResponse:nil data:data redirect:nil];
    return YES;
}
Now, my question is: am I implementing the method correctly? Does anyone know if what I am doing is correct?
Thanks.
EDIT: The movie I am fetching in its entirety is a single frame movie.
Use NSURLComponents with a custom scheme (here "enc") so that the AVAssetResourceLoaderDelegate method gets called:
let urlComponents = NSURLComponents(url: video_url, resolvingAgainstBaseURL: false)
urlComponents?.scheme = "enc"
let avAsset = AVURLAsset(url: (urlComponents?.url)!, options: ["AVURLAssetHTTPHeaderFieldsKey": headers])
avAsset.resourceLoader.setDelegate(self, queue: DispatchQueue(label: "AVARLDelegateDemo loader"))
I have implemented a working version of this method. It took me a while to figure out, but the resulting app works, which suggests the code is okay.
My app includes a media file which I did not want to ship in the package unencrypted. I wanted to decrypt the file dynamically, a block at a time.
The method has to respond both to the content information request (telling the player what it is loading) and to data requests (giving the player some data). The first time the method is called, there is always a content information request. Then there will be a series of data requests.
The player is greedy. It always asks for the entire file. You are not obliged to provide that. It asks for the whole cake. You can give it one slice.
I hand the media player blocks of data, usually 1 MB at a time, with a special case to handle the smaller final block. Blocks are usually requested in sequence, but you need to be able to cope with out-of-sequence requests too.
- (BOOL)resourceLoader:(AVAssetResourceLoader *)resourceLoader shouldWaitForLoadingOfRequestedResource:(AVAssetResourceLoadingRequest *)loadingRequest
{
    NSURLRequest* request = loadingRequest.request;
    AVAssetResourceLoadingDataRequest* dataRequest = loadingRequest.dataRequest;
    AVAssetResourceLoadingContentInformationRequest* contentRequest = loadingRequest.contentInformationRequest;

    // handle the content information request
    if (contentRequest)
    {
        NSError* attributesError;
        NSString* path = request.URL.path;
        _fileURL = request.URL;
        if (_fileHandle == nil)
        {
            _fileHandle = [NSFileHandle fileHandleForReadingAtPath:path];
        }
        // fire up the decryption here..
        // for example ...
        if (_decryptedData == nil)
        {
            _cacheStart = 1000000000;
            _decryptedData = [NSMutableData dataWithLength:BUFFER_LENGTH+16];
            CCCryptorCreate(kCCDecrypt, kCCAlgorithmAES128, kCCOptionPKCS7Padding, [sharedKey cStringUsingEncoding:NSISOLatin1StringEncoding], kCCKeySizeAES256, NULL, &cryptoRef);
        }
        NSDictionary *fileAttributes = [[NSFileManager defaultManager] attributesOfItemAtPath:path error:&attributesError];
        NSNumber *fileSizeNumber = [fileAttributes objectForKey:NSFileSize];
        _fileSize = [fileSizeNumber longLongValue];

        // provide information about the content
        // note: contentType takes a UTI (e.g. public.mp3), not a bare MIME type
        _mimeType = @"public.mp3";
        contentRequest.contentType = _mimeType;
        contentRequest.contentLength = _fileSize;
        contentRequest.byteRangeAccessSupported = YES;
    }

    // handle the data request
    if (dataRequest)
    {
        // decrypt a block of data (can be any size you want)
        // code omitted
        NSData* decodedData = [NSData dataWithBytes:outBuffer length:reducedLen];
        [dataRequest respondWithData:decodedData];
        [loadingRequest finishLoading];
    }
    return YES;
}
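The block arithmetic described above (fixed-size blocks, a shorter final block, and serving whichever block covers an out-of-sequence offset) can be sketched in plain C. `block_for_offset` and `BLOCK_SIZE` are illustrative names for this sketch, not part of AVFoundation:

```c
#include <stdint.h>

#define BLOCK_SIZE ((int64_t)1024 * 1024) /* 1 MB blocks, as in the answer above */

/* Given the byte offset the player asked for and the total file size,
   compute the start and length of the block containing that offset.
   The final block is usually shorter than BLOCK_SIZE. */
static void block_for_offset(int64_t requestedOffset, int64_t fileSize,
                             int64_t *blockStart, int64_t *blockLength)
{
    int64_t index = requestedOffset / BLOCK_SIZE;   /* which block covers the offset */
    *blockStart = index * BLOCK_SIZE;
    int64_t remaining = fileSize - *blockStart;
    *blockLength = remaining < BLOCK_SIZE ? remaining : BLOCK_SIZE;
}
```

In the delegate you would read `loadingRequest.dataRequest.requestedOffset`, decrypt the block that `block_for_offset` identifies, and hand it over with `respondWithData:` followed by `finishLoading`.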
I just wasted two hours trying to do something very similar.
It turns out it only works on a device and doesn't work on the iOS Simulator!
I guess AVFoundation in the simulator is somehow "bridged" to the host Mac's AVFoundation. Unfortunately this API isn't available on OS X 10.8 (according to some commits on WebCore it will be available in OS X 10.9), so for now it doesn't work in the simulator.