I'm implementing an AVAssetResourceLoaderDelegate, and I'm having a bit of trouble getting it to behave correctly. My goal is to intercept any requests made by the AVPlayer, make the request myself, write the data out to a file, then respond to the AVPlayer with the file data.
The issue I'm seeing: I can intercept the first request, which asks for only two bytes, and respond to it. After that, no more requests hit my AVAssetResourceLoaderDelegate.
When I intercept the very first AVAssetResourceLoadingRequest from the AVPlayer, it looks like this:
<AVAssetResourceLoadingRequest: 0x17ff9e40,
URL request = <NSMutableURLRequest: 0x17f445a0> { URL: fakeHttp://blah.com/blah/blah.mp3 },
request ID = 1,
content information request = <AVAssetResourceLoadingContentInformationRequest: 0x17ff9f30,
content type = "(null)",
content length = 0,
byte range access supported = NO,
disk caching permitted = NO>,
data request = <AVAssetResourceLoadingDataRequest: 0x17e0d220,
requested offset = 0,
requested length = 2,
current offset = 0>>
As you can see, this is only a request for the first two bytes of data. I'm taking the fakeHttp scheme in the URL, replacing it with plain http, and making the request myself.
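For reference, the scheme swap itself can be sketched like this (a minimal helper, assuming the asset was created with a fakeHttp:// URL as above; the method name is hypothetical):

    // Recover the real URL from the intercepted loading request by
    // replacing the custom "fakeHttp" scheme with plain "http".
    - (NSURL *)realURLForLoadingRequest:(AVAssetResourceLoadingRequest *)loadingRequest {
        NSURLComponents *components =
            [NSURLComponents componentsWithURL:loadingRequest.request.URL
                       resolvingAgainstBaseURL:NO];
        components.scheme = @"http";
        return components.URL;
    }

The custom scheme is only there so AVPlayer can't load the URL itself and is forced through the resource loader delegate.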
Then, here's how I'm responding to the request once I have some data:
- (BOOL)resourceLoader:(AVAssetResourceLoader *)resourceLoader shouldWaitForLoadingOfRequestedResource:(AVAssetResourceLoadingRequest *)loadingRequest {
    // Make the remote URL request here if needed, omitted

    // Fill in the content information from the NSURLResponse we received
    CFStringRef contentType = UTTypeCreatePreferredIdentifierForTag(kUTTagClassMIMEType, (__bridge CFStringRef)[self.response MIMEType], NULL);
    loadingRequest.contentInformationRequest.byteRangeAccessSupported = YES;
    loadingRequest.contentInformationRequest.contentType = CFBridgingRelease(contentType);
    loadingRequest.contentInformationRequest.contentLength = [self.response expectedContentLength];

    // Where responseData is the appropriate NSData to respond with
    [loadingRequest.dataRequest respondWithData:responseData];
    [loadingRequest finishLoading];
    return YES;
}
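For later requests (after the initial two-byte probe), the dataRequest's offsets tell you exactly which bytes to serve. A hedged sketch, assuming a hypothetical `fileData` buffer that holds the full downloaded file:

    // Serve exactly the byte range the data request asks for.
    // currentOffset advances as respondWithData: is called, so the
    // remaining length is measured from there, not from requestedOffset.
    AVAssetResourceLoadingDataRequest *dataRequest = loadingRequest.dataRequest;
    NSUInteger start = (NSUInteger)dataRequest.currentOffset;
    NSUInteger remaining = (NSUInteger)(dataRequest.requestedOffset
                                        + dataRequest.requestedLength
                                        - dataRequest.currentOffset);
    NSUInteger length = MIN(remaining, fileData.length - start);
    [dataRequest respondWithData:[fileData subdataWithRange:NSMakeRange(start, length)]];
    [loadingRequest finishLoading];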
I've stepped through this and verified that everything in the contentInformationRequest is filled in correctly, and that the data I'm sending is an NSData of the appropriate length (in this case, two bytes).
No more requests get sent to my delegate, and the player does not play (presumably because it only has two bytes of data, and hasn't requested any more).
Does anyone have experience with this who can point me toward where I may be going wrong? I'm running iOS 7.
Edit: Here's what my completed request looks like after I call finishLoading:
<AVAssetResourceLoadingRequest: 0x16785680,
URL request = <NSMutableURLRequest: 0x166f4e90> { URL: fakeHttp://blah.com/blah/blah.mp3 },
request ID = 1,
content information request = <AVAssetResourceLoadingContentInformationRequest: 0x1788ee20,
content type = "public.mp3",
content length = 7695463,
byte range access supported = YES,
disk caching permitted = NO>,
data request = <AVAssetResourceLoadingDataRequest: 0x1788ee60,
requested offset = 0,
requested length = 2,
current offset = 2>>
Circling back to answer my own question in case anyone was curious.
The issue boiled down to threading. Though it's not explicitly documented anywhere, AVAssetResourceLoaderDelegate does some weird stuff with threads.
Essentially, my issue was that I was creating the AVPlayerItem and AVAssetResourceLoaderDelegate on the main thread, but responding to delegate calls on a background thread (since they were the result of network calls). Apparently, AVAssetResourceLoader just completely ignores responses coming in on a different thread than it was expecting. I solved this by just doing everything, including AVPlayerItem creation, on the same thread.
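One way to sketch that fix (an equivalent approach, not necessarily the author's exact code: pin the loader's delegate callbacks to a dedicated serial queue via -[AVAssetResourceLoader setDelegate:queue:], and hop back onto that same queue when a network response arrives; the queue label and fakeURL are placeholders):

    // Create one serial queue and use it both for delegate callbacks
    // and for delivering data back to the loading request.
    dispatch_queue_t loaderQueue = dispatch_queue_create("com.example.resourceloader", NULL);

    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:fakeURL options:nil];
    [asset.resourceLoader setDelegate:self queue:loaderQueue];
    AVPlayerItem *item = [AVPlayerItem playerItemWithAsset:asset];

    // Later, when a network response arrives on some other thread:
    dispatch_async(loaderQueue, ^{
        [loadingRequest.dataRequest respondWithData:responseData];
        [loadingRequest finishLoading];
    });

The key point is that respondWithData: and finishLoading are called on the same queue the resource loader was told to use, rather than on whatever thread the networking callback happened to fire on.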