 

AVPlayer loading AVAsset from file that is appended simultaneously by external source (for macOS and iOS)

I have a question concerning the use of AVFoundation’s AVPlayer (probably applicable to both iOS and macOS). I am trying to play back audio (uncompressed WAV) data that arrives over a channel other than standard HTTP Live Streaming.

The case:
Audio data packets come compressed in a channel along with other data the app needs to work with. For example, video and audio come in the same channel and get separated by a header.
After filtering, I get the audio data and decompress it to WAV format (it does not contain headers at this stage).
Once the data packets are ready (9600 bytes each for 24 kHz, stereo, 16-bit audio), they are passed to an instance of AVPlayer (AVAudioPlayer, according to Apple, is not suitable for streaming audio).

Given that AVPlayer (Item or Asset) does not load from memory (there is no initWithData:(NSData)) and requires either an HTTP Live Stream URL or a file URL, I create a file on disk (on either macOS or iOS), add the WAV headers, and append the uncompressed data there.
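The header in question is the standard 44-byte canonical RIFF/WAVE header for 16-bit PCM. A sketch in C of how it can be laid out (field offsets per the RIFF spec; the placeholder data size for an open-ended stream is my assumption, not something AVPlayer is documented to accept):

```c
/* Build a canonical 44-byte RIFF/WAVE header for 16-bit PCM.
   Multi-byte fields are little-endian per the RIFF spec.
   For an open-ended stream, data_size can be a large placeholder. */
#include <stdint.h>
#include <string.h>

static void put_le32(uint8_t *p, uint32_t v) {
    p[0] = (uint8_t)v; p[1] = (uint8_t)(v >> 8);
    p[2] = (uint8_t)(v >> 16); p[3] = (uint8_t)(v >> 24);
}

static void put_le16(uint8_t *p, uint16_t v) {
    p[0] = (uint8_t)v; p[1] = (uint8_t)(v >> 8);
}

static void make_wav_header(uint8_t h[44], uint32_t sample_rate,
                            uint16_t channels, uint32_t data_size) {
    const uint16_t bits = 16;
    const uint16_t block_align = (uint16_t)(channels * (bits / 8));

    memcpy(h, "RIFF", 4);
    put_le32(h + 4, 36 + data_size);             /* RIFF chunk size */
    memcpy(h + 8, "WAVEfmt ", 8);                /* "WAVE" + "fmt " ids */
    put_le32(h + 16, 16);                        /* fmt chunk size */
    put_le16(h + 20, 1);                         /* audio format: PCM */
    put_le16(h + 22, channels);
    put_le32(h + 24, sample_rate);
    put_le32(h + 28, sample_rate * block_align); /* byte rate */
    put_le16(h + 32, block_align);
    put_le16(h + 34, bits);
    memcpy(h + 36, "data", 4);
    put_le32(h + 40, data_size);                 /* PCM payload size */
}
```

The 44 header bytes are written once at file creation, and every decoded packet is appended after them.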

Back on the AVPlayer, I create the following:

AVURLAsset *audioAsset = [[AVURLAsset alloc] initWithURL:[NSURL fileURLWithPath:tempAudioFile] options:nil];
AVPlayerItem *audioItem = [[AVPlayerItem alloc] initWithAsset:audioAsset];
AVPlayer *audioPlayer = [[AVPlayer alloc] initWithPlayerItem:audioItem];

I add KVO observers and then try to start playback:

[audioPlayer play];

The result is that the audio plays for 1-2 seconds and then stops (with AVPlayerItemDidPlayToEndTimeNotification, to be exact) while data continue to be appended to the file. Since the whole thing runs in a loop, [audioPlayer play] starts and pauses (rate == 0) multiple times.

The whole concept in a simplified form:

-(void)PlayAudioWithData:(NSData *)data //data in encoded format
{
    NSData *decodedSound = [AudioDecoder DecodeData:data]; //decodes the data from the compressed format (Opus) to WAV
    [Player CreateTemporaryFiles]; //This creates the temporary file by appending the header and waiting for input.

    [Player SendDataToPlayer:decodedSound]; //this sends the decoded data to the Player to be stored to file. See below for appending.

    BOOL prepared = [Player isPrepared]; //a check if AVPlayer, Item and Asset are initialized
    if (!prepared) [Player Prepare]; //creates the objects like above
    BOOL playing = [Player isAudioPlaying]; //a check done on the AVPlayer if rate == 1
    if (!playing) [Player startPlay]; //this is actually [audioPlayer play]; on AVPlayer Instance
}

-(void)SendDataToPlayer:(NSData *)data
{
    //Two different methods here. First with NSFileHandle — not so sure about this though as it definitely locks the file.
    //Initializations and deallocations happen elsewhere, just condensing code to give you an idea
    NSFileHandle *audioFile = [NSFileHandle fileHandleForWritingAtPath:_tempAudioFile]; //happens elsewhere
    [audioFile seekToEndOfFile];
    [audioFile writeData:data];
    [audioFile closeFile]; //happens elsewhere

    //Second method is 
    NSOutputStream *audioFileStream = [NSOutputStream outputStreamWithURL:[NSURL fileURLWithPath:_tempStreamFile] append:YES];
    [audioFileStream open];
    [audioFileStream write:[data bytes] maxLength:data.length];
    [audioFileStream close];
}

Both NSFileHandle and NSOutputStream produce fully working WAV files that QuickTime, iTunes, VLC, etc. can play. Also, if I bypass [Player SendDataToPlayer:decodedSound] and have the temp audio file preloaded with a standard WAV, it also plays normally.

So far, there are two sides working: a) I have the audio data decompressed and ready to play, and b) I save the data properly.

What I am trying to do is send, write, and read in a row. This makes me think that saving the data to the file takes exclusive access to the file resource and does not allow AVPlayer to continue playback.

Anyone having an idea on how to keep the file available to both NSFileHandle/NSOutputStream and AVPlayer?

Or even better… Have AVPlayer initWithData? (hehe…)

Any help is much appreciated! Thanks in advance.

asked Oct 04 '16 by Pericles

1 Answer

You can use AVAssetResourceLoader to pipe your own data and metadata into an AVAsset, which you can then play with AVPlayer, in effect making an [[AVPlayer alloc] initWithData:...]:

- (AVPlayer *)playerWithWavData:(NSData *)wavData {
    self.strongDelegateReference = [[NSDataAssetResourceLoaderDelegate alloc] initWithData:wavData contentType:AVFileTypeWAVE];

    NSURL *url = [NSURL URLWithString:@"ns-data-scheme://"];
    AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:url options:nil];

    // or some other queue != main queue
    [asset.resourceLoader setDelegate:self.strongDelegateReference queue:dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0)];

    AVPlayerItem *item = [[AVPlayerItem alloc] initWithAsset:asset];
    return [[AVPlayer alloc] initWithPlayerItem:item];
}

which you can use like so:

[self setupAudioSession];

NSURL *wavUrl = [[NSBundle mainBundle] URLForResource:@"foo" withExtension:@"wav"];
NSData *wavData = [NSData dataWithContentsOfURL:wavUrl];

self.player = [self playerWithWavData:wavData];

[self.player play];

The thing is, AVAssetResourceLoader is very powerful (unless you want to use AirPlay), so you can probably do better than feeding the audio data to the AVPlayer in one lump - you could stream it into the AVAssetResourceLoader delegate as it becomes available.

Here's the simple "one lump" AVAssetResourceLoader delegate. To adapt it for streaming, it should be enough to set a contentLength larger than the amount of data you currently have.

Header file:

#import <Foundation/Foundation.h>
#import <AVFoundation/AVFoundation.h>

@interface NSDataAssetResourceLoaderDelegate : NSObject <AVAssetResourceLoaderDelegate>

- (instancetype)initWithData:(NSData *)data contentType:(NSString *)contentType;

@end

Implementation file:

@interface NSDataAssetResourceLoaderDelegate()

@property (nonatomic) NSData *data;
@property (nonatomic) NSString *contentType;

@end

@implementation NSDataAssetResourceLoaderDelegate

- (instancetype)initWithData:(NSData *)data contentType:(NSString *)contentType {
    if (self = [super init]) {
        self.data = data;
        self.contentType = contentType;
    }
    return self;
}

- (BOOL)resourceLoader:(AVAssetResourceLoader *)resourceLoader shouldWaitForLoadingOfRequestedResource:(AVAssetResourceLoadingRequest *)loadingRequest {
    AVAssetResourceLoadingContentInformationRequest* contentRequest = loadingRequest.contentInformationRequest;

    // TODO: check that loadingRequest.request is actually our custom scheme        

    if (contentRequest) {
        contentRequest.contentType = self.contentType;
        contentRequest.contentLength = self.data.length;
        contentRequest.byteRangeAccessSupported = YES;
    }

    AVAssetResourceLoadingDataRequest* dataRequest = loadingRequest.dataRequest;

    if (dataRequest) {
        // TODO: handle requestsAllDataToEndOfResource
        NSRange range = NSMakeRange((NSUInteger)dataRequest.requestedOffset, (NSUInteger)dataRequest.requestedLength);
        [dataRequest respondWithData:[self.data subdataWithRange:range]];
        [loadingRequest finishLoading];
    }

    return YES;
}

@end
answered Oct 11 '22 by Rhythmic Fistman