I want to generate a sound wave programmatically and play it with AVAudioPlayer. I already have code that encodes my waveform as linear PCM: 44100 Hz, mono, 8 bits per sample.
What I am not clear on is what kind of envelope (container or header) I need to wrap around this buffer so that AVAudioPlayer recognizes it as PCM.
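For concreteness, the kind of buffer described might be generated along these lines (a minimal sketch; the 440 Hz tone, one-second length, and the `pcm` name are illustrative assumptions, not the actual code):

```swift
import Foundation

let sampleRate = 44100.0
let frequency = 440.0                          // assumed test tone
var pcm = Data(capacity: Int(sampleRate))      // one second of audio
for n in 0..<Int(sampleRate) {
    let sample = sin(2.0 * .pi * frequency * Double(n) / sampleRate)
    pcm.append(UInt8(128.0 + 127.0 * sample))  // 8-bit PCM is unsigned, centered on 128
}
```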
PCM is just a digital representation of an analog audio signal. Unfortunately, it doesn't carry any metadata about the audio - channel count, bit depth, or sample rate - all of which are necessary to interpret the PCM data correctly. You might assume AVAudioPlayer would accept this PCM data wrapped in an NSData object as long as you could set those properties manually on the AVAudioPlayer instance. Unfortunately, those properties are read-only, so even though the documentation says AVAudioPlayer can handle anything Core Audio can handle, it has no way to handle raw LPCM data.
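To illustrate (a sketch, not code from this answer): handing the headerless bytes straight to `AVAudioPlayer(data:)` throws, and the format properties cannot be assigned afterwards:

```swift
import AVFoundation

let rawPCM = Data(repeating: 0x80, count: 44100)   // stand-in for headerless 8-bit samples
do {
    let player = try AVAudioPlayer(data: rawPCM)   // throws: no recognizable container
    _ = player.numberOfChannels                    // get-only; cannot be set by the caller
    player.play()
} catch {
    print("AVAudioPlayer rejected raw LPCM: \(error)")
}
```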
As stated by zoul, I would imagine the easiest way to go about this is prepending a WAV header, since the header informs AVAudioPlayer of exactly those necessary parameters. It's 44 bytes, is easily mocked up, and is defined nicely - I used the same definition given above to implement WAV header encoding and decoding. And it's simply prepended to your unmodified LPCM data.
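A minimal sketch of building that canonical 44-byte header in Swift, assuming the question's format (44100 Hz, mono, 8 bits per sample); the function name `wavData(fromLinearPCM:)` is my own, not from the original answer:

```swift
import AVFoundation

func wavData(fromLinearPCM pcm: Data,
             sampleRate: UInt32 = 44100,
             channels: UInt16 = 1,
             bitsPerSample: UInt16 = 8) -> Data {
    let byteRate = sampleRate * UInt32(channels) * UInt32(bitsPerSample / 8)
    let blockAlign = channels * (bitsPerSample / 8)
    let dataSize = UInt32(pcm.count)

    var header = Data()
    // WAV fields are little-endian; write each integer accordingly.
    func append<T: FixedWidthInteger>(_ value: T) {
        withUnsafeBytes(of: value.littleEndian) { header.append(contentsOf: $0) }
    }

    header.append(contentsOf: Array("RIFF".utf8))
    append(UInt32(36 + dataSize))            // total RIFF chunk size minus 8 bytes
    header.append(contentsOf: Array("WAVE".utf8))
    header.append(contentsOf: Array("fmt ".utf8))
    append(UInt32(16))                       // fmt chunk size for plain PCM
    append(UInt16(1))                        // audio format 1 = linear PCM
    append(channels)
    append(sampleRate)
    append(byteRate)
    append(blockAlign)
    append(bitsPerSample)
    header.append(contentsOf: Array("data".utf8))
    append(dataSize)

    return header + pcm                      // header is simply prepended
}
```

With that in place, playback might be as simple as:

```swift
let player = try AVAudioPlayer(data: wavData(fromLinearPCM: pcm))
player.prepareToPlay()
player.play()
```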
Maybe adding a WAV header would help?