 

Web Audio API - difference between PeriodicWave and looping AudioBufferSourceNode to achieve a wavetable?

I'm using two techniques to create a wavetable synthesizer sound:

1 - Loop an AudioBufferSourceNode which contains a single waveform cycle

// Load a single-cycle wave file into audioData, then:
  audioContext.decodeAudioData(audioData, function (buffer) {
    var source = audioContext.createBufferSource();
    source.buffer = buffer;
    source.loop = true;
    source.connect(audioContext.destination);
    source.start();
  });

2 - Create a PeriodicWave and provide it with Fourier coefficients (using coefficients found on the web, e.g. (0, 1) for a sine wave, (0, .1, .4, .6, ...) for more complex waves).

 var wave = audioContext.createPeriodicWave(real, imag);
 var oscillator = audioContext.createOscillator();
 oscillator.setPeriodicWave(wave);
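
For illustration, here's a minimal self-contained sketch (assuming the same audioContext as above; the harmonic count is an arbitrary choice) that builds a sawtooth from its Fourier series, where the nth sine coefficient is 1/n:

  // Sawtooth Fourier series: imag[n] = 1/n. createPeriodicWave
  // normalizes by default, so the overall scale doesn't matter.
  var numHarmonics = 32;
  var real = new Float32Array(numHarmonics); // cosine terms, all zero
  var imag = new Float32Array(numHarmonics); // sine terms
  for (var n = 1; n < numHarmonics; n++) {
    imag[n] = 1 / n;
  }
  var oscillator = audioContext.createOscillator();
  oscillator.setPeriodicWave(audioContext.createPeriodicWave(real, imag));
  oscillator.frequency.value = 220; // A3
  oscillator.connect(audioContext.destination);
  oscillator.start();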

What are the pros and cons of using one technique over the other? Do these techniques yield very different audible results?

I have a demo here that plays both approaches: http://davedave.us/wavetable-synth/

My code is a work in progress, but it's here: https://github.com/looshi/wavetable-synth

asked by looshi

1 Answer

If the sample rate of the audio file you're loading and the sample rate of the audio context are the same, then there isn't really much difference between the two. The main difference I can think of is that the buffer approach can produce glitches if the first and last samples of the loop differ significantly. This won't happen with the periodic wave unless you make it that way.
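
If you go the buffer route, one simple guard against that glitch (a sketch, assuming buffer is the decoded single-cycle AudioBuffer from the question; smoothLoopSeam is a hypothetical helper) is to ramp the tail of each channel toward its first sample so the loop point lands on a matching value:

  // Hypothetical helper: ramp the last fadeLength samples of each
  // channel toward the channel's first sample, so sample N-1 flows
  // into sample 0 without a jump when the buffer loops.
  function smoothLoopSeam(buffer, fadeLength) {
    for (var ch = 0; ch < buffer.numberOfChannels; ch++) {
      var data = buffer.getChannelData(ch);
      var n = data.length;
      for (var i = 0; i < fadeLength; i++) {
        var t = (i + 1) / fadeLength; // ramps up to 1 at the last sample
        data[n - fadeLength + i] = data[n - fadeLength + i] * (1 - t) + data[0] * t;
      }
    }
  }
  smoothLoopSeam(buffer, 64); // 64-sample fade is an arbitrary choice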

If you don't change the frequency, the audio buffer might require somewhat less CPU to produce the audio.

For high fundamental frequencies, the periodic wave will probably sound a bit different because it is forced to be band-limited. An audio buffer doesn't have that restriction.
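
A hedged way to hear that difference (reusing buffer, real, and imag from the sketches above): push both approaches to a high fundamental. The periodic wave drops harmonics above Nyquist internally; the buffer, pitched up via playbackRate, folds them back into the audible range as aliasing:

  var freq = 4000; // high fundamental: harmonics pass Nyquist quickly

  // Periodic wave: band-limited by the implementation.
  var osc = audioContext.createOscillator();
  osc.setPeriodicWave(audioContext.createPeriodicWave(real, imag));
  osc.frequency.value = freq;

  // Buffer loop: the native pitch of one cycle is sampleRate / length,
  // and playbackRate shifts every harmonic up, aliased or not.
  var src = audioContext.createBufferSource();
  src.buffer = buffer;
  src.loop = true;
  src.playbackRate.value = freq / (audioContext.sampleRate / buffer.length);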

answered by Raymond Toy