 

How to use createPeriodicWave instead of createScriptProcessor and getChannelData

I found some libraries which do instrument synthesizing with the Web Audio API.

One of them (Band.js) uses createOscillator() in combination with the oscillator type (sine, square, ...); see source.

But it sounds too synthetic (example to listen). I want something which sounds more realistic, but I don't want to use any precompiled soundfonts, so it should be synthesized. It should also work on mobile devices.

So I found another library (musical.js) which uses the first 32 harmonics as a matrix in combination with createPeriodicWave; see source. The timbre is awesome, you can listen to it.
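
For context, the basic pattern looks roughly like this (the coefficient values here are made up for illustration, not the real piano harmonics):

    // Build a PeriodicWave from harmonic coefficients and play it.
    var ctx = new (window.AudioContext || window.webkitAudioContext)();

    // Index 0 is the DC offset and is ignored; indices 1..N are harmonics.
    var real = new Float32Array([0, 0, 0, 0, 0]);        // cosine terms
    var imag = new Float32Array([0, 1, 0.5, 0.25, 0.1]); // sine terms

    var wave = ctx.createPeriodicWave(real, imag);

    var osc = ctx.createOscillator();
    osc.setPeriodicWave(wave);
    osc.frequency.value = 440; // A4
    osc.connect(ctx.destination);
    osc.start();
    osc.stop(ctx.currentTime + 1);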

As noted in a comment in the source code, the harmonics were taken from this piano sample file. There are many more sample files for other instruments. I tried replacing the harmonics, even all 2000 of them, but it always sounds like a piano.

There are also some values to adjust and interpolate the harmonics, as well as ADSR values. Maybe they are only optimized for a piano sound?

Then I found another library (guitar-synth) which has a really nice timbre for a guitar, listen to it. But this library doesn't use the createPeriodicWave API at all. Instead it uses createScriptProcessor and getChannelData in combination with some "simple" calculations, nothing like the harmonics in the other library; see source.
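
To illustrate the kind of "simple calculations" I mean: a common way to get a guitar-like pluck from plain sample math is Karplus-Strong string synthesis. I don't know whether guitar-synth uses exactly this, but the idea of generating samples in code and writing them via getChannelData is similar:

    // Compute samples in JS and write them into an AudioBuffer via
    // getChannelData. The pluck is a basic Karplus-Strong string model;
    // guitar-synth's actual math may differ.
    var ctx = new (window.AudioContext || window.webkitAudioContext)();
    var duration = 2; // seconds
    var buffer = ctx.createBuffer(1, ctx.sampleRate * duration, ctx.sampleRate);
    var data = buffer.getChannelData(0);

    var freq = 196; // G3
    var period = Math.round(ctx.sampleRate / freq);
    var delay = new Float32Array(period);

    // Excite the "string" with noise.
    for (var i = 0; i < period; i++) {
      delay[i] = Math.random() * 2 - 1;
    }

    // Circulate the delay line through a damped averaging (lowpass) filter.
    for (var n = 0; n < data.length; n++) {
      var k = n % period;
      var next = (k + 1) % period;
      data[n] = delay[k];
      delay[k] = 0.996 * 0.5 * (delay[k] + delay[next]);
    }

    var source = ctx.createBufferSource();
    source.buffer = buffer;
    source.connect(ctx.destination);
    source.start();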

So, my main question:

Can the guitar synthesizer be ported to use the createPeriodicWave API? I want to use the guitar timbre in musical.js, so that I can switch between the piano and guitar timbres.

BTW: I found another library which synthesizes instrument sounds. Here is the demo and here the source. The sound is also nice, but musical.js has a much more beautiful timbre. It looks like this one also uses something similar to getChannelData, just encoded as WAVE. And it doesn't work on my Android mobile device.

timaschew asked Dec 11 '15


1 Answer

This is not an answer, just some thoughts and notes on the topic.

The question itself is interesting to me: I play the guitar, but never had a chance (until now) to touch music in code. I read a bit of theory, played a bit with musical.js, and I feel like I am still far from the solution.

Here are some notes; I hope they are useful:

1) I put together a reduced example of code extracted from musical.js; see web-audio-test.js and web-audio.html.

The audio node setup for musical.js, if I am not mistaken, is this:

There is a "tail" part, which is common for all notes and stays permanently, and a "head" - the set of nodes created to play each note:

| ------ HEAD (for each note)------ | --- TAIL (for all notes)----------- |
|                                   |                                     |
[ |Oscillator|->|Biquad|->|Gain|-> ] [|Gain|->|Dynamics  |->|Destination| ]
  | Periodic |  |Filter|  |ADSR|              |Compressor|
  | Wave     |

Note: the first oscillator can be doubled by a second one playing the note frequency plus the timbre's detune.

So we create an oscillator (or two) + filter + ADSR gain to play each note, which adds up to a lot of audio nodes. Musical.js handles this by keeping a queue of notes and passing only a limited set of them to the Web Audio API at a time.
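
In code, this setup looks roughly like the sketch below (parameter values are placeholders, not musical.js's actual numbers):

    var ctx = new (window.AudioContext || window.webkitAudioContext)();

    // --- TAIL: created once, shared by all notes ---
    var master = ctx.createGain();
    var compressor = ctx.createDynamicsCompressor();
    master.connect(compressor);
    compressor.connect(ctx.destination);

    // --- HEAD: created per note ---
    function playNote(freq, wave, when, duration) {
      var osc = ctx.createOscillator();
      osc.setPeriodicWave(wave);
      osc.frequency.value = freq;

      var filter = ctx.createBiquadFilter();
      filter.type = 'lowpass';
      filter.frequency.value = 4 * freq; // placeholder cutoff

      var adsr = ctx.createGain();
      // minimal attack/release so the note doesn't click;
      // a full ADSR envelope would be scheduled here (see note 2)
      adsr.gain.setValueAtTime(0, when);
      adsr.gain.linearRampToValueAtTime(1, when + 0.02);
      adsr.gain.setTargetAtTime(0, when + duration - 0.1, 0.05);

      osc.connect(filter);
      filter.connect(adsr);
      adsr.connect(master);

      osc.start(when);
      osc.stop(when + duration + 0.5);
    }

    // Any PeriodicWave will do for the sketch.
    var wave = ctx.createPeriodicWave(
        new Float32Array([0, 0]), new Float32Array([0, 1]));
    playNote(440, wave, ctx.currentTime, 1);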

The guitar-synth setup looks much simpler: it is just |Script Processor| -> |Output|. The guitar sound is generated in code and fed into the ScriptProcessor node, which acts as a sound source (as if you had loaded a sample from a file). I'm not sure, but maybe musical.js could also use this approach to simplify the code.
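
The ScriptProcessor-as-source pattern, stripped down to a sketch (it just emits a sine here; guitar-synth computes its guitar waveform at this point instead):

    var ctx = new (window.AudioContext || window.webkitAudioContext)();
    // Input is ignored; the node is used purely as a sound source.
    var processor = ctx.createScriptProcessor(4096, 1, 1);
    var phase = 0;
    var freq = 440;

    processor.onaudioprocess = function (e) {
      var out = e.outputBuffer.getChannelData(0);
      for (var i = 0; i < out.length; i++) {
        out[i] = 0.3 * Math.sin(phase);
        phase += 2 * Math.PI * freq / ctx.sampleRate;
      }
    };

    processor.connect(ctx.destination);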

2) I played with different parameters in musical.js, but everything still sounds like a piano.

When I apply the bass or guitar coefficients from the wave tables examples, it sounds different, but still like a piano (especially on higher notes).

The ADSR settings don't change this "piano" sound, but I think they cannot actually turn the sound of one instrument into the sound of another anyway.
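
To illustrate why: an ADSR envelope only shapes the amplitude over time, while the harmonic content stays whatever the oscillator produces. Something like this (all timings/levels are arbitrary examples):

    // ADSR scheduled on a gain node; it controls loudness over time
    // but never touches the spectrum.
    function applyADSR(gain, t0, noteLength) {
      var attack = 0.01, decay = 0.2, sustain = 0.5, release = 0.3;
      gain.gain.setValueAtTime(0, t0);
      gain.gain.linearRampToValueAtTime(1, t0 + attack);               // attack
      gain.gain.linearRampToValueAtTime(sustain, t0 + attack + decay); // decay
      gain.gain.setValueAtTime(sustain, t0 + noteLength);              // sustain
      gain.gain.linearRampToValueAtTime(0, t0 + noteLength + release); // release
    }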

The mult and freq arrays, which are used to interpolate the harmonics in real/imag up to higher frequencies, are probably more important, but even if we keep them empty (no interpolation), the instrument still sounds like a piano.

What actually needs to be changed to tune the sound is still a puzzle for me.

I saw the issue you posted on GitHub; hopefully the musical.js author will be able to at least give some hints.

3) Useful links / tools

The Google audio samples don't include anything like what we need here. The closest is the wavetable synth example, where you can switch between different wave tables.

Firefox has a "Web Audio Editor" where you can see the graph of audio nodes; it is very convenient for learning an audio setup.

4) Practical solution.

You probably already thought of this: at the moment I would use both musical.js and guitar-synth.

A simple wrapper can unify the interfaces and provide both piano and guitar instruments.
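
Something along these lines (the backend objects and their methods are hypothetical stand-ins, to be adapted to the real musical.js and guitar-synth APIs):

    // A thin wrapper that hides which library produces the sound.
    function Instrument(backend) {
      this.backend = backend; // anything exposing play(note, duration)
    }

    Instrument.prototype.play = function (note, duration) {
      this.backend.play(note, duration);
    };

    // Hypothetical adapters; replace the bodies with real library calls.
    var piano = new Instrument({
      play: function (note, duration) {
        // e.g. delegate to musical.js here
        console.log('piano:', note, duration);
      }
    });

    var guitar = new Instrument({
      play: function (note, duration) {
        // e.g. delegate to guitar-synth here
        console.log('guitar:', note, duration);
      }
    });

    piano.play('C4', 1);
    guitar.play('E2', 1);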

Boris Serebrov answered Oct 16 '22