 

Playing and recording audio in sync with getUserMedia/Web Audio API

I'm currently working on a web-based collaborative recording platform for musicians, something like a basic DAW ported to the web (with extra social/sharing features). Anyway, my goal is to make it 100% Flash-free, so I've been reading a lot about HTML5 and, in particular, the Web Audio API (this book helped a lot, btw).

To record audio from the user's microphone via getUserMedia(), I made a custom version of RecorderJS. In a nutshell, I'm routing the output from getUserMedia() to a ScriptProcessorNode which, every 4096 samples, writes the contents of its inputBuffer to an array that is later exported as a PCM WAV file. So far, it works fine.
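Roughly, the relevant part looks like this (a simplified sketch, not my exact code; the node wiring follows the usual RecorderJS pattern and only runs in a browser, while `mergeBuffers` is a plain helper):

```javascript
// Browser-side wiring (sketch of a RecorderJS-style setup):
//
//   navigator.getUserMedia({ audio: true }, function (stream) {
//     var source = audioContext.createMediaStreamSource(stream);
//     var processor = audioContext.createScriptProcessor(4096, 2, 2);
//     processor.onaudioprocess = function (e) {
//       // Copy! The underlying buffer is reused by the audio thread.
//       chunks.push(new Float32Array(e.inputBuffer.getChannelData(0)));
//     };
//     source.connect(processor);
//     processor.connect(audioContext.destination);
//   }, onError);

// Pure helper: flatten the recorded 4096-sample chunks into one
// Float32Array before encoding them as PCM WAV.
function mergeBuffers(chunks, totalLength) {
  var result = new Float32Array(totalLength);
  var offset = 0;
  for (var i = 0; i < chunks.length; i++) {
    result.set(chunks[i], offset);
    offset += chunks[i].length;
  }
  return result;
}
```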

The problem is that the start of the recording procedure involves two things: playing all the previously recorded tracks, so the musician has a reference to play on top of, and starting the actual recording (writing the buffer to the array, that is).

Although there is no audible latency or delay from the sound of the microphone when the user is recording, when the recording ends and all the tracks are played, the newly recorded track has a slight delay.

What can be causing this? What are the possible solutions?

I thought I could find the time difference between both events by also sending the playback to the same processor node, and then working out when each of them actually begins, to compensate for any delay. For this, I would need the ScriptProcessorNode to receive, for example, the getUserMedia audio on channels 1 and 2, and the playback on channels 3 and 4, but I can't make this work. I tried routing both sources to the processor node, and I also tried a Merger/Splitter, but nothing seems to work: both sources reach the processor node on channels 1 and 2, while channels 3 and 4 come up empty.
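For reference, this is roughly the routing I attempted (a sketch; the explicit channelCount settings are my guess at what's needed to stop the up/down-mixing, and browser support for discrete channel mapping may vary):

```javascript
// Browser-only sketch: give the processor four input channels and force
// discrete channel mapping, otherwise every connection gets mixed down
// to stereo before onaudioprocess ever sees it.
//
//   var processor = context.createScriptProcessor(4096, 4, 1);
//   var merger = context.createChannelMerger(4);
//   merger.channelCountMode = 'explicit';       // assumption
//   merger.channelInterpretation = 'discrete';  // assumption
//   micSplitter.connect(merger, 0, 0);          // mic L  -> channel 1
//   micSplitter.connect(merger, 1, 1);          // mic R  -> channel 2
//   playbackSplitter.connect(merger, 0, 2);     // play L -> channel 3
//   playbackSplitter.connect(merger, 1, 3);     // play R -> channel 4
//   merger.connect(processor);

// Pure helper: inside onaudioprocess, compare when each signal starts.
// Returns how many samples later the mic signal begins relative to the
// playback signal (illustrative name).
function onsetDelaySamples(micChannel, playbackChannel, threshold) {
  function onset(ch) {
    for (var i = 0; i < ch.length; i++) {
      if (Math.abs(ch[i]) >= threshold) return i;
    }
    return -1;
  }
  return onset(micChannel) - onset(playbackChannel);
}
```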

Sorry if this is off-topic or doesn't contain actual code (which I am more than happy to provide if necessary), but there is not much stuff done on this area and so any ideas would be very welcome.

Thanks in advance!

asked Jun 23 '13 17:06 by user1276108



2 Answers

You can look at how Audacity does latency correction and take inspiration from it.

Basically, you output a sound (a click track, for example) and record it at the same time, in order to measure how many milliseconds it takes for the sound to be played and then picked up by the recording.

This amount of time is the latency you need to compensate on playback.
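As a rough sketch of the measurement (names are illustrative, not any specific library's API): play a click at a known buffer position, record the mic at the same time, then search the recording for where the click actually lands:

```javascript
// Find the first sample whose amplitude crosses a threshold -- a crude
// but workable way to locate a click in an otherwise quiet recording.
function findClickIndex(samples, threshold) {
  for (var i = 0; i < samples.length; i++) {
    if (Math.abs(samples[i]) >= threshold) return i;
  }
  return -1;
}

// Round-trip latency in milliseconds, given where the click was expected
// and where it actually showed up in the recorded buffer.
function latencyMs(expectedIndex, recordedIndex, sampleRate) {
  return (recordedIndex - expectedIndex) * 1000 / sampleRate;
}
```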

In other words, after you record a track you need to shift it back by the measured latency, so that it plays in sync with the previously recorded tracks.
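A minimal sketch of that shift (assuming the recording and the reference tracks share a sample rate; `compensate` is an illustrative name):

```javascript
// Drop the measured latency from the start of the newly recorded take
// so it lines up with the other tracks.
function compensate(recorded, latencySamples) {
  // subarray() is a cheap view; use slice() if you need an actual copy.
  return recorded.subarray(Math.min(latencySamples, recorded.length));
}
```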

Each system has its own latency, so you can't just measure it once and for all. You need to build a feature into your program that lets users run the latency calibration in the easiest possible way.

Two examples of latency calibration on existing software:

  • Audacity Latency Test
  • Ardour automatic latency detection
answered Nov 15 '22 03:11 by Giacomo


I'm trying to do the same thing - an HTML5 multitrack recorder. The WebKit-enabled stuff is just not ready for prime time, but Recorder.js is very promising. http://carolwith.me is a Flash-based multitrack recorder that does exactly what I want (except I, too, want HTML5, not Flash). Have a look - it's unbelievably goofy! And if you play with it, it doesn't really sync either.

I was also looking for an algorithm that does a count-in (pre-roll) and then lines the subsequently recorded tracks up against it. I found a guy who had a possible solution, but he abandoned it and then took his site down.

Let's keep trying! I've been at it since 2010, and I'm sure the hardest part (getUserMedia) will become a standard.

answered Nov 15 '22 05:11 by Hans Delbrück