What is a good set of constraints for lowest latency audio playback/monitoring with the MediaStream Recording API?

I'm currently spiking out a music application with HTML5/JS and am attempting to achieve the lowest latency I can with the MediaStream Recording API. The app allows a user to record music with a camera and microphone. While the camera and microphone are on, the code will allow the user to hear and see themselves.

At the moment I have:

  const stream = await navigator.mediaDevices.getUserMedia(
    {
      video: true,
      audio: {
        latency: {exact: 0.003},
      }
    }
  );

  // monitor video and audio (i.e. show it to the user)
  this.video.srcObject = stream;
  this.video.play();

If I go any lower on the latency requirement, I get an OverConstrained error. The latency is okay (better than the default) but still not great for the purposes of hearing yourself while you're recording. There is a slight, perceptible lag from when you strum a guitar and hear it in your headphones.

Are there other optimizations here I can make to achieve better results? I don't care about the quality of the video and audio as much, so maybe lowering resolution, sample rates, etc. could help here?
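
For example, I'm imagining constraints along these lines (just a sketch with hypothetical values; I haven't verified that these actually reduce latency):

  const stream = await navigator.mediaDevices.getUserMedia({
    video: {
      width:     {ideal: 320},   // lower resolution
      height:    {ideal: 240},
      frameRate: {ideal: 15},
    },
    audio: {
      latency:          {exact: 0.003},
      sampleRate:       {ideal: 22050},  // lower sample rate
      channelCount:     {ideal: 1},      // mono
      echoCancellation: false,           // skip processing that can add delay
      noiseSuppression: false,
      autoGainControl:  false,
    }
  });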

asked Aug 20 '19 by Tomek


People also ask

What is a good latency for audio recording?

While some keyboardists claim to hear a 5ms discrepancy in their performances, the vast majority of musicians are unlikely to worry about 10ms, and many should find a latency of 23ms or more perfectly acceptable with most sounds, especially pads with longer attacks.

What is low latency recording?

Low Latency Mode bypasses plug-ins as needed, so the amount of latency doesn't exceed the Limit setting in the Plug-in Latency section of the General Audio preferences of Logic Pro. Low latency mode is especially useful when you want to record a software instrument in a project that includes latency-inducing plug-ins.

What affects audio latency?

Latency can be caused by many factors, including both analog-to-digital and digital-to-analog conversion, buffering, digital signal processing, transmission time and the audio speed in the transmission medium. This delay can be a critical performance consideration in several pro audio applications.


1 Answer

A latency of 0.003 seconds (3 ms) is a very, very low latency and is not noticeable by the human ear.

That said, latency cannot be 0 when we are talking about digital audio. Even though you set a very low value, there is no guarantee that the latency will actually be matched, for various reasons; if the system can't match the requested latency, the promise will be rejected.

As you can read in the docs:

Constraints which are specified using any or all of max, min, or exact are always treated as mandatory. If any constraint which uses one or more of those can't be met when calling applyConstraints(), the promise will be rejected.
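
If you want the browser to aim for a low latency without risking that rejection, you can use ideal instead of exact; a minimal sketch (the value is only an example):

const stream = await navigator.mediaDevices.getUserMedia({
  video: true,
  audio: {
    // "ideal" is a hint: the browser gets as close as it can
    // instead of rejecting with an OverconstrainedError
    latency: {ideal: 0.003},
  },
});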

Note: different browsers and operating systems behave differently.

Chrome

Chrome, in some Canary builds, introduced a low-latency feature called Live Web Audio Input:

// success callback when requesting audio input stream
function gotStream(stream) {
    window.AudioContext = window.AudioContext || window.webkitAudioContext;
    var audioContext = new AudioContext();

    // Create an AudioNode from the stream.
    var mediaStreamSource = audioContext.createMediaStreamSource( stream );

    // Connect it to the destination to hear yourself (or any other node for processing!)
    mediaStreamSource.connect( audioContext.destination );
}

navigator.getUserMedia = navigator.getUserMedia || navigator.webkitGetUserMedia;
navigator.getUserMedia( {audio:true}, gotStream );

Here you can see some demos that take advantage of that feature in action:

  • Live vocoder
  • Live input visualizer
  • Pitch detector
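
For newer browsers, a rough equivalent of the snippet above using the Promise-based API and an explicit latencyHint might look like this (a sketch; 'interactive' is typically the default hint anyway, and actual latency still depends on the hardware and OS):

// Ask the audio stack for the lowest practical output latency.
const audioContext = new AudioContext({latencyHint: 'interactive'});

navigator.mediaDevices.getUserMedia({audio: true})
  .then((stream) => {
    // Create an AudioNode from the stream.
    const mediaStreamSource = audioContext.createMediaStreamSource(stream);

    // Connect it to the destination to hear yourself
    // (or to any other node for processing).
    mediaStreamSource.connect(audioContext.destination);

    // The context may need audioContext.resume() after a user gesture.
  });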
answered Oct 01 '22 by Mosè Raguzzini