Time delay estimation between two audio signals

I have two audio recordings of the same signal, made by two different microphones (in WAV format, for example), but one of them is recorded with a delay of, say, several seconds.

It's easy to identify such a delay visually by viewing the signals in some kind of waveform viewer - i.e. just spotting the first visible peak in each signal and checking that the peaks have the same shape:


[screenshot: two waveforms of the same signal, the second offset by the delay t] (source: greycat.ru)

But how do I find this delay (t) programmatically? The two digitized signals are slightly different, because the microphones differ, were at different positions, had different ADC setups, etc.

I've dug around a bit and found out that this problem is usually called "time-delay estimation", and that there are myriad approaches to it - for example, one of them.

But are there any simple, ready-made solutions available - a command-line utility, a library, or a straightforward algorithm?

Conclusion: I found no simple existing implementation, so I wrote a simple command-line utility myself - available at https://bitbucket.org/GreyCat/calc-sound-delay (GPLv3-licensed). It implements the very simple search-for-maximum algorithm described on Wikipedia.
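For reference, the core of that search-for-maximum idea fits in a few lines. Here is a minimal Python sketch (not the actual calc-sound-delay source; the function names and the max_lag parameter are my own):

    import numpy as np

    def corr_at(a, b, lag):
        # Cross-correlation of a with b at a single lag: sum_i a[i] * b[i + lag].
        if lag < 0:
            a, b, lag = b, a, -lag  # symmetric case: swap signals, flip the sign
        n = min(len(a), len(b) - lag)  # length of the overlapping region
        return np.dot(a[:n], b[lag:lag + n]) if n > 0 else 0.0

    def estimate_delay_samples(a, b, max_lag):
        """Lag (in samples) at which b best matches a; positive means b lags a."""
        return max(range(-max_lag, max_lag + 1), key=lambda lag: corr_at(a, b, lag))

Divide the result by the sample rate to get the delay in seconds. This brute-force search is O(max_lag * N), which is fine for short clips; an FFT-based correlation (see the answers below) scales better for long recordings.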

asked Feb 11 '11 by GreyCat

People also ask

How do you find the delay between two signals?

d = finddelay(x,y) returns an estimate of the delay d between input signals x and y. Delays in x and y can be introduced by prepending zeros. d = finddelay(x,y,maxlag) uses maxlag to find the estimated delay(s) between x and y.

What is delay time estimation?

Time delay estimation (TDE) is the problem of estimating the time delay between two received signals that originated from the same transmitter. This estimation problem is of fundamental importance in radar signal processing for detecting the presence of targets and identifying radar transmitters.

How do you calculate sample delay?

The formula is simple: milliseconds times the sample rate (in kHz) = number of samples. For example, if the delay between a pair of room microphones and a soundboard feed in a home studio is 17 milliseconds (based on 17 feet of distance), the formula becomes: 17 × 44.1 = 749.7 samples.


2 Answers

The technique you're looking for is called cross-correlation. It's a very simple, if somewhat compute-intensive, technique that can be used to solve various problems, including measuring the time difference (aka lag) between two similar signals (the signals do not need to be identical).

If you have a reasonable idea of your lag value (or at least the range of lag values to expect), you can reduce the total amount of computation considerably. Ditto if you can put a definite limit on how much accuracy you need.
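To make that concrete, here is one way it might look in Python with SciPy's FFT-accelerated correlation (a sketch under assumptions, not Paul R's implementation: the function name and the fs and max_lag_s parameters are mine, and scipy.signal.correlation_lags needs SciPy 1.6+):

    import numpy as np
    from scipy import signal

    def estimate_delay_seconds(a, b, fs, max_lag_s=None):
        """Seconds by which b is delayed relative to a (negative if b leads)."""
        corr = signal.correlate(b, a, mode="full", method="fft")
        lags = signal.correlation_lags(len(b), len(a), mode="full")
        if max_lag_s is not None:
            # If the expected delay range is known, only consider lags inside
            # it; this guards against picking a spurious distant peak.
            keep = np.abs(lags) <= int(max_lag_s * fs)
            corr, lags = corr[keep], lags[keep]
        return lags[np.argmax(corr)] / fs

For WAV input, scipy.io.wavfile.read returns both the sample rate and the samples (convert to mono float before correlating).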

answered by Paul R


A very straightforward thing to do is just to check whether the peaks exceed some threshold; the time between the high peak on line A and the high peak on line B is probably your delay. Just tinker a bit with the thresholds, and if the graphs are usually as clear as the picture you posted, you should be fine.
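A rough sketch of that idea in Python (the threshold default and the names are placeholders to tune for your recordings):

    import numpy as np

    def first_peak_delay_seconds(a, b, fs, threshold=0.5):
        """Delay of b relative to a, in seconds, from the first sample in each
        signal whose absolute amplitude exceeds threshold (signals in [-1, 1])."""
        over_a, over_b = np.abs(a) > threshold, np.abs(b) > threshold
        if not (over_a.any() and over_b.any()):
            raise ValueError("threshold never exceeded; try a lower value")
        # argmax on a boolean array returns the index of the first True
        return (np.argmax(over_b) - np.argmax(over_a)) / fs

This is much cheaper than cross-correlation, but fragile if the recordings are noisy or the first transient looks different through each microphone.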

answered by Roy T.