I am wondering about the relationship between a block of samples and its time equivalent. This is my rough idea so far:
Number of samples played per second = total filesize / duration.
So say I have a 1.02 MB file with a duration of 12 seconds (on average); that gives me about 89,300 samples played per second. Is this right?
Are there other ways to compute this? For example, how can I work out how much time a byte[1024] array corresponds to?
A typical digital audio recording has 44,100 samples every second, although it is not unusual to see 96,000 samples per second in some digital audio formats. 44.1 kHz (44,100 samples per second) is the most common sample rate for music CDs and perhaps the most popular rate in consumer digital audio generally; it captures the entire audible frequency spectrum accurately, so recording at 44.1 kHz is a safe option that will give you high-quality recordings for most audio projects.
The sampling frequency or sampling rate, fs, is the average number of samples obtained in one second, so fs = 1/T, where T is the sampling period. Its units are samples per second or hertz; e.g. 48 kHz is 48,000 samples per second. Reconstructing a continuous function from samples is done by interpolation algorithms.
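As a quick illustration of fs = 1/T (the 48 kHz rate and the sample count below are arbitrary example values, not anything taken from your file):

```java
public class SampleTiming {
    public static void main(String[] args) {
        // Example sample rate: 48 kHz, i.e. 48,000 samples per second.
        double sampleRateHz = 48_000.0;

        // Sampling period T = 1 / fs, in seconds per sample.
        double periodSeconds = 1.0 / sampleRateHz;
        System.out.printf("One sample lasts %.2f microseconds%n", periodSeconds * 1e6);

        // N samples cover N / fs seconds.
        int numSamples = 4800;
        System.out.printf("%d samples cover %.3f seconds%n", numSamples, numSamples / sampleRateHz);
    }
}
```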
Generally speaking, for PCM samples you can divide the total length (in bytes) by the duration (in seconds) to get the number of bytes per second (for WAV files there will be some inaccuracy to account for the header). How these bytes translate into samples depends on:

1) the sample rate
2) the number of bits per sample (e.g. 16 bits = 2 bytes per sample)
3) the number of channels (mono = 1, stereo = 2)

If you know 2) and 3), along with the bytes per second, you can determine 1).
In your example, 89,300 bytes/second, assuming stereo and 16 bits per sample (2 channels × 2 bytes = 4 bytes per sample frame), would give 89300 / 4 ≈ 22 kHz sample rate.
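To answer the byte[1024] part directly, here is a minimal sketch. The format values (44.1 kHz, 16-bit, stereo) are assumptions for illustration; substitute whatever your stream actually uses:

```java
public class BufferDuration {
    /**
     * Returns how many seconds of audio a PCM buffer of the given size holds,
     * given the stream format (sample rate, bytes per sample, channel count).
     */
    static double bufferSeconds(int bufferLengthBytes, int sampleRateHz,
                                int bytesPerSample, int channels) {
        int bytesPerSecond = sampleRateHz * bytesPerSample * channels;
        return (double) bufferLengthBytes / bytesPerSecond;
    }

    public static void main(String[] args) {
        // Assumed format: 44,100 Hz, 16-bit (2-byte) samples, stereo (2 channels).
        double seconds = bufferSeconds(1024, 44_100, 2, 2);
        // 1024 / (44100 * 2 * 2) = 1024 / 176400 ≈ 0.0058 s, i.e. about 5.8 ms.
        System.out.printf("A byte[1024] buffer holds about %.1f ms of audio%n", seconds * 1000);
    }
}
```

With the roughly 89,300 bytes/second figure derived above, the same 1024-byte buffer would instead hold about 1024 / 89300 ≈ 11.5 ms of audio.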