On Thursday 13 November 2003 16:48, J_Zar wrote:
I've done some tests on a bunch of songs in different compressed formats
(sample rate = 44100): MP3 and Ogg. For the MP3 format I tested various
bitrates and found that, during playback, this format has a value of
26.12 milliseconds/frame (meaning that every frame covers 26.12 ms!).
My question: is there a general algorithm to calculate the ms/frame value
for all compressed formats? Could someone confirm my values? Are these
values affected by some parameters (surely the sample rate, I think...)?
Why different values for Ogg and MP3? Is Ogg affected by bitrate?
An MPEG-1 Layer Two or Layer Three frame always contains 1152 PCM samples
per channel, as per the standard (a Layer One frame contains 384). Thus, the
time length of a Layer Two or Three frame would be:
l = 1152 / fs

where

l  = length of frame in ms
fs = sample rate in kHz
Thus, for your example case, the calculated length would be:

1152 / 44.1 = 26.12 ms

in good agreement with your measured result.
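
If anyone wants to compute this programmatically, here is a minimal C
sketch of the same formula (the function name and the samples-per-frame
values are my own illustration, not part of any decoder API):

#include <stdio.h>

/* Milliseconds covered by one MPEG frame, given the number of PCM
 * samples per frame and the sample rate in Hz. */
static double frame_length_ms(int samples_per_frame, int sample_rate_hz)
{
    return 1000.0 * samples_per_frame / sample_rate_hz;
}

int main(void)
{
    /* MPEG-1 Layer III: 1152 samples per frame. */
    printf("Layer III @ 44100 Hz: %.2f ms/frame\n",
           frame_length_ms(1152, 44100));  /* prints 26.12 */

    /* MPEG-1 Layer I: 384 samples per frame. */
    printf("Layer I   @ 44100 Hz: %.2f ms/frame\n",
           frame_length_ms(384, 44100));   /* prints 8.71 */
    return 0;
}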
I don't know the details for Vorbis offhand, but see:
http://www.xiph.org/
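
One relevant difference, as far as I understand the Vorbis I spec: there is
no single fixed frame length. Each packet uses one of two block sizes
negotiated in the stream header, and a packet decodes to
(previous_blocksize + current_blocksize) / 4 PCM samples, so the per-packet
duration varies with the audio content rather than with the bitrate. A
minimal sketch, assuming the typical 256/2048 block sizes (these are common
encoder choices, not mandated by the spec):

#include <stdio.h>

/* Milliseconds of PCM produced by one Vorbis packet. Per the Vorbis I
 * spec, a packet yields (previous_blocksize + current_blocksize) / 4
 * samples. */
static double vorbis_packet_ms(int prev_blocksize, int cur_blocksize,
                               int sample_rate_hz)
{
    int samples = (prev_blocksize + cur_blocksize) / 4;
    return 1000.0 * samples / sample_rate_hz;
}

int main(void)
{
    /* Two long blocks: (2048 + 2048) / 4 = 1024 samples. */
    printf("long/long : %.2f ms\n",
           vorbis_packet_ms(2048, 2048, 44100));  /* ~23.22 */

    /* A short block after a long one: (2048 + 256) / 4 = 576 samples. */
    printf("long/short: %.2f ms\n",
           vorbis_packet_ms(2048, 256, 44100));   /* ~13.06 */
    return 0;
}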
Cheers!
|-------------------------------------------------------------------------|
| Frederick F. Gleason, Jr. | Director of Broadcast Software Development |
| | Salem Radio Labs |
|-------------------------------------------------------------------------|
| Logic is a way to go wrong with confidence. |
| --Robert Heinlein |
| "The Notebooks of Lazarus Long" |
|-------------------------------------------------------------------------|