SMF for DIYStompboxes.com!

Behringer DD600 24bit CODEC

Started by siro78, April 13, 2011, 05:04:10 AM

siro78

Hi everyone.
I have disassembled a Behringer DD600, a new-generation digital delay with the RSM (Real Sound Modeling) system, and I noticed that the audio codec on board is an AKM AK4555VT:
http://www.asahi-kasei.co.jp/akm/en/product/ak4555/ak4555_f01e.pdf

This is a 20-bit CODEC, not a 24-bit CODEC!!! Behringer talks about 24-bit conversion on this pedal. Is this a fraud???

MetalGuy

Even a 20 bit codec is too much for a delay.

cloudscapes

Yes it's dishonest, but I doubt you'd hear the difference. There are probably fidelity bottlenecks all over the design. The couple times I had Behringer gear, the noise floor wasn't exactly silent.
~~~~~~~~~~~~~~~~~~~~~~
{DIY blog}
{www.dronecloud.org}

PRR

> Is this a fraud???

It's not a fraud, though it's not really clear.

> Behringer talks about 24-bit conversion

No, they say "24-bit high-resolution stereo delay/echo/panning".

Think back to basic arithmetic. When you add two 3-digit numbers you can get a 4-digit result:

900 + 900 = 1800 (two 3-digit inputs, a 4-digit result)

When you multiply two 3-digit numbers you can get a 6-digit result:

400 * 400 = 160000

The "delay/echo/panning" is a LOT of addition and some multiplication (or division, to set lower level).

If the input is 20-bit and you do more than a little manipulation, you can have 40-bit intermediate answers. Even though the output is only 20-bit, if you do many-many operations and round-off many intermediate results at 20 bits, you get "noise" above the 20th bit.
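That bit growth is easy to verify in a few lines of Python (an illustrative sketch added here, not part of the original post):

```python
# Bit growth in fixed-point math: adding two n-bit words can need
# n+1 bits; multiplying them can need 2n bits.
a = (1 << 20) - 1             # largest 20-bit value

print((a + a).bit_length())   # 21: the sum needs one extra bit
print((a * a).bit_length())   # 40: the product needs double the width
```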

The early DAWs were 16 bit throughout. For a simple transfer or one gain-change, 16 bit math is ample. When you do more complicated manipulations such as EQ or reverb, all the round-off errors accumulate and give a very nasty sound.

The digital processing "should" be twice as deep as the useful input/output resolution. Since this is expensive, some compromise may be necessary.
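A toy quantizer shows the accumulation effect (the step count and gain value are arbitrary, chosen only for illustration):

```python
def q16(x):
    """Round x to the nearest 16-bit fixed-point step (1/32768)."""
    return round(x * 32768) / 32768

# Chain 200 tiny gain changes, rounding to 16 bits after each one,
# versus carrying full double precision throughout.
x16 = x64 = 0.5
for _ in range(200):
    x16 = q16(x16 * 1.001)   # round off the intermediate result
    x64 = x64 * 1.001        # full-precision reference

print(abs(x16 - x64))        # accumulated round-off error, not zero
```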

> Even a 20 bit codec is too much for a delay

It is generally "plenty" for any analog interface. 20 bits is 120 dB signal-to-noise ratio. Very few analog links have 120 dB between self-noise and overload. No listening room has 120 dB useful range: 20 dB SPL is a very low background noise, and 20 + 120 = 140 dB SPL is beyond deafening.

Realistically we rarely need more than 16 bits (96 dB) on the analog side. Very very VERY often we only have 12 bits (72 dB) of useful analog signal. For very elaborate processing we might like 32-bit arithmetic, but many useful sounds can be processed with 24-bit arithmetic very very well.
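All those dB figures come from the usual rule of about 6 dB per bit, i.e. 20*log10(2**n); a quick check:

```python
import math

def snr_db(bits):
    """Ideal dynamic range of an n-bit converter: 20*log10(2**n),
    about 6.02 dB per bit (ignoring the +1.76 dB sine-wave term)."""
    return 20 * math.log10(2 ** bits)

for b in (12, 16, 20, 24):
    print(f"{b} bits ~ {snr_db(b):.0f} dB")   # 72, 96, 120, 144 dB
```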

> This is a 20-bit CODEC

Yes, it has 20-bit slots. However, 89 dB dynamic S/N is no better than 16-bit quality. Taking the maximum level as 0.5 V, 89 dB implies analog noise near 18 uV, which is very typical of non-special analog ports. "20 bits" implies noise near 0.5 uV, which is incredibly low and unlikely.
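Both noise figures follow from the same formula, assuming PRR's 0.5 V maximum level:

```python
# Noise level implied by a dynamic-range spec, referenced to a
# 0.5 V maximum signal: V_noise = V_max * 10**(-dB/20).
v_max = 0.5

for db in (89, 120):
    v_noise_uv = v_max * 10 ** (-db / 20) * 1e6
    print(f"{db} dB -> {v_noise_uv:.1f} uV")   # 17.7 uV and 0.5 uV
```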

And for fifty bucks..... it is a fabulous lot of bits for the buck.

tommy.genes

But since it is a delta-sigma converter, what exactly does "20-bit" resolution mean? The output is ultimately a stream of serial bits, not parallel data lines. Does the device have sufficient oversampling to offer resolution equivalent to 20-bit PCM? But with a max sampling frequency of 50 kHz, that doesn't seem likely either, does it?


-- T. G. --
"A man works hard all week to keep his pants off all weekend." - Captain Eugene Harold "Armor Abs" Krabs

PRR

> output is ultimately a stream of serial bits, not parallel data lines. ... max sampling frequency of 50kHz

Read the DATASHEET. Yes, it isn't clear; it assumes you've worked with such chips a lot. But there are clues.

The Master Clock can go into MHz.

The digital interface is serial; it has to be, since there are only 16 pins on the chip! 20 bits at 50K samples per second is only a 1 MHz stream, 2 MHz for stereo.
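Spelled out, using the 20-bit words and 50 kHz maximum sample rate mentioned above:

```python
# Serial bit rate needed for the codec's data line:
# word length x sample rate x channel count.
bits_per_word = 20
sample_rate = 50_000                        # 50 kHz maximum

for channels in (1, 2):
    mbps = bits_per_word * sample_rate * channels / 1e6
    print(f"{channels} ch: {mbps} Mbit/s")  # 1.0 mono, 2.0 stereo
```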

Delta-sigma may be the internal conversion technique. This is entirely appropriate for signal you will listen to (nuclear scintillation data maybe not so good). However the external data is clearly "words", output MSB-first. This just means it has a register which holds the last value, adds the delta-sigma result, and reels the bits out serially.
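That hold-and-shift arrangement can be sketched as an MSB-first shift register (illustrative only, not the chip's actual logic):

```python
def serialize_msb_first(word, bits=20):
    """Reel a parallel word out one bit at a time, MSB first."""
    return [(word >> (bits - 1 - i)) & 1 for i in range(bits)]

def deserialize(stream):
    """Collect serial bits back into a parallel word."""
    word = 0
    for b in stream:
        word = (word << 1) | b
    return word

sample = 0b1011_0011_0011_0011_0011        # an arbitrary 20-bit value
assert deserialize(serialize_msb_first(sample)) == sample
```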

tommy.genes

Thanks.

I did well enough in college studying PCM, Nyquist and all that. It's only when I started taking apart delay pedals that I noticed some of them were dealing with serial data and not parallel data lines. That led me to learn about delta-sigma and pulse-density modulation, but I haven't fully wrapped my brain around it yet. It now seems potentially even more confusing, in that serial data could be either PDM or serially-coded PCM.


PRR

> dealing with serial data and not parallel

In books, it is convenient to imply a 16-bit bus.

On PCB, it is a LOT cleaner to run one wire.

The trade-off is that this wire has to be 16 times faster.

But at ~50 kHz sample rate, for wires from an inch to a foot, the ~1 MHz data rate is not a hard problem.

And for wires over a few feet, parallel is not only costly, but small path-length differences mean all 16 bits and the clock don't arrive at the same instant: "skew". Not so bad for 6-foot 8-bit printer cables working well under 1 MHz. Cost for 50 feet of 12-conductor is brutal, and even near 100 kHz you can have skew errors and pattern-dependent errors.

8, 16, even 32 flip-flops to shift-register parallel-serial-parallel are, on modern chips, far cheaper than 8-, 16-, or 32-contact connectors. Not to mention the copper wire.

> serial data could be either PDM or serially-coded PCM.

Could be ANY thing.

But more than likely, the next chip would like simple sequential 16-bit (or whatever) words, as if reading an MS WAV file. And is willing to provide a register to take the data on one pin to de-serialize for its internal use.

Serial gets into problems of framing and clocking. In a run of apparently random bits, where do "words" start? This is an old-old-old problem with lots of solutions. In Teletype there are 6 data bits and 2 stop-bits. Internally there is a finger which jams a gear if it stops for more than 1.5 bit-times. In random data it could catch randomly, but when getting TeleType data it will tend to catch the 2-bit stop gap, then be in-sync for the next character-word. Now that "gears and fingers" are dots on silicon, far trickier framing and sync is possible. And cheap.
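The "catch the stop gap" idea can be sketched in Python (a toy model of the 6-data-bit, 2-stop-bit framing described above; real sync logic is more elaborate):

```python
def frame(ch):
    """Teletype-style framing: one start bit (0), 6 data bits
    (LSB first), two stop bits (1, 1)."""
    return [0] + [(ch >> i) & 1 for i in range(6)] + [1, 1]

def deframe(stream):
    """Sync on a start bit following the two-bit stop gap,
    then pull out 6-bit words (assumes the stream begins on
    a frame boundary when no gap precedes it)."""
    chars, i = [], 0
    while i + 9 <= len(stream):
        if stream[i] == 0 and (i == 0 or stream[i - 2:i] == [1, 1]):
            bits = stream[i + 1:i + 7]
            chars.append(sum(b << k for k, b in enumerate(bits)))
            i += 9                     # skip to the next frame
        else:
            i += 1                     # hunt for sync
    return chars

msg = [5, 42, 17]
stream = sum((frame(c) for c in msg), [])
assert deframe(stream) == msg
```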

> taking apart delay pedals that I noticed

Audio generally today moves serially, except maybe from RAM through CPU to sound-card. Or inside processor chips where you want to crunch 16- or 24-bit numbers between each sample. (There have been "1-bit" processors but awful slow to gobble wide data.)

FWIW, "all" networking (telegraph, TeleType, 2400 phone modem, AppleTalk, original EtherNet, Token-Ring, T1, cheapernet, UTP hubbed "ethernet") is serial data.

So are essentially all disk drives. The ST-506 interface presented 8-bit words to the CPU bus, but it collected serial bits from the head-decoder into a register.

RAM is often built and studied parallel, and parallel makes sense for very fast, very short data-paths. Skew is an issue, annoying to the PCB layout. Some things we handle as "funny RAM", such as thumb-drives, are clearly serial at the USB jack, and could be serial inside for all we know or care.

The standard digital interface from CD-ROM drives is serial, and is the model for stuff like S/PDIF audio interfaces. And USB, though USB has grown far beyond that.

Seeing as how there is a parallel register inside and a serial path in/out, and it isn't a HIGH-spec converter, "Delta Sigma" seems like just buzz-words. Why do we care if it is SAR, D/S, 65 thousand flash-comparators, ramp-integration, or a teeny elf with a fast voltmeter?