WHAT ABOUT ANALOG DELAY WITH DIGITAL TEMPO?

Started by Mugshot, February 10, 2010, 10:32:50 PM

Previous topic - Next topic

Mugshot

ok, i'll rephrase it. :D Potentially dumb question: anybody thought of an analog delay with a digital interface? i mean instead of turning a knob to set the speed, you press numbers, say to set the delay tempo to 120 BPM you just press 1-2-0. i know it's hard to describe what i have in mind, the words won't come out right  :icon_biggrin:

i am what i am, so are you.

R.G.

In response to the questions in the forum - PCB Layout for Musical Effects is available from The Book Patch. Search "PCB Layout" and it ought to appear.

Br4d13y

i would have to say the marriage of an LCD display, keypad, and concentric pot would probably be the way to go
freedom is the freedom to say 2+2=4

Mugshot

im thinking of an LCD to display the BPM and a numeric keypad that allows faster entry of the preferred time, something that controls only the speed aspect of the delay. that'd be cool :D
i am what i am, so are you.

G. Hoffman

Well, there are at least two BBD delays on the market that have tap tempo (the Seymour Duncan delay, and one other I can't remember at the moment, plus any I'm not aware of!), so it would certainly be possible.  I'm also pretty sure that a member of this board had posted pictures somewhere of something similar.  If I remember correctly, it required a large box full of SOP parts.


Gabriel

cpm

BBD delay time is based on a fixed-rate clock signal, so it can be calculated to the exact ms or bpm.

So your problem now is to build all the hardware, software, and interactive interface to generate a desired frequency; at this point it doesn't really matter that it's a delay you're controlling.

RG smiles, he knows...  :o

StephenGiles

Quote from: cpm on February 11, 2010, 08:34:17 AM
BBD delay time is based on a fixed-rate clock signal, so it can be calculated to the exact ms or bpm.

So your problem now is to build all the hardware, software, and interactive interface to generate a desired frequency; at this point it doesn't really matter that it's a delay you're controlling.

RG smiles, he knows...  :o


and by the time you've decided how many bpm to set, the band have already finished the next song!
"I want my meat burned, like St Joan. Bring me pickles and vicious mustards to pierce the tongue like Cardigan's Lancers.".

Mark Hammer

Part of the smile is because the circuitry to do that would likely be 2-3 times more extensive than what was required to produce the actual delay.

Though now that I think of it, there is no reason why the clock itself would have to be analog.  It's not like the clock frequencies used to push an analog sample through a couple of thousand FET stages are so high that a PIC could not generate them, right?  So: read a trio of rotary switches, and output the high-frequency clock required to produce that delay.  I imagine the PIC itself might not be ideally suited to producing clock pulses that stay nice and crisp in the face of all that input capacitance on the BBD, but that's what buffers are for.

potul

I'm with Mark. A PIC would be suitable to do the digital part of the job without tons of associated circuitry.
What range of frequencies would the PIC need to generate?



R.G.

Quote from: potul on February 11, 2010, 12:49:43 PM
I'm with Mark. A PIC would be suitable to do that digital part of the job without tons of circuitry associated.
What range of frequencies would be required to be generated by the PIC?
Time delays are simple, guys. You have a delay line, either "analog" (really, sampled analog) or digital. Each bucket in the delay line is passed down the line on each clock cycle. (to the dedicated fact-checkers: yes, I know that clocks may be more complex than one-for-one; let the guy understand the basics first, OK?)

So the total delay is the clock period (one divided by the clock frequency) times the number of delay steps available. The venerable SAD1024 had either 512 or 1024 steps available, depending on how you hooked it up. If you fed it a clock of 100kHz, then each step is 1/100,000 s, or ten microseconds. Ten microseconds times 1024 is 0.01024 s, or about ten milliseconds. Feed it a clock of 1MHz and each step is one microsecond, so the delay is (roughly) one thousand microseconds, or one millisecond.

The frequencies are then the number of steps you have available divided by the time you want to delay. Want 10 milliseconds and have 1024 steps? That's 1024/0.01 = 102400Hz, or about the 100kHz we used.
R.G.

In response to the questions in the forum - PCB Layout for Musical Effects is available from The Book Patch. Search "PCB Layout" and it ought to appear.

potul

Thanks for the explanation. So the maximum needed clock freq will be determined by the minimum delay time we want to handle, right?
Where would the clock freq minimum be? I guess that anything below 20 kHz could start adding audible noise (?)

rustypinto

PICs! Wonderful/ horrible little devices...

The frequency range is whatever you want it to be, but also limited to the BBD's (if thats what you end up using) USABLE limits. When i coded a controller for an analog delay, i did use RG's method. I also made this graph to help me wrap my mind around the system:

[image missing: rustypinto's graph of the clock/delay relationship for the three-BBD chain]

Note: this is for three 4096-stage BBDs. I could not apply this data (in the controller) to two 4096-stage BBDs and get the same, accurate results over the applicable delay range (which in my case is ~65ms-700ms).

There is also a tremendous amount of info out there for PICs. I'll see if i can dig up the LCD/pushbutton controller example. There is also a tremendous number of different PICs to choose from, so carefully select your device for the anticipated I/O so you buy the right programmer, etc.

G. Hoffman

Quote from: potul on February 11, 2010, 01:26:17 PM
Thanks for the explanation. So the maximum needed clock freq will be determined by the min delay time we want to be able to handle, right?
Where would the clock freq minimum be? I guess that something below 20 Khz could start to be adding audible noise (??)

If I'm reading this right, I would say yes.  But the other thing I'm noticing is that as your clock rate goes down and your delay length goes up, your "sample rate" is going down, which should mean problems with your frequency response before you start getting into audible noise.  Of course, many people are rather fond of the high end roll off from BBD delays, so that's probably OK, but I hadn't noticed before that BBDs are going to get darker as the clock slows down and the delay gets longer.  At least, if I'm reading this right.



Gabriel

Mark Hammer

MXR implemented a brilliant approach in their old green analog delay by using switched-resistor filters that were driven from the same clock, such that the filter corner frequency tracked the clock rate and delay time, extracting maximum usable bandwidth at any given delay.  Some other delay units of that era would pick a lowpass point that would be effective at the longest delays (where risk of audible clock whine was greatest), and sacrifice bandwidth at short delays.  While it may well have been a result of misadjustment (BBD output balance, possibly), I seem to recall trying out some units way back when that aimed for more bandwidth to make the short delays as bright as possible, and let the user deal with the whine at longer delays by turning the treble down on the amp or EQ pedal.