Ridiculously long delay time from short delay chip?

Started by varialbender, January 04, 2006, 11:17:15 PM


varialbender

I was reading the specs for a few delay chips, and noticed that the clock frequency on the pt2399 doesn't have a minimum, whereas the analog ones I checked out, like the MN3007, have a minimum clock frequency. I'm hoping to get the pt2399 to give me some ridiculously long delay times, like 10 seconds maybe, and I understand that the quality will suffer greatly, and that's part of my intent.
Can I do this with a larger resistor at pin6? There's mention that it wants 10-50k, but it's hard to tell how big an issue this is. Will this damage the chip? Will it work?
Are there other ways to do this?
Other chips that allow this?

Thanks a bunch

RJ

Hi varialbender,

The delay time for the PT2399 chip is determined by the resistance to ground from pin 6....I've messed around with it on the breadboard a lot and found that a 100k resistor is good for about a second of grainy delay, and about 300k made for very noisy ~3 second repeats.
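For rough planning, the data points in this thread (100k ≈ 1 s, 300k ≈ 3 s) imply a slope of roughly 10 ms of delay per kΩ on pin 6. A quick sketch of that extrapolation (Python; the slope comes from these posts, not from the datasheet, and real parts will vary):

```python
def pt2399_delay_estimate(r_ohms):
    """Very rough PT2399 delay estimate from the pin-6 resistance.

    Extrapolated from this thread's observations (~100k -> ~1 s,
    ~300k -> ~3 s). Quality degrades badly well before these values.
    """
    MS_PER_KOHM = 10.0  # slope implied by the posts above (assumption)
    return (r_ohms / 1_000) * MS_PER_KOHM / 1_000  # seconds

print(pt2399_delay_estimate(100_000))  # ~1 s of grainy delay
print(pt2399_delay_estimate(300_000))  # ~3 s, very noisy
```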

Depending on how you set up the filter sections of the chip you can get some cool very lo-fi long delays....or crazy out of control noisy delays.... all sounds cool in my opinion! 

-Ryan

soggybag

I wonder if it would be possible to chain two chips together?

Peter Snowberg

That chip likely uses DRAM for a delay buffer. If you don't perform a "refresh" operation every so often on DRAM, it will literally forget its contents. Lowering the clock that far will almost surely push the timing past the point where refresh happens fast enough, and the DRAM will start to lose data. That might make for some really cool noise.  :icon_twisted:
Eschew paradigm obfuscation

varialbender

Cool.
No chance of that damaging the chip?
Also, is there any way to increase the refresh rate?

Is this type of stuff doable with, say, PIC chips?

Peter Snowberg

>> No chance of that damaging the chip?
Nope. None. :)

>> Also, is there anyway to increase the refresh rate?
Nope. The clock is internally divided, so the refresh rate is fixed to the sample rate/delay time.

>> Is this type of stuff doable with, say, PIC chips?
Nope. The problem is that these are analog signals and PICs live in the world of digital. Once you add A/D and D/A, a PIC is no longer the right tool for the job. That's partially why chips like the PT series are still available.
Eschew paradigm obfuscation

Hal

Why? Are A/D operations too slow? I was thinking of trying to put together something like a crude PIC delay chip, basically an 8-bit A/D and D/A going as fast as they can, just rolling the memory forward and pushing out the last sample....
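The "rolling the memory forward" idea is a circular (ring) buffer: write each new sample over the oldest one, and read out the sample that is about to be overwritten. A minimal sketch of just that buffer logic in Python (no converters, just the memory handling a PIC would do):

```python
class DelayLine:
    """Fixed-length circular buffer: the software equivalent of a BBD."""

    def __init__(self, length):
        self.buf = [0] * length  # delay memory (SRAM in the hardware version)
        self.pos = 0             # current read/write index

    def process(self, sample):
        out = self.buf[self.pos]        # oldest sample ("push out the last one")
        self.buf[self.pos] = sample     # overwrite it with the newest
        self.pos = (self.pos + 1) % len(self.buf)  # roll the memory forward
        return out

# A 4-sample buffer delays the input by exactly 4 samples:
dl = DelayLine(4)
print([dl.process(s) for s in [1, 2, 3, 4, 5, 6]])  # -> [0, 0, 0, 0, 1, 2]
```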

Peter Snowberg

Bit banging on a PIC is a little too slow to drive a 16 bit serial sigma-delta CODEC, so if that's what you want, a PIC is not the right choice.

If you use a parallel D/A and A/D along with parallel memory, all being 8 bits wide, then yes a PIC will work just fine. All you need are the PIC, an 8 bit A/D, 64K of SRAM, a couple of 74HCT574 latches, and a pile of 1 and 2K resistors to get 1.3 seconds at 48K or 2.6 seconds at 24K. You'll also want some kind of high frequency variable clock or a way of adjusting the speed in code with some pushbuttons.

You can also use DRAM with some trickier programming. Old 30 pin SIMMs are easy to work with as they're a dime a dozen. 256Kx8 at 48K is 5.3 seconds and 1M gives you 21.3 seconds. ;)

With limited bit width, faster is much better.

There's no reason you can't expand the above system to 12 or 16 bits with the same PIC code. ;)

Eschew paradigm obfuscation

bioroids

Quote from: soggybag on January 04, 2006, 11:41:46 PM
I wonder if it would be possible to chain two chips together?

Yes, it is. Just feed the output of one into the input of the other. Of course you'll be A/D'ing, D/A'ing, and filtering twice.

Luck

Miguel
Eramos tan pobres! ("We were so poor!")

gez

Quote from: Hal on January 05, 2006, 01:50:38 AM
I was thinking of trying to put together something like a crude PIC delay chip, basically an 8-bit D/A, going as fast as it can, and just rolling the memory forward, and pushing out the last one....

Yeah, it's doable, but it will be lo-fi.  Everyday Practical Electronics had a 'Dalek imitator' (or something like that) in one of its recent issues.  Part of the effect used a PIC-based 'echo' using a recent generation chip.  Really simple circuit and really short code.  I'll see if I can find it and outline the delay part (though I won't post a schemo), and you can download the code from the EPE website.  Give me a day or so.

EPE also did a chorus/flange/delay effect using a PIC and external memory.  All these things are 8 bit though, so a little lo-fi.
"They always say there's nothing new under the sun.  I think that that's a big copout..."  Wayne Shorter

gez

OK, the article was called the 'Cybervox'.  It uses a PIC18F252, which has an on board A-to-D converter.  This is fed from a buffered and filtered input stage (using a dual op-amp).  The PIC reads out after 1536 samples and the sampling rate is adjusted to control length of delay, so similar to a BBD in this respect.  A TLC7524 converts D-to-A and this is further buffered by an op-amp follower that can swing rail-to-rail.
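With a fixed 1536-sample buffer, the delay time is simply 1536 divided by the sample rate, so the delay is tuned by re-clocking, exactly like a BBD. A quick sketch (the sample rates below are illustrative, not figures from the article):

```python
BUFFER_SAMPLES = 1536  # fixed read-out length, per the Cybervox article

def delay_ms(sample_rate_hz):
    """Delay of a fixed-length buffer at a given sample rate, in ms."""
    return 1000 * BUFFER_SAMPLES / sample_rate_hz

print(delay_ms(16_000))  # 96 ms at 16 kHz
print(delay_ms(4_000))   # 384 ms at 4 kHz -- longer delay, lower bandwidth
```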

The article actually compares the result to a BBD based delay and states that it's lo-fi...but we've been using BBDs for years and are used to the quirks!  :icon_lol:
"They always say there's nothing new under the sun.  I think that that's a big copout..."  Wayne Shorter

Mark Hammer

One of the enduring topics of interest in psychology is how we perceive continuity despite the fact that neurons and sensory organs can't really transmit information on a continuous basis.  Visual, auditory, taste, and touch cells fire, and have to "recharge" to be able to fire again.  We experience life as a kind of movie made up of a high-speed presentation of samples that we impose continuity on.  Theatre movies and scrolling LED marquees are good examples of that, and there is even a tactile phenomenon/illusion called a "rabbit" in which successive touches to the skin in a linear directional fashion *feel* like something being dragged across the skin continuously.  Of course, what is found across all senses is that there exists a kind of critical inter-stimulus interval past which the perception of continuity simply breaks down, and we perceive the stimulus not as one continuous thing but as a bunch of things one after another.

One of the intrinsic problems with either analog OR digital delay is that they both use a form of sampling.  This puts them in the same category as movie films.  As long as the sample rate is kept up fast enough, you can make a film seem to slow down, but after a certain point, the gaps between successive frames (visual samples) become noticeable and the sense of movement tends to disappear.

When it comes to audio samples, there is a need to have that inter-sample gap be brief enough that some sort of mental continuity can be maintained, and a sound can be heard as ONE sound, rather than a bunch of stitched-together snippets.  This is one of the problems, of course, with trying to run analog BBDs too fast.  You would think, on the surface, that a faster sampling rate would be "better", right?  Unfortunately, the input capacitance of the BBD's clock pin rounds off the clock pulse it receives, so it takes a little longer for that pulse to reach the critical threshold required to move to the next sample.  It becomes a bit like running the projector faster, but sticking a blank frame between every real frame.  Yes, the sample rate is faster, but the gap between samples interferes with the perception of continuity.  The "seamlessness" of samples is often as important as the sample rate.

Going in the complete other direction, clocking a BBD too slowly lets the tiny caps that store the charge representing the absolute level at each sample drain.  They're like oil paintings, painted rapidly, out in the sun.  When they're fresh, they're accurate, but leave them too long and they start to not look like they did originally.

In the case of digital delay, where control over refresh is not separated from sample rate, slowing down the sample rate to achieve longer delays introduces gaps between samples that you wouldn't detect AS gaps, but which would introduce a sensation of very poor audio quality as well as aliasing.  Certainly you can use steep and powerful lowpass filtering to try and overcome that, but at a certain point you are no longer listening to what you might describe as a recognizable replica of the original.
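The aliasing mentioned above is the usual Nyquist story: any input component above half the sample rate folds back down into the audio band as a spurious tone. A quick sketch of where a tone lands (this is the standard folding formula, nothing PT2399-specific):

```python
def alias_frequency(f_in, sample_rate):
    """Apparent frequency of a sampled tone, folded about Nyquist."""
    f = f_in % sample_rate          # sampling can't tell f from f + fs
    return min(f, sample_rate - f)  # fold into the 0..fs/2 band

# Drop the sample rate to 8 kHz for a longer delay, and a 5 kHz
# component in the input reappears as a spurious 3 kHz tone:
print(alias_frequency(5_000, 8_000))  # -> 3000
```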

The notes for the PT2399 and 2396 do not indicate what sort of RAM they use, but the PT2395, which uses external RAM for longer delays, uses DRAM, so I suspect that the internal RAM of the other chips is likely DRAM as well.  Check the notes for the PT2395.  I don't know enough about such things to determine whether the refresh could be controlled independently of the sample rate and delay time, so as to preserve the contents of the buffer.