Input Stage for Digital Guitar Pedal

Started by WilliamR, December 08, 2018, 04:55:47 PM


WilliamR

Hi,
I had a spare microprocessor lying around and I wanted to use it for a guitar pedal. I've now roughly designed the input circuit for the ADC. However, since I don't have much experience designing circuits, I wanted to ask for some feedback. (This is really my first project besides soldering a few cables and playing around with 555 ICs on a breadboard.)

All this circuit has to do is:
- buffer the guitar input signal with a high input impedance, 1M or something
- provide a gain / drive stage to adjust the input level and optionally saturate it
- reduce the voltage and buffer it; the ADC has a low input impedance and is limited to 1.8Vpp



A few notes on the circuit:
I only have a 5V supply that I get from the microprocessor, no 9V. I also don't have a negative supply voltage so I need to bias the signal and supply the OpAmps with 5V. Also, I'm not really used to transistors, which is why my circuit is loaded with OpAmps.

So, here was my idea behind the different "stages". I would really appreciate any kind of feedback on issues with the circuit.
The input buffer simply has to buffer the input to create a high input impedance. It also has to offset the voltage, since I only have 0V to 5V to power the OpAmps with. I'm not sure if the 1M resistors at this stage are necessary. My thought was that with lower values, the output resistance of the guitar would affect the gain of the OpAmp stage and it wouldn't do what I want it to do (which is basically just Vout = Vin + Vref). Is that correct?

The second stage is there to boost and limit (saturate) the input level. The issue here was to not also boost the 2.5V voltage reference. The OpAmp spits out something like Vout = g * (Vin - 2.5V) + 2.5V, where g is 100k/22k. Alternatively, I could put a capacitor in series with the input resistor to block the voltage reference.
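To put rough numbers on it: with g = 100k/22k ≈ 4.5, a guitar signal of, say, 200mVpp (just a guess at a typical level) would come out around 0.9Vpp, still centred on the 2.5V reference.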

Finally, the last OpAmp is there to simply lower the level from 0..5V down to 0..1.6V, which is a level the ADC can handle.


So, I'm sure that this is complete overkill for what I want to do. Are there any shortcuts I could take to lower the number of OpAmps in action? I'm sure I'll need at least one OpAmp in my DAC output circuit as well. Do you have some general recommendations for me?

Thanks in Advance!

amptramp

You will need a capacitor at the input to isolate the first amplifier stage from the variable resistance of the level controls on the guitar, which would otherwise add to the resistance in series with the 1 megohm input resistor. The added capacitor would require a gain of 1, so the 10K resistors in the first stage would be replaced by a direct connection from the op amp output to the inverting input. You would already be biased at Vref. Your ADC driver is sitting at 797 mV at the output, so the swing is within reason.

ElectricDruid

There are several things I'd change.

Starting at the front, you need the capacitor Amptramp mentioned, and once you've got that, people often include a resistor to ground to bleed any voltage off the cap and prevent pops when switching. A series resistor to limit current in case someone feeds in something unexpected is also a good idea, but it doesn't need to be anything like as high as 1M. I use something like this:

https://electricdruid.net/wp-content/uploads/2018/12/StompboxElements.jpg

(Top left for op-amp input buffer)

My buffer is unity-gain, but you've got 10K/10K giving you a gain of x2 on your input buffer, which is fine.

The gain drive stage doesn't need either the series 22K or the 100K to Vref on the + input. You've already established your bias level in your input buffer, so you don't need to add it in again along with the signal. The 22K/100K network on the negative input does go to Vref as you've shown.
Removing the +ve input resistors also allows you to use a single-gang pot, which helps.

Assuming you're using a rail-to-rail op-amp (and you probably should at 5V, or you'll have very little headroom), you've got roughly 5V coming out after the gain stage. Your 47K/22K divider drops that down to 1.59V, which is nice and safe for your ADC.
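(For reference, that's 5V x 22k / (47k + 22k) ≈ 1.59V.)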

HTH,
Tom

Edit: calculation error




WilliamR

Hey,

thanks for your replies.

Regarding the input: took me a while to understand the circuit, but it looks good. In my case, the 10k resistors for the gain of 2 were actually necessary because the input network sums Vin and Vref, each weighted by 1M/(2*1M), i.e. 1/2. But with your circuit, that's no longer necessary.

Gain stage: that's neat, removes half of the components but does exactly the same thing.

I'll post an updated schematic later today.

Thanks!

Sooner Boomer

One other thing that *may* be necessary (and maybe not...) is tone shaping. A high-pass filter at about 10 - 20Hz to keep out subsonics, followed by a low-pass filter set below half the sample rate of your ADC. The low-pass acts as your anti-aliasing filter.
Dan of  ̶9̶  only 5 Toes
I'm not getting older, I'm getting "vintage"

ElectricDruid

For the input highpass, you could use the input buffer I posted with 1M resistors instead of 2M2s, and then reduce the capacitor to 22nF. That gives a cutoff around 15Hz.
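(Quick check, assuming the two 1M bias resistors end up in parallel as far as AC is concerned: the cap sees roughly 500k, and 1/(2*pi*500k*22n) ≈ 14.5Hz.)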

For the lowpass, you could put a cap across the 100k pot. The cutoff frequency will get lower as the gain goes up, but with clipping that's usually a good thing since it helps trim the harshness a bit.
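As a rough illustration (the cap value here is only an assumed example): 1n across the full 100K gives 1/(2*pi*100k*1n) ≈ 1.6kHz at maximum gain, and the corner moves up as you turn the pot down.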

Both of these options only give a single-pole roll-off (-6dB/oct), but they don't require any significant changes to the circuit, so why not? If you want more serious filters, you'll have to redesign.

It might be a good idea to add a resistor in series with the 100K pot to set the minimum gain too, even if it's just a 1K or 4K7 or something (e.g. gain only marginally more than one).
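(Against your 22k to Vref, a 1K gives a minimum gain of 1 + 1k/22k ≈ 1.05, and a 4K7 about 1.2.)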


anotherjim

It depends on what you want to do. It might be possible to simplify it a lot.
If your aim is to use the frequency of the guitar signal in your MCU program and you don't intend to sample the waveform as faithfully as possible, then all it needs is a little gain and a comparator. One dual op-amp can do that. Actually, your MCU might already have an analogue comparator input, in which case you don't need the ADC at all.

For decent sampling, you need to specify the voltage range of the ADC and build the input circuit to match it. For op-amps, you need to know how much voltage swing they can manage, because they can't all reach the supply voltage in or out. With only a 5V supply you'd really want a CMOS rail-to-rail type suitable for 5V operation.

In this project, I managed the input with only one op-amp. It has high input impedance, gain and filtering squeezed into one element. The op-amp is powered from 9V, but it could have been done with a 5V supply and a different rail-to-rail op-amp type (the LM358 used can swing down to 0V, but not up to the full 9V). The ADC range (AREF pin on the MCU) has been set a diode drop below 5V by D3 (about 4.3V), and Vref for the op-amp is divided by resistors to half of that. D4 was included for safety so the ADC input can't be driven over 5V; it wouldn't be needed if the op-amp ran off 5V.



WilliamR

The microprocessor board already does some high-pass filtering with a cutoff at 20Hz, and proper anti-aliasing as well. Any other filtering is probably going to happen in the digital domain, I'm more comfortable with that anyway. :)  But the idea to lowpass the drive stage might be interesting, especially since it would remove some high frequencies once the stage starts saturating.

Regarding the minimum gain: I've just noticed that this circuit now doesn't go below 0dB. The minimum is (1 + Rf/R1), with Rf being the 100k pot and R1 the current 22k resistor. Probably OK, because I don't think I need to go lower with the level. What do you think? If I added a capacitor in parallel with the pot, that would create a low shelf; the minimum gain would still be stuck at 0dB, so it's not a true lowpass.

Ah, I've updated the gain stage to give up to 24dB of boost, but I assume 12dB will probably be sufficient.
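(If I've done the maths right: with the 100k pot over the 6k8 that's now there, the maximum gain is 1 + 100k/6.8k ≈ 15.7, or just under 24dB.)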

Here is an updated version of the circuit



@anotherjim

I want to implement digital effects with the board, so the audio has to be properly converted to digital. The ADC is 16 bit, 44k or 88k, and the input range is 1.8Vpp, so -0.9V to +0.9V. That's why I drop the 5V down to ~1.66V. And yeah, with 5V of headroom, I probably have to find some decent OpAmps. They should swing very close to the supply rails and ideally have nice saturation characteristics (I don't want them to start hard clipping as soon as they reach the rail), and they need to work with only a 5V supply... Any suggestions? :D (I've noted down the LM358, might be an option.)

Regarding the power: if necessary, I could turn this whole thing around, use a common 9V supply and then derive the 5V the microprocessor needs from that. That thing wants up to 2A though, so I'd probably have to include a hefty voltage regulator / converter. And I have no experience with this stuff at all.

ElectricDruid

That input buffer isn't quite right. The input goes to the +ve input, not the -ve.

You could combine the buffer and the gain stage too. Just use your 100K/6K8 resistors on the input amp instead. Saves an op-amp.

Finally, what on earth type of microprocessor is this that draws 2A? Mostly these days the current is way, way down in the mA or uA.

ElectricDruid

One other thought - rather than depend on the op-amp having good clipping behaviour, why not add a pair of clipping diodes? That might give you more flexibility about the level too.

WilliamR

Input stage: yeah, that was an oversight. It needs to go to V+. Good point on combining the two stages as well.
About the diodes: do you have an example of how to do that? Sounds like a good idea.

The board is a Bela Board, based on a BeagleBone Black. I think the official consumption of the board itself is ~500mA, but with the cape and a bunch of stuff going on, I've seen far higher numbers on the web.