Expression pedal variable LED driver

Started by egasimus, September 28, 2011, 03:18:22 PM

Previous topic - Next topic

egasimus

I'm trying to replace a potentiometer in a circuit with an LED/LDR combo driven by any expression pedal. Has anyone tried this? I'm thinking of a transistor to control the current through the LED, plus two trimmers to set the minimum and maximum current, but I absolutely fail at understanding transistors. Can anyone help?

Gurner

#1
This task screams out for an 8-pin PIC. Connect the expression pedal pot's top and bottom lugs to two spare PIC I/O pins (one feeds 0V, the other feeds the PIC's supply to the lugs), feed the pot's wiper to an ADC pin, and map the ADC sample to a PWM duty cycle (e.g. via a lookup table, which also lets you linearise an otherwise non-linear LED response). Feed the PWM stream through a current-limiting resistor to the LDR's LED. Job done. About an hour's coding work from scratch.
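Gurner's PIC approach can be sketched in plain C. This is a hedged sketch, not tested firmware: the 10-bit ADC, 8-bit PWM, and the lookup-table values are all assumptions, and the chip-specific ADC/PWM register setup is omitted.

```c
#include <assert.h>
#include <stdint.h>

/* Map a 10-bit ADC reading (0..1023) from the expression pedal's wiper
 * to an 8-bit PWM duty cycle (0..255). A lookup table indexed by the
 * top bits of the sample is where you linearise the LED/LDR response;
 * the 17-entry curve below is a made-up placeholder, not measured data. */
static const uint8_t duty_curve[17] = {
    0, 4, 10, 18, 28, 40, 55, 72, 92,
    114, 138, 163, 188, 211, 230, 245, 255
};

uint8_t adc_to_duty(uint16_t adc)       /* adc: 0..1023 */
{
    uint16_t idx  = adc >> 6;           /* coarse table index, 0..15 */
    uint16_t frac = adc & 0x3F;         /* position between entries, 0..63 */
    uint16_t lo = duty_curve[idx];
    uint16_t hi = duty_curve[idx + 1];
    /* linear interpolation between adjacent table entries */
    return (uint8_t)(lo + ((hi - lo) * frac) / 64);
}

/* In the firmware main loop you would then do something like:
 *   duty = adc_to_duty(read_adc());
 * and write duty to the PWM duty-cycle register. */
```

The main loop just samples, maps, and writes the duty register; all the tuning lives in the table.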

If you wanna go old-school analogue, rig your expression pedal pot's wiper up to a bog-standard voltage follower. Arrange it (with a resistor divider chain) so that the lugs of the expression pot sit at the LED's voltage extremities; for example, for a red LED...

2.1V --- pot top lug

pot wiper ---------------> voltage follower ---------> series resistor ------ LED

1.7V --- pot bottom lug
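The series resistor sets the LED current for a given follower output. A quick sanity-check calculation; the 1.7V forward voltage and 100R resistor are illustrative values, and the real Vf shifts a little with current, so treat the result as a ballpark figure.

```c
#include <assert.h>

/* LED current through the series resistor, in amps:
 * i = (v_out - v_f) / r, where v_out is the follower's output voltage,
 * v_f the LED's forward voltage at that current, and r the resistor.
 * Below v_f the LED doesn't conduct, so the current is clamped to zero. */
double led_current(double v_out, double v_f, double r)
{
    double i = (v_out - v_f) / r;
    return i > 0.0 ? i : 0.0;
}
```

With the wiper at the 2.1V lug, a red LED around Vf = 1.7V, and a 100R resistor, that works out to roughly 4mA.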

The Tone God


egasimus

#3
The Rock'n'Control looks exactly like what I need. I know I had seen this somewhere.

Connecting the pedal as a variable resistance to ground, though, makes the circuit dependent on the expression pedal pot value. I'd prefer to use all 3 connections, and take a voltage from the wiper instead.

However, some expression pedals don't use the pot's entire travel, and LDRs generally have a very large dark resistance. So the circuit should include one trimmer to set how much the LED is lit at the minimum position (pedal heel down), so that the LDR has the required maximum resistance (500k in my case), and another trimmer for the gain of the voltage buffer, so that at the maximum position (pedal toe down) the LDR has the required minimum resistance (in my case, as close to 0 ohms as possible).
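The two-trimmer idea (a floor for heel-down, a gain for toe-down) amounts to an affine map of the wiper voltage. A sketch of the intended transfer function; the function name, argument names, and the clamp-to-supply behaviour are my assumptions, not anything from the thread.

```c
#include <assert.h>

/* Control voltage sent to the LED driver:
 * v_min  - trimmer-set minimum, so at heel-down the LED is lit just
 *          enough that the LDR stays at or below its required maximum
 *          resistance (500k in the example above);
 * gain   - trimmer-set gain on the wiper voltage, so toe-down drives
 *          the LDR as close to its minimum resistance as possible. */
double led_cv(double v_wiper, double v_min, double gain, double v_supply)
{
    double v = v_min + gain * v_wiper;
    if (v > v_supply) v = v_supply;   /* can't exceed the rail */
    return v;
}
```

Note that because v_min is added after the gain term in this formula, the two settings don't interact; if the floor trimmer instead sits before the gain stage, the floor gets multiplied by the gain and the two trimmers have to be tuned iteratively.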

I could probably design this easily using an opamp, but it looks like overkill to me. I want to understand transistors.
This is what I'm thinking of. However, I'm not sure about the values. What would be a good maximum gain for the opamp? I guess I'll have to find this out on the breadboard. Also, I'm not really sure if that's the best place to put the CV min pot - tuning the circuit would have to be iterative, since the two trimmers interact.

egasimus

Hey, a little bump for this thread. Check out the simulation in my post above. As it is, the voltage "floor" is set using a trimmer at the input, and is thus dependent on the opamp gain. I want to be able to set a fixed "floor" (minimum voltage) using a trimmer. How do I do that?
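One way to see why the floor moves with the gain, and what a fix would look like: with the trimmer at the input, the offset passes through the gain stage and gets multiplied; summing a fixed reference in after the gain stage instead (my suggestion, not something from the thread) keeps the floor put. A numeric comparison of the two arrangements:

```c
#include <assert.h>

/* Heel-down output ("floor") with the wiper at 0V, two arrangements. */

/* Trimmer at the input: the offset is amplified, so the floor
 * changes every time the gain trimmer is touched. */
double floor_trimmer_at_input(double v_offset, double gain)
{
    return gain * v_offset;
}

/* Fixed reference summed in after the gain stage (e.g. via a
 * summing amplifier): the gain no longer affects the floor. */
double floor_summed_after_gain(double v_offset, double gain)
{
    (void)gain;   /* deliberately unused */
    return v_offset;
}
```

Doubling the gain doubles the floor in the first case but leaves it untouched in the second, which is the behaviour asked about above.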

Also, can a TL0x2 drive an LED in this manner, or should I use something like an LM386?

egasimus

Update: an LM386 has a minimum gain of 20, so it isn't suitable for this purpose. Half a TL072 happily drives an LED, though.