So probably a stupid question but

Started by vendettav, August 24, 2012, 09:28:37 AM


vendettav

So I'm building a little lighting for my new table... again  :P

What I have to ask is: I have 3 clear blue ultra bright blah blah LEDs. I have 5V and I am hooking them up in parallel. Now here's the thing. The LED calculator says I need a 100 ohm resistor. The lowest I have right now is 2k, so each LED is hooked up to a 2k resistor. I'm liking the brightness. Now could anybody tell me what's going on? Am I using more current? How do I know how much wattage it consumes? I guess what I'm doing is just making the LEDs dimmer and that's it, but then again I didn't sleep well tonight and haven't had coffee since yesterday, so I'm dying... This could also turn out to be a bullshit post, hence the name of the thread.


Help?  :icon_redface:
check my music HERE

Shredtastic psycho metal!

Gurner

#1
A blue LED will drop approx 3.3V to 3.4V (let's say 3.3V),

therefore for a 5V supply the remaining voltage (5V - 3.3V = 1.7V) must get dropped across your series 2k resistor.

To establish the current running through the LED, divide the 'remaining voltage' (1.7V) by the resistance: 1.7/2000 = 850uA (0.00085A) running through your LED ...that ain't a lot at all (you'd normally want about 20mA to 25mA to get the LED running at max brightness), but nevertheless if you're happy, you're happy.
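The arithmetic above (and the wattage question from the original post) can be sketched in a few lines. This is just Ohm's law on the series resistor, assuming the 5V supply and ~3.3V forward drop quoted in the thread; the helper name is made up for illustration:

```python
def led_current_and_power(v_supply, v_forward, r_ohms):
    """Return (current in A, resistor power in W, LED power in W)."""
    i = (v_supply - v_forward) / r_ohms   # Ohm's law on the 'remaining voltage'
    p_resistor = i * i * r_ohms           # I^2 * R dissipated in the resistor
    p_led = v_forward * i                 # V_f * I dissipated in the LED
    return i, p_resistor, p_led

# With the 2k resistor actually used:
i, p_r, p_led = led_current_and_power(5.0, 3.3, 2000)
print(f"current: {i * 1000:.2f} mA")          # ~0.85 mA, as calculated above
print(f"total per LED: {(p_r + p_led) * 1000:.2f} mW")

# With the 100 ohm value the LED calculator suggested:
i_100, _, _ = led_current_and_power(5.0, 3.3, 100)
print(f"current at 100 ohm: {i_100 * 1000:.0f} mA")   # ~17 mA
```

So the 2k resistor draws far less current (and far less power, a few milliwatts per LED) than the recommended 100 ohm; the only cost is brightness.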

defaced

-Mike

Kesh

LED brightness is not linear with current. In fact, above a certain current, which is quite a bit less than the maximum, it's very hard to discern the difference. There's an experiment with photos somewhere on the internet.

I wouldn't use blue LEDs alone, though; it's much easier to see under light with a broad spectrum.

vendettav

Thanks guys, seems like it'll work for me. Although I'll have to check it out around now, when it's dark outside... and inside :D