David-
This is my take on the question you raise...
First, a minor correction: The specific diagram I think you are referring to does not show the diodes in parallel, but rather, in series.
The voltage drop across each LED depends upon the device. If, for the sake of argument, each LED had a 2 volt drop across it, the total drop across a chain of three LEDs would be (3 x 2.0) = 6 volts. If we assume a supply of 12 volts, and a desire to limit the current to 20 mA, this means we need a series resistor in each chain (of three LEDs) on the order of ((12.0 - 6.0) / 0.020) = 300 ohms. So, as I see it, you are correct about the 330 ohm resistors.
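The arithmetic above can be sketched out as a quick calculation (the 12 V supply, 2.0 V drop, and 20 mA target are the same assumed figures as in the paragraph):

```python
# Series resistor for one chain of LEDs, using the assumed figures above:
# 12 V supply, three LEDs at 2.0 V forward drop each, 20 mA target current.
supply_v = 12.0
led_drop_v = 2.0
leds_per_chain = 3
target_current_a = 0.020

# Ohm's law applied to the voltage left over after the LED drops:
resistor_ohms = (supply_v - leds_per_chain * led_drop_v) / target_current_a
print(resistor_ohms)  # 300.0 -- 330 R is the nearest common standard value above this
```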
On the other hand, you aren't simply connecting the array to a static 12 volt supply. Presumably, it will be driven by the LED driver circuit, also on the web page. This changes the game a little bit.
Note that because there are resistors between the source of the MOSFET and ground, the voltage appearing at the source terminal (of the MOSFET) will be proportional to the current being fed through the LED array. By sampling this voltage and feeding it back to the op-amp, the circuit essentially meters LED current as a function of video voltage. Since the maximum video voltage is defined, you can limit max LED current by your choice of feedback resistors.
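To make the current-metering idea concrete, here is a rough sketch. Note that the resistor and voltage values below are hypothetical, chosen only for illustration; they are not taken from the actual driver schematic:

```python
# Sketch of the source-resistor current-sense idea.
# All component values here are hypothetical, for illustration only.
r_sense_ohms = 10.0    # assumed resistance between MOSFET source and ground
v_video_max = 1.0      # assumed maximum video voltage presented to the op-amp
feedback_gain = 1.0    # assumed scaling set by the feedback resistors

# The op-amp drives the MOSFET gate until the source voltage matches the
# scaled video voltage, so the LED current is set by Ohm's law across the
# sense resistor:
i_led_max_a = (v_video_max * feedback_gain) / r_sense_ohms
print(i_led_max_a * 1000)  # 100.0 mA with these hypothetical values
```

The point is simply that the maximum LED current falls out of the ratio of the (scaled) video voltage to the sense resistance, independent of the 330 R resistors in the chains.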
Stated more simply, it appears that the max LED array current is being scaled and limited by parts of the circuitry other than just the series resistors in the LED array itself.
Pete
AC7ZL
Anonymous wrote:Hi,
I'm a complete novice and I'm trying to build my first mechanical television. I'm using a Peter Smith design (from the NBTVA website
http://www.nbtv.wyenet.co.uk/beginners.htm )
However, I'm confused as to the choice of resistors used in the LED arrays. The circuit shows 2 rows of 3 LEDs in parallel, each row with a 22R resistor in it. If the supply voltage is 12V and the LEDs are 2V / 20mA, I work out that the 2 resistors need to be 330R. Am I right or am I missing something?
Many Thanks
David