
I have seen and been asked this question many times, so I thought I'd show why power injection is required for low-voltage applications and why a low-voltage strip can't run the same length as a rope light.

It comes down to Ohm's law: the resistance of the cable, the voltage, and the current.

A good example is the difference between a 240 V rope light and a 12 V DC RGB strip: a 240 V rope light can run much further than a 12 V DC strip.

With a few calculations we can show the difference the voltage makes to the current for the same rated wattage.

In this example we will use a 100 watt load.

So the current required would be:

Current = power(watts) / Voltage

Current = 100 watts / 240 volts

= 0.42 amps @ 240 volts

Now for the same wattage but running at 12 V DC:

current = 100 watts / 12 volts

= 8.3 amps @ 12 volts
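The arithmetic above can be sketched in a few lines of plain Python, using the same 100 W load:

```python
# Current drawn by the same load at two supply voltages (I = P / V).
def current_amps(power_watts: float, voltage: float) -> float:
    return power_watts / voltage

print(round(current_amps(100, 240), 2))  # amps at 240 V
print(round(current_amps(100, 12), 1))   # amps at 12 V DC
```

Same wattage, twenty times the current at 12 V, which is where the trouble starts.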

The reason strips run at only 12 V DC or 5 V DC is that each LED has a rated forward voltage it runs at, so the LEDs are connected in series.

Example: an LED may have a forward voltage rating of 3 volts. If you connect three of these LEDs in series they drop 9 volts, and the remaining 3 volts must be dropped across a resistor. This is what gives strips a much smaller cuttable section than rope lights. If a strip were rated at 240 volts, or a 240 V rope light used a similarly rated LED, it would need around 70+ LEDs connected in series to make an efficient LED circuit.
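The series-string sums can be sketched as follows. The 20 mA LED current is an assumed typical figure for illustration, not something stated above:

```python
# Sizing the series resistor for one cuttable segment of a 12 V strip:
# three 3 V LEDs in series drop 9 V, leaving 3 V across the resistor.
supply_v = 12.0
led_forward_v = 3.0
leds_in_series = 3
led_current_a = 0.020  # assumed typical 20 mA per channel

resistor_drop_v = supply_v - leds_in_series * led_forward_v  # volts left over
resistor_ohms = resistor_drop_v / led_current_a              # Ohm's law: R = V / I
print(resistor_drop_v, resistor_ohms)
```

The same maths with a 240 V supply and 3 V LEDs gives the 70–80 LED series string mentioned above, which is why mains rope light has such long cuttable sections.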

The next factor is the resistance of the cable, which affects the voltage, and this too can be shown using Ohm's law.

Voltage = Current (amps) x resistance (ohms)

Different cable sizes have different resistance values; generally, the larger the cable, the lower the resistance.

20 gauge (approx 0.5 mm2) wire has a resistance of roughly 1.28 ohms per 100 feet

16 gauge (approx 1.3 mm2) wire has a resistance of roughly 0.40 ohms per 100 feet

So the voltage drop for a 5 amp load over 100 feet would look like this:

20 gauge (0.5 mm2): Voltage = 5 amps x 1.28 ohms = 6.4 volts dropped over 100 feet

16 gauge (1.3 mm2): Voltage = 5 amps x 0.40 ohms = 2 volts dropped over 100 feet
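The same drop calculation, parameterised over wire resistance and run length (V = I x R):

```python
# Voltage dropped along a cable for a given load current (V = I * R).
def drop_volts(current_a: float, ohms_per_100ft: float, feet: float) -> float:
    return current_a * ohms_per_100ft * (feet / 100)

print(round(drop_volts(5, 1.28, 100), 2))  # 20 gauge over 100 ft
print(round(drop_volts(5, 0.40, 100), 2))  # 16 gauge over 100 ft
```

Note this treats the run as a single 100 ft resistance, as the figures above do; the return conductor adds its own drop on top in a real two-wire feed.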

The key point is that a given voltage drop is a much larger percentage of a low supply voltage than of a high one: 6.4 volts is under 3% of 240 volts but more than half of 12 volts. This is why cable choice is so important with low-voltage lighting, and why power injection is used to reach those extra distances.
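Putting the two supply voltages side by side makes the percentage difference obvious:

```python
# The same 6.4 V drop (20 gauge, 5 A, 100 ft) as a fraction of each supply.
drop_v = 6.4
for supply_v in (240, 12):
    pct = drop_v / supply_v * 100
    print(f"{supply_v} V supply: {pct:.1f}% lost in the cable")
```

At 12 V that loss shows up as visibly dim, colour-shifted LEDs at the far end of the strip, which is exactly what injecting power at multiple points avoids.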

I hope that makes it a bit easier to understand why it is so.
