BundyRoy
Dedicated elf
I have worked out my layout and would like to double-check that I have a handle on voltage drop.
I plan on having a matrix of ten 3m strips of 12V ink1003 pixels at 30 LEDs/m. The matrix will be 9m one way from my controller, so say 10m one way from the PSU.
I wasn't sure what current to use. The specs say 7.2W/m, so based on P = VI, 7.2 = 12 × I, which gives I = 0.6A. First question:
Is that 0.6A per metre, 0.6A total, or completely wrong?
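To make my working easier to check, here it is as a quick Python sketch showing both possible readings (the variable names are just mine):

```python
# Current draw from the 7.2 W/m spec at 12 V, read both ways.
watts_per_metre = 7.2
volts = 12.0

amps_per_metre = watts_per_metre / volts   # 0.6 A, if the spec is per metre
amps_per_strip = amps_per_metre * 3        # 1.8 A for a 3 m strip, if it scales

print(f"Per metre:    {amps_per_metre:.1f} A")
print(f"Per 3m strip: {amps_per_strip:.1f} A")
```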
Now say I want to check the voltage drop with 14/0.20 cable. From the bible it has a resistance of 0.043 ohms/m. Second question:
Is the resistance calculated for the cable length alone (20m in this case), or for the cable length plus the light strip length?
I will assume it is just the 20m of cable so I can continue my working: R = 20 × 0.043 = 0.86 ohms.
Then V = IR = 0.6 × 0.86 = 0.52V.
So, assuming my assumptions are correct (not that likely), I have a voltage drop of 0.52V. Question 3:
What voltage drop is acceptable?
I seem to remember something about a tolerance of 5%, which equates to 12 × 0.05 = 0.6V. By my numbers I am getting close to that and should probably look at wire with more cross-sectional area.
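Here is that drop calculation the same way, under my assumptions above (only the 20m of cable counts, and the load really is 0.6A):

```python
# Voltage drop over 14/0.20 cable at 0.043 ohm/m (from the bible),
# counting only the 20 m of cable, not the strip itself.
ohms_per_metre = 0.043
cable_metres = 20            # 10 m run, counted out and back
load_amps = 0.6              # the figure from question 1, if it is a total

resistance = cable_metres * ohms_per_metre   # 0.86 ohm
drop = load_amps * resistance                # ~0.52 V

allowance = 12.0 * 0.05                      # 5% of 12 V = 0.6 V
print(f"Drop {drop:.2f} V against an allowance of {allowance:.2f} V")
```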
Question 4:
If I have a cable with enough cores that I could double up the wires on the +ve side but only run a single wire on the -ve, does that help? Or should I stop mucking around and just get bigger wire? My rough thinking is sketched below.
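This sketch assumes two identical cores in parallel simply halve the resistance on that side, so only half the loop improves:

```python
# Doubled +ve core, single -ve core: only half the loop improves.
ohms_per_metre = 0.043
run_metres = 10
load_amps = 0.6

r_one_way = run_metres * ohms_per_metre      # one core, one way: 0.43 ohm
loop_single = 2 * r_one_way                  # single +ve and -ve: 0.86 ohm
loop_doubled = r_one_way / 2 + r_one_way     # doubled +ve only: 0.645 ohm

print(f"Single cores: {load_amps * loop_single:.2f} V drop")
print(f"Doubled +ve:  {load_amps * loop_doubled:.2f} V drop")
```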
I haven't ordered any wire yet, hence checking calculations and options now.
Other relevant info that may be required:
30m × 30 LEDs/m = 900 pixels = 2700 channels.
Using a Pixlite 16 controller, which handles 340 pixels per output.
Each strip is 3m × 30 pixels/m = 90 pixels.
I need to use a minimum of 4 outputs (2 lots of 270 pixels and 2 lots of 180 pixels).
I assume the power requirements are okay for one PSU: 30m × 7.2W/m = 216W, which is 216/350 = 62% load on a 350W supply.
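The same totals as a sketch, so the arithmetic is easy to check:

```python
# Totals for the matrix and PSU loading against a 350 W supply.
strips, metres_per_strip, pixels_per_metre = 10, 3, 30
watts_per_metre, psu_watts = 7.2, 350.0

pixels = strips * metres_per_strip * pixels_per_metre      # 900
channels = pixels * 3                                      # 2700 (RGB)
total_watts = strips * metres_per_strip * watts_per_metre  # 216 W

print(f"{pixels} pixels, {channels} channels")
print(f"{total_watts:.0f} W = {100 * total_watts / psu_watts:.0f}% PSU load")
```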
Sorry for the long post and multiple questions, but it is hard (at least for me) to look at each issue in isolation.
Thanks
Roy