Watt loss because of long cable

le0n2009

Well-Known Member
hi guys, do you think it will be a problem if I run a 10 meter long wire from the LED to the driver?
will there be any losses?
 

le0n2009

Well-Known Member
If you're using a constant current driver, no. It will automatically compensate.

If you're using a constant voltage driver, then yes. It's going to be an issue...and a pretty big one.
well yeah, I'm using a constant current driver
so I will not have losses from the driver?
 

shimz

Well-Known Member
The wire loss will be less due to the higher voltage used in CC, but there will still be some loss. The driver loss is a separate issue, but it is about the same for a CC or CV/CC driver of the same output. Please give all the details about the driver, what you are driving, and how hard, and I can throw some numbers at you.
 

TacoMac

Well-Known Member
well yeah, I'm using a constant current driver
so I will not have losses from the driver?
You shouldn't, no. But it can still cause issues over a distance that far, namely heat. If it has to jack up the power too much, it can actually melt the wiring. You would be better served to use a very heavy extension cord (at least 12 gauge) and locate the driver closer to the fixture.
 

TacoMac

Well-Known Member
The wire loss will be less due to the higher voltage used in CC, but there will still be some loss. The driver loss is a separate issue, but it is about the same for a CC or CV/CC driver of the same output. Please give all the details about the driver, what you are driving, and how hard, and I can throw some numbers at you.
He already has. Read. He's using a constant current driver. It automatically adjusts its output voltage to compensate for any distance. If you have absolutely no clue what you're talking about, don't post.
 

shimz

Well-Known Member
Wow dude, just trying to help same as you. He's going to have losses, and I can tell you exactly how much if I get the details. I guess you're more interested in a grobro answer while I'd like to pursue a more scientific approach. No clue what I'm talking about? Not so much.
 

le0n2009

Well-Known Member
You shouldn't, no. But it can still cause issues over a distance that far, namely heat. If it has to jack up the power too much, it can actually melt the wiring. You would be better served to use a very heavy extension cord (at least 12 gauge) and locate the driver closer to the fixture.
it's about 5-7 meters, or 19 feet
it's a 185-1400 driver
 

le0n2009

Well-Known Member
Wow dude, just trying to help same as you. He's going to have losses, and I can tell you exactly how much if I get the details. I guess you're more interested in a grobro answer while I'd like to pursue a more scientific approach. No clue what I'm talking about? Not so much.
I'm powering 2 QB288s with the 185-1400
 

shimz

Well-Known Member
Ok, for a 10m distance and at maximum driver voltage (143V), your wires will drop the voltage by:

0.10% for 12ga
0.16% for 14ga
0.26% for 16ga
0.41% for 18ga

So worst case you lose less than 1W to the 18ga wires. This is why you want to use the CC driver for remote ballast applications.
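In case anyone wants to check the math or run other gauges/lengths, here's a rough sketch of the calculation behind that table. The copper resistance-per-meter values are approximate room-temperature figures, and the 1.4A / 143V operating point is assumed from the 185-1400 driver at full output:

```python
# Approximate resistance of copper wire, ohms per meter, by AWG gauge
OHMS_PER_M = {12: 0.0052, 14: 0.0083, 16: 0.0132, 18: 0.0210}

def drop_percent(gauge, length_m=10, amps=1.4, volts=143):
    """Voltage drop as a percent of driver output voltage."""
    r = OHMS_PER_M[gauge] * 2 * length_m  # x2: current flows out and back
    return 100 * amps * r / volts

def watts_lost(gauge, length_m=10, amps=1.4):
    """Power dissipated in the wire: I^2 * R."""
    r = OHMS_PER_M[gauge] * 2 * length_m
    return amps**2 * r

for g in (12, 14, 16, 18):
    print(f"{g} ga: {drop_percent(g):.2f}% drop, {watts_lost(g):.2f} W lost")
```

Halving the length to 5m halves both the drop percentage and the watts lost, since wire resistance scales linearly with length.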
 

le0n2009

Well-Known Member
Ok, for a 10m distance and at maximum driver voltage (143V), your wires will drop the voltage by:

0.10% for 12ga
0.16% for 14ga
0.26% for 16ga
0.41% for 18ga

So worst case you lose less than 1W to the 18ga wires. This is why you want to use the CC driver for remote ballast applications.
and with 5m?
 

shimz

Well-Known Member
Here is a worst case scenario with a 24V CV driver. Say you had 24V strips, wired them all in parallel, then decided you wanted a remote ballast at 10m. The same 200W output from a CV 24V driver would lose 14.49% of its voltage to the 18ga wires. Almost 30W!
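The same quick math shows why the low-voltage CV case is so much worse: at 24V the wire has to carry roughly six times the current, and wire loss goes up with the square of current. A sketch, assuming a 200W load at 24V over 10m of 18ga copper (~0.021 ohm/m):

```python
length_m = 10
r_per_m = 0.021                  # approx. 18 ga copper, ohms per meter
amps = 200 / 24                  # ~8.3 A to deliver 200 W at 24 V
r = r_per_m * 2 * length_m       # round trip: 0.42 ohm
drop_pct = 100 * amps * r / 24   # voltage drop as % of 24 V
watts_lost = amps**2 * r         # power burned in the wire
print(f"{drop_pct:.1f}% drop, {watts_lost:.1f} W lost")
```

With slightly different assumed copper resistance you land right on the ~14.5% / ~29W figures quoted above, versus under 1W for the same 200W delivered at 143V from the CC driver.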
 

shimz

Well-Known Member
You're not helping if you're not paying attention. You're causing confusion. He's already said what he's using. If you're clueless about what it is, as you obviously are, stop posting. It's that fucking simple.
Not sure why you gotta be like that, but I'm sure it's not helping.
 