This is based on a problem that came up today. While working through it, I realized I wasn't so sure I understood the relationship between wattage and heat produced.
In the past we did a test in the lab using 0.305 Ω/ft wire. The jacket is rated for 150 °C. We were able to get about 6.5 A (at 3.66 V) out of it at 24 °C ambient without exceeding the jacket rating. Now I want to estimate the ampacity of 0.027 Ω/ft wire. I'm wondering if I did the math correctly, because the amperage I arrive at seems a little high for the wire to handle; then again, most copper wire is only rated to 90 °C.
So the math I did was this (using 6 A, since 3.66 V across 0.61 Ω works out to 6 A):

0.305 Ω/ft × 2 ft = 0.61 Ω
P = I²R = (6 A)² × 0.61 Ω = 21.96 W
21.96 W / 2 ft = 10.98 W/ft
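Here's a quick Python sketch of that same calculation, just to make the arithmetic explicit. The numbers are the ones from our bench test; the only physics in it is P = I²R:

```python
# Dissipated power per foot for the test wire (P = I^2 * R).
r_per_ft = 0.305    # ohms per foot
length_ft = 2.0     # feet of wire in the test
current_a = 6.0     # amps (3.66 V / 0.61 ohm)

r_total = r_per_ft * length_ft       # 0.61 ohm
power_w = current_a**2 * r_total     # 21.96 W
w_per_ft = power_w / length_ft       # 10.98 W/ft
print(f"{w_per_ft:.2f} W/ft")
```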
So would it be safe to assume that if I did the same with 0.027 Ω/ft wire and the same jacket, I would arrive at this amperage?
If you start with 11 W/ft:

11 W/ft × 2 ft = 22 W
0.027 Ω/ft × 2 ft = 0.054 Ω
I = √(P/R) = √(22 W / 0.054 Ω) ≈ 20.18 A
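Same thing in Python, inverted to solve I = √(P/R). The big assumption here (and it is an assumption, not something I've verified) is that the jacket's tolerance carries over as a fixed W/ft figure, which ignores any difference in surface area between the two wire gauges:

```python
import math

# Solve I = sqrt(P / R) for the heavier wire, assuming the same
# W/ft the jacket tolerated in the baseline test.
w_per_ft_limit = 10.98   # W/ft from the test (rounded to 11 above)
r_per_ft_new = 0.027     # ohms per foot
length_ft = 2.0

p_total = w_per_ft_limit * length_ft      # ~22 W
r_total = r_per_ft_new * length_ft        # 0.054 ohm
current_a = math.sqrt(p_total / r_total)  # ~20.2 A
print(f"{current_a:.1f} A")
```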
ETA: We're planning to test this tomorrow when we get some wire in, so we shall find out.