LED v T5 v MH & watts
I propose a dedicated thread to once and for all arrive at the ultimate answer to a question (this is starting to sound like something from Douglas Adams) that has threatened to hijack several threads in the past. Namely, what's the difference between the light/heat output of 100 W of LED, fluorescent, or metal halide light? I thought I understood this concept, but now I'm not sure. I'm hoping this discussion starter will help in my ongoing education (the part that my wife doesn't cover).
First, a few rules of conduct:

- Please, be polite.
- Accept the possibility that you might be mistaken.
- No name-calling, put-downs, or questioning anyone's ancestry.
- Make every attempt to be accurate.

Okay, I'll go first. Let's start with some accuracy. Watts measure power, not energy. Power is the rate of energy consumption (I'm thinking of a light source here). A watt is a joule per second, i.e., J/s; energy is just joules. Additionally, energy is not measured directly; work is measured and used as an indirect measure of energy. If 100 J of work is done, then 100 J of energy (whatever that is) must have been used up. No energy: no work. No work: can't tell about energy.

Efficiency. Here's a word that gets used a lot but seems to mean different things to different people. When I think of the efficiency of a light source, I think of the proportion of the energy going into the lighting unit that gets turned into usable light. This leads to other semantic questions. Is the energy going into the light what matters, or only the energy consumed? I ask this not believing that every single bit of electrical energy input is converted into other forms of energy. Furthermore, what is usable light? Is PAR what is being used to calculate efficiency, or any electromagnetic radiation in the visible spectrum? I would very much like to have this clearly defined.

It has been asked: if a 100 W LED is put into a styrofoam cooler (a close approximation to a perfect insulator, at least short term), how would the change in temperature differ from that of a 100 W MH or any other light source? I believe the answer is that the temperature change would be the same, hence the comment "a watt is a watt." Fair enough. However, since all energy eventually ends up as heat (not the same thing as temperature, remember, from chem class), it seems to me that the real question we should be asking is: what form does the energy coming from the light source take on its way to eventually becoming heat?

We finally arrive at my conundrum. Doesn't a 100 W LED initially produce more usable light than a 100 W fluorescent, which in turn produces more usable light than a 100 W MH? I accept that all that light eventually turns into heat, but doesn't the LED initially produce more visible light and less heat? Furthermore, doesn't the LED's light get converted into heat in the surroundings, e.g., our fish rooms, to a greater extent as it is absorbed and reradiated? Isn't that why LEDs are considered more "efficient": because they initially produce more visible light and less heat than other light sources? I've tacked a couple of rough numerical sketches onto the end of this post to make the arithmetic concrete. Let the game begin.
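
To make the "a watt is a watt" part concrete, here is a rough back-of-the-envelope sketch (in Python, since that's what I had handy) of the cooler thought experiment. The 10 kg of water and the one-hour run time are numbers I made up purely for illustration; the point is that nothing about the bulb type appears anywhere in the arithmetic.

```python
# Back-of-the-envelope energy balance for the styrofoam-cooler thought experiment.
# Assumption: the cooler is a (near) perfect insulator, and everything the fixture
# puts out, light and heat alike, ends up absorbed by 10 kg of water sitting inside.
# The mass and run time are made-up numbers for illustration only.

POWER_W = 100.0         # any 100 W fixture: LED, T5, or MH
RUN_TIME_S = 3600.0     # one hour of operation
MASS_KG = 10.0          # assumed mass of water inside the cooler
C_WATER = 4186.0        # specific heat of water, J/(kg*K)

energy_j = POWER_W * RUN_TIME_S             # 1 W = 1 J/s, so 100 J/s * 3600 s = 360,000 J
delta_t_k = energy_j / (MASS_KG * C_WATER)  # Q = m*c*dT, so dT = Q/(m*c)

print(f"Energy delivered: {energy_j:,.0f} J ({energy_j / 1000:.0f} kJ)")
print(f"Temperature rise: {delta_t_k:.1f} K, the same for any 100 W source")
```

Swap the LED for a T5 or an MH and the result doesn't change, because inside a sealed, insulated box every joule, whether it left the bulb as visible light or as waste heat, is eventually absorbed by the contents.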
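
And here is the same kind of sketch for my conundrum about what form the energy takes on its way to becoming heat. The radiant-efficiency numbers are placeholders I invented for the sake of the example, not data for any particular fixture; all they are meant to show is that a more "efficient" source sends more of its 100 W out of the fixture as light and dumps less of it straight into the canopy and water as heat.

```python
# Illustrative split of 100 W of input into "leaves the fixture as light" versus
# "dissipated immediately as heat at the bulb/ballast/driver."
# The radiant-efficiency figures below are placeholders picked for the example;
# they are NOT measured values for any real LED, T5, or MH fixture.

INPUT_W = 100.0
radiant_efficiency = {      # assumed fraction of input power emitted as visible light
    "LED": 0.40,
    "T5 fluorescent": 0.25,
    "Metal halide": 0.15,
}

for source, eff in radiant_efficiency.items():
    light_w = INPUT_W * eff              # power initially leaving the fixture as light
    fixture_heat_w = INPUT_W - light_w   # power released right at the fixture as heat
    print(f"{source:15s}  light out: {light_w:5.1f} W   heat at fixture: {fixture_heat_w:5.1f} W")

# Every one of those "light out" watts still becomes heat once it is absorbed by
# water, rock, or the walls of the room. The difference between the sources is
# where and when that conversion happens, not the total amount of energy.
```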