How much is your annual electricity usage? Mine is 5,145 kWh.

3,429 kWh, 4 bed detached. Work from home, so the kettle is boiling every hour and a laptop/monitor is in use constantly. Induction hob and fan oven. I assume that all other posters also have gas central heating and hot water? I think we are fairly careful with the 'leccy, but not so sure about the gas at 15,000 kWh?!
 
Home plug-in energy meters generally indicate voltage and current but may not take account of power factor.
Perhaps surprisingly, most of them (including many of the cheapo, sub-£10, ones) apparently do take PF into account, and a good few of them can actually display PF. It's the 'clamp on' energy monitors which can't (unless they have other inputs).
If that is the case, then what they indicate as "power" may in fact simply be "volt-amps". This might explain why 'electronic' lamps do not consume the energy expected, since they are predominantly capacitive devices.
If the meter were displaying VA (but calling it 'Watts') that would presumably result in an over-estimate of the true power if the load were reactive, wouldn't it?
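
To put some rough numbers on that (the current and power factor below are purely assumed figures for illustration, not measurements from any particular meter or lamp):

```python
# Comparison of apparent power (VA) vs true power (W) for a reactive load.
# All figures are assumptions for the sake of the arithmetic.

voltage = 240.0       # supply voltage, V RMS
current = 0.05        # current drawn by the lamp, A RMS (assumed)
power_factor = 0.5    # assumed PF for a small capacitive 'electronic' lamp

apparent_power_va = voltage * current                # what a plain V x A reading would give
true_power_w = voltage * current * power_factor      # what a PF-aware meter would report

print(f"Apparent power: {apparent_power_va:.1f} VA")  # 12.0 VA
print(f"True power:     {true_power_w:.1f} W")        # 6.0 W
```

So a meter that quietly reported VA as 'Watts' would, in this assumed case, over-state the true consumption by a factor of two.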

Kind Regards, John
 
I don't think there's anything wrong; it's to keep it standard.
After all, if you take a 10W LED chip at 12V, it's a 10W LED.
Take the same chip and put it in a mains-powered circuit and it's still called a 10W chip, because that's the output, which is still 12V. The input voltage is much higher, so at 100% efficiency, with no conversion losses, the consumption would be sub-1W; so at 6W it is inefficient, but less so than, say, a 10W one.
 
After all, if you take a 10W LED chip at 12V, it's a 10W LED. ... Take the same chip and put it in a mains-powered circuit and it's still called a 10W chip, because that's the output, which is still 12V. The input voltage is much higher, so at 100% efficiency, with no conversion losses, the consumption would be sub-1W; so at 6W it is inefficient, but less so than, say, a 10W one.
I'm afraid that makes no sense to me. If the power consumption (at 220-240V) is 1W (or 6W), then there is no sensible way in which the lamp can be described as a 10W one (you said 12W before, but that doesn't change anything).

Indeed, if the consumption were 1W (or 6W), then the "10W/12V LED chip" could not be getting 12V.

Kind Regards, John
 
OK, take 20 x 10W x 12V LEDs in series: the input will be, say, 240V at 10W, each LED will get 12 volts, and you now have 20 times the amount of light for the same 10W and 240V --
-- unless I am misunderstanding something :confused:
 
OK, take 20 x 10W x 12V LEDs in series: the input will be, say, 240V at 10W, each LED will get 12 volts, and you now have 20 times the amount of light for the same 10W and 240V -- unless I am misunderstanding something :confused:
Yes, I think you are overlooking/misunderstanding the difference in voltages.

If each LED lamp consumes 10W at 12V, that means that, in normal operation, about 0.83A flows through each of them. If you put 20 in series, ~0.83A flows through that chain of lamps, the total voltage across them (end-to-end) being 240V. The amount of power taken from the 240V source will therefore be 240 x 0.83 Watts, namely about 200W (i.e. the same as the 20 x 10W lamps), not 10W.
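
The same arithmetic, laid out step by step (no figures beyond those already used above):

```python
# Twenty nominal 10W/12V lamps wired in series across a 240V supply.
lamp_power_w = 10.0
lamp_voltage_v = 12.0
n_lamps = 20

current_a = lamp_power_w / lamp_voltage_v       # ~0.83A flows through every lamp in the chain
chain_voltage_v = n_lamps * lamp_voltage_v      # 240V across the whole chain, end-to-end
total_power_w = chain_voltage_v * current_a     # power drawn from the 240V source

print(f"Current through the chain: {current_a:.2f} A")       # 0.83 A
print(f"Voltage across the chain:  {chain_voltage_v:.0f} V")  # 240 V
print(f"Power from the supply:     {total_power_w:.0f} W")    # 200 W, not 10W
```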

Kind Regards, John
 
So if the difference is not in the efficiency of the input side, it must be more visible light per watt given out by the LED.
I go by the perceived light I see rather than the rated wattage.
The bulb above my head is rated as 12W [5.9W]; it gives out perhaps 20% less visible light than a Toolstation 12W bulb that actually uses 12W, but I chose the 5.9W consumption as being more than bright enough at half the running cost.
 
So if the difference is not in the efficiency of the input side, it must be more visible light per watt given out by the LED. ... I go by the perceived light I see rather than the rated wattage.
The overall lumens/watt (i.e. the number of lumens per watt supplied at 230V/240V, which is what matters in terms of running costs) will obviously be crucially dependent on the "efficiency of the input side" - i.e. how efficiently the 230/240V is converted to the voltage (12V or whatever) required by the lamp itself.

Indeed, there are two potentially inefficient stages - firstly the conversion of mains voltage (230/240V) to that required by the "12V lamp" and, secondly, the conversion from 12V to the voltage needed by the LED element(s) within the lamp. Either or both of those processes can result in a reduction in 'overall' lumens/watt (which is what determines running cost).
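
As a rough sketch of how those two stages multiply up (the LED efficacy and stage efficiencies below are assumed figures, just to show the arithmetic):

```python
# Hypothetical two-stage conversion: 230/240V mains -> "12V lamp" -> LED element(s).
led_efficacy_lm_per_w = 120.0      # lumens per watt reaching the LED element(s) (assumed)
mains_to_12v_efficiency = 0.85     # first conversion stage, mains to 12V (assumed)
lamp_internal_efficiency = 0.90    # second stage, 12V to LED drive, inside the lamp (assumed)

overall_efficacy = led_efficacy_lm_per_w * mains_to_12v_efficiency * lamp_internal_efficiency
print(f"Overall efficacy: {overall_efficacy:.0f} lm per W drawn from the mains")  # ~92 lm/W
```

Losses in either stage pull the overall lumens/watt, and hence the running cost per lumen, in the wrong direction.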
The bulb above my head is rated as 12W [5.9W]; it gives out perhaps 20% less visible light than a Toolstation 12W bulb that actually uses 12W, but I chose the 5.9W consumption as being more than bright enough at half the running cost.
As I said before, if a lamp is only consuming 5.9W, the very most that could be getting to the LED element(s) is 5.9W (in practice, less than that).

It is possible that the overall efficiency (lumens/watt) is greater for your lamp drawing 5.9W than for the Toolstation one, in which case the former could be brighter than you would expect if both had the same efficiency (i.e. brighter than roughly half the brightness of the latter). However, that difference in efficiencies could be due to lots of reasons, not necessarily the difference in working voltage range that you originally suggested. One with a very wide input voltage range could be designed to be very efficient, but the cost might be prohibitive.
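
For instance (assumed round numbers, using only the wattages quoted above):

```python
# If both lamps had the same overall efficacy, the 5.9W lamp would give about half
# the light of the 12W one; for it to be only ~20% dimmer (as reported above), its
# overall efficacy would need to be considerably higher. 100 lm/W is an assumed figure.
reference_efficacy = 100.0                          # assumed lm/W for the 12W lamp
reference_lumens = 12.0 * reference_efficacy        # 1200 lm

same_efficacy_lumens = 5.9 * reference_efficacy     # ~590 lm, roughly half
required_efficacy = (0.8 * reference_lumens) / 5.9  # efficacy needed for only 20% less light

print(f"At the same efficacy:     {same_efficacy_lumens:.0f} lm")   # ~590 lm
print(f"Efficacy needed for -20%: {required_efficacy:.0f} lm/W")    # ~163 lm/W
```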

Kind Regards, John
 
Ouch! 12,450 kWh here...

Gas heating / hot water. Cooking all electric. Have a hot tub and electric UFH in kitchen and two bathrooms. But still...
 
4 bed detached, 1951 but now well insulated. Just over 7,000 kWh. Gas heating and HW (21,000 kWh o_O), cooking all electric. A few well-watched large plasma TVs dotted around the house, which probably don't help.
 
3 bed semi: 2,900 kWh for the year's leccy. I'm pretty paranoid about watching our energy usage! Cooking and heating/HW is gas.
 