So if the difference is not in the efficiency of the input side, it must be more visible light per watt given out by the LED ... I go by the perceived light I see rather than the rated wattage
The overall lumens/watt (i.e. the number of lumens per watt supplied at 230/240V, which is what matters in terms of running costs) obviously will be crucially dependent on the "efficiency of the input side" - i.e. how efficient the conversion is from 230/240V to the voltage (12V or whatever) required by the lamp itself.
Indeed, there are
two potentially inefficient stages - firstly the conversion of mains voltage (230/240V) to that required by the "12V lamp" and, secondly, the conversion from 12V to the voltage needed by the LED element(s) within the lamp. Either or both of those processes can result in a reduction in 'overall' lumens/watt (which is what determines running cost).
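The effect of those two stages multiplies through to the overall figure. A minimal sketch, using assumed round-number efficiencies (not measurements of any real lamp):

```python
def overall_efficacy(led_lm_per_w, driver_eff, transformer_eff):
    """Lumens per watt drawn from the 230/240V mains, after both
    conversion stages have taken their share of the input power."""
    return led_lm_per_w * driver_eff * transformer_eff

# Suppose the LED elements themselves manage 100 lm/W, the mains-to-12V
# transformer is 85% efficient, and the lamp's internal 12V-to-LED driver
# is 90% efficient (all assumed figures):
print(overall_efficacy(100, 0.90, 0.85))  # roughly 76.5 lm/W at the mains
```

In other words, even with a decent LED element, losses in the two conversion stages compound, and it is the product of all three figures that sets the running cost per lumen.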
The bulb above my head is rated as 12W [5.9W]. It gives out perhaps 20% less visible light compared to a Toolstation 12W bulb that actually uses 12W, but I chose the 5.9W consumption as being more than bright enough at half the running cost
As I said before, if a lamp is only consuming 5.9W, the very most that could be getting to the LED element(s) is 5.9W (in practice, less than that).
It is possible that the overall efficiency (lumens/watt) is greater for your lamp drawing 5.9W than for the Toolstation one, in which case the former could be brighter than you would expect (which would be roughly half the brightness of the latter, if both had the same efficiency). However, that difference in efficiencies could be due to lots of reasons, not necessarily the difference in working voltage range that you originally suggested. One with a very wide input voltage range could be designed to be very efficient, but the cost might be prohibitive.
Kind Regards, John