Clearly I'm missing something obvious here, but last night's episode of Dragons' Den, when everyone wanted to save the planet with a standby saver, has me troubled...
The Standby Saver (or whatever it was called) claims to save the (significant amount of) electricity your TV / VCR / DigiBox etc uses while on Standby. This is a subject close to my heart as our Samsung TV has no off button - just a standby button - and I have a conscience so I worry about it. The demo showed the four-way bar socket into which the unit is integrated consuming about 70W while the TV was running, about 15W while the TV was on Standby and 0W when the "Standby Saver" was enabled.
As I understood the explanation, the unit uses re-chargeable batteries that charge from the mains when the TV is running and a small microprocessor that learns from your remote control and kicks in when you switch the TV to standby to cut off the power 100% (and then, one assumes, continues to run waiting for you to switch the TV on again). All makes sense so far. But does it make good environmental sense?
Well hold on, what about the current required to keep those batteries charged? How does that compare with the usage in standby? And I notice the demo nicely avoided this issue altogether because we don't know what the TV power usage is without the batteries "in the loop". Call me cynical if you like but I am suspicious. I do see you could get a benefit if you're driving 4 units from the 4-way (but then you have to be happy that they all go off when your TV gets switched off I assume). But everyone got very excited when they talked about building this functionality directly into the TV itself.
Hold on a minute, I cry! Is this a perpetual motion machine I see before me? Other than placing an old-fashioned on/off switch on the TV, which would do the job perfectly adequately (but has the downside that you have to get off your *rse to switch on the TV), you need to draw some current while sitting waiting for that signal from the remote that says "I want Pop Idol and I want it now and I want it without having to get off my *rse!". So, we've just moved that current draw from a direct draw from the mains PSU to some re-chargeable batteries (and, I assume, some significant inefficiency in the charging process).
So could someone who understands this better than me confirm (as this is the only thing I can think of) that the benefit is that the mains PSU is extremely inefficient at supplying the tiny current required to keep the TV in standby and therefore the re-chargeable battery approach is a better one? Because right now this just looks like emperor's new clothes to me (I do sincerely hope it isn't). What astounds me (if this is the case) is that Sony / Samsung / Toshiba / (insert big name TV manufacturer here) didn't come up with this solution years ago if it really does make the sort of difference that was being claimed.
PS Why they talked about VCRs I don't know because unless the thing is clever enough to know that I set it to record Pop Idol...
Actually it could be that the TV's PSU is inefficient for standby; it is, after all, designed for optimum performance when the TV is on. One wonders whether this is sensible given that the TV spends the majority of its time off. Maybe they have separate transformers, although I somehow doubt that.
What worries me more is that you watch Pop Idol.
This product already exists - I have two, bought on eBay Germany. They've been around for some years, even sold on QVC. Total fraud by the so-called inventor if you ask me
My mistake Dave - of course I meant *American* Idol. I also know for a fact that one of my colleagues records every episode, day and night, first showing or repeat. ITV1, ITV2, ITV3, ITV4, ITV4+1, ITV6-3 or whatever all these channels are. It's not out of choice though, his media center has taken a keen interest in "educating" him... :-)
This product is already on the market from another manufacturer, looks like someone beat them to it: http://www.thesavasocket.co.uk/
Someone's been very busy this afternoon.
Everywhere there's a blog about the standby saver, there's this spamming link ;-)
Removed it from my blog, left the comment 'empty' floating in nothingness, as it should
Do you know the name of the product that you saw on QVC and eBay Germany?
I agree that this doesn't work. Using their logic I could claim that my mobile phone is eco-friendly because while I'm using it, it's not using any electricity from the mains, and therefore it must mean 0 carbon emissions and savings on electricity bills. This completely disregards the fact that it was charged up earlier.
Taking into account the fact that re-chargeable batteries are not 100% efficient, this invention actually increases standby wastage, so it's an absolute con.
lol... muppets ;)
The whole point of the saving is that a small rechargeable battery only needs to power a microcontroller that holds information about the units on standby, learned via their remote control.
The rechargeable battery need only be the size of your digital watch or motherboard battery. The power needed to charge the battery is over 100x less than the power a TV uses in 2 minutes of standby!
And the fact that you can plug many appliances into it saves even more power.
There's also money in the patented design. Companies will soon be under mounting pressure from new EU directives under power saving initiatives. So companies like Sony will need to pay to be able to use this method of power saving.
Ollie - you're essentially repeating the same two "possible benefits" points I made in my original post - ie driving multiple units from the same 4 way or charging batteries while the TV is on being much more efficient than drawing standby power while it's "off".
So I don't disagree with you, but you haven't offered anything to back up the statement "The power needed to charge the battery is over 100x less than the power a TV uses in 2 minutes of standby!". If that's true then it's certainly an interesting way to address the issue. The "multiple devices" point is a bit dubious because you can't connect any form of recorder, so options are a bit limited. Depends how much AV kit you have, I suppose.
I expect the batteries will have to supply a lot more current than you suggest (ie just powering a microcontroller). Won't they also need to be able to drive some sort of switching device / relay in order to control the supply to the TV?
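For scale, here's a quick back-of-envelope sanity check. Every figure except the 15W standby number from the demo is an assumption on my part, not a measurement of the actual device:

```python
# Back-of-envelope comparison of a sleeping microcontroller's power draw
# versus a TV in standby. All figures are assumptions for scale,
# not measurements of the actual Standby Saver.

MCU_VOLTS = 3.0          # assumed coin-cell voltage
MCU_AMPS = 50e-6         # assumed sleep-mode current for a small microcontroller
CHARGE_EFFICIENCY = 0.7  # assumed loss in the battery charging process

TV_STANDBY_WATTS = 15.0  # figure quoted in the Dragons' Den demo

mcu_watts = MCU_VOLTS * MCU_AMPS            # ~0.15 milliwatts
mains_draw = mcu_watts / CHARGE_EFFICIENCY  # what the charger must pull from the mains

ratio = TV_STANDBY_WATTS / mains_draw
print(f"Standby draw is roughly {ratio:,.0f}x the microcontroller's")
```

Even with generous charging losses the ratio comes out in the tens of thousands, so if those assumed figures are anywhere near right, the "100x" claim would actually be a big understatement.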
The switching device / relay is surely irrelevant, as every appliance must have one... Whenever you turn an appliance on or off this will be the case.
The battery won't be able to supply enough current to perform this task either! It would need to be too big. The battery only powers the microcontroller circuit, and the microcontroller generates a simple logic output that alerts the switching relay to change. You do not couple high- and low-voltage circuits together in consumer electronics, since it is dangerous.
The switching circuit will be controlled by the mains itself - not the battery, just as it would be if you turn your TV off at the moment.
The point of this device is that the power consumption of this design is a hell of a lot less than the power consumption of, say, your TV in standby. This is due to the nature of the minimum loading of the PSU in the TV itself. It is a constraint that limits the efficiency of the standby mode.
You are placing too much emphasis on a small battery!
Put it this way - do you believe that if you hooked up your TV to a rechargeable battery the size of a 2 pence coin, it would be able to produce the power to run your TV in standby mode?
The answer is no.
So to summarise...
We want a TV to be in standby so that we can turn it on with the remote control still.
The TV in standby mode HAS to draw MUCH MORE POWER than it NEEDS to. This is because the power supply unit inside CANNOT supply any LESS POWER without FAILING. This is a CONSTRAINT of the PSU, NOT a design flaw.
So to limit the POWER of standby we are turning the TV completely OFF, but running a small circuit with TINY power consumption instead. This small circuit interfaces with your TV REMOTE. So your TV remote still works! The small circuit simply alerts the switching relay to trigger again via a low power LOGIC signal.
Do you understand? Please let me know if you want to know any more.
Hi Ollie. Well I think we're talking a bit at cross purposes, because the bit I don't understand is the bit your (excellent) explanation doesn't address - and all the bits I do understand, it does. :-) So I have no argument / questions about most of what you've written.
Let's boil it down just to the power-saver device itself which consists of basically 3 things right?
- Some sort of circuit and detector that can learn my remote control standby pattern and thus trigger...
- a switching circuit to physically isolate the mains
- a rechargeable battery to power the above
Assuming the "normal" state for the switching circuit is off (ie no mains power to the TV) then when I switch on my TV, the "trigger" circuit must drive the switching circuit to an "on" state. Assuming the switching circuit can then "latch" in that state (and doesn't need to be constantly driven by the "trigger" circuit) then I can see how this would work. I'd been thinking that the battery would need to provide some sort of constant drive to the switching circuit.
So at the end of the day it does boil down to the fact that the TV PSU is extremely inefficient for the very small current required to power a standby circuit. It just seems that there ought to be a more elegant way than rechargeable batteries to address this? But if the process of stepping down and smoothing "high" voltage AC is inherently inefficient for low-current applications then I guess not...
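To check my own understanding, the trigger-then-latch behaviour I'm describing can be sketched in code. This is purely my hypothetical model of how such a device might work, not the actual Standby Saver design - the class name, IR codes and learn/trigger methods are all invented for illustration:

```python
# Hypothetical sketch of the trigger/latch logic discussed above -- NOT the
# actual Standby Saver firmware. The key assumption is a latching relay that
# holds its state without power, so the battery only has to run the IR
# detector and this tiny state machine.

class StandbySaverSketch:
    def __init__(self):
        self.learned_code = None
        self.mains_on = False  # "normal" state: TV fully isolated from mains

    def learn(self, ir_code):
        """Record the remote's standby code during a one-off setup step."""
        self.learned_code = ir_code

    def on_ir(self, ir_code):
        """Called by the IR detector; pulses the latching relay on a match."""
        if ir_code == self.learned_code:
            # A brief pulse toggles the relay; it then latches with no drive.
            self.mains_on = not self.mains_on

saver = StandbySaverSketch()
saver.learn("SAMSUNG_STANDBY")   # hypothetical IR code
saver.on_ir("SAMSUNG_STANDBY")   # power button pressed: mains restored
saver.on_ir("SAMSUNG_STANDBY")   # standby pressed: mains cut again
print(saver.mains_on)
```

The point of the latching relay is exactly the one above: the battery never has to hold the relay in position, only to pulse it, so the constant drain really is just the detector circuit.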
Mike, you seem to have hit the nail on the head...
The TV PSU is inefficient at running low-current applications on their own, since it is designed to run at much higher current.
So for the PSU to run a small current application, it also needs to run many other functions to make sure its minimum load is satisfied.
It's not that the PSU is inefficient - it's just designed for the TV, not the standby circuit.
As you have gathered, the problem with using mains to power standby is that you are using a massive source of electricity to power a tiny application. This is where the battery comes in... why use a constant 230V AC source when a 3V battery is enough?
It's not that stepping down and smoothing the AC voltage is inefficient; it's that the process of doing so uses a lot more electricity than the battery.
I can also accept the argument that the PSUs in TVs are/were inefficient for extremely low standby power consumption - if indeed that's the approach taken in modern TV designs. However I'm not convinced it is. See the IEA 1 Watt initiative launched in 2002 to bring household appliances below 1 Watt in standby mode - see http://www.iea.org/textbase/papers/2002/globe02.pdf. They cite the availability of high-efficiency 100 milliwatt power transformers as one of the key technical enablers. Therefore I'm not convinced they'll succeed in selling Standby Saver technology to TV manufacturers to include inside the TV, which was one of the reasons the dragons invested - it sounds like there are simpler, patent-free, cheaper ways to achieve much the same thing.
Without knowing the exact power levels and efficiencies involved, it's difficult to say with certainty whether Standby Saver's approach is more power efficient. I suspect it may be a good idea with some older TVs but not with newer models. There seems to be quite a lot of variation in TVs' standby power consumption. This survey (http://reviews.cnet.com/4520-6475_7-6400401-2.html) shows at least a tenfold difference. Spec sheets for new TVs claim even lower ratings, e.g. the Sony 32" Bravia at less than 0.3 Watts.
I recently measured my seven-year-old Sony TV (to settle an argument) and was pleasantly surprised to discover it only uses about 1 Watt in standby mode! So it might have to be a very old TV before Standby Saver becomes the right choice - at 8.76kWh per year my TV would cost less than £1/year left on standby, so it would take a long time to pay back the £25 cost. Even if I looked at it from an eco-friendly perspective rather than a purely financial one, I'm not completely convinced the energy saved (inefficient battery charging versus < 1 Watt standby mode) offsets the energy used to manufacture and transport the Standby Saver and its components, unless you own a very old or inefficient TV. Out of curiosity I just worked out that 1 Watt = 1 Joule/second for a year = 31.5 MJoules, or a bit less than the energy in 1 litre of petrol or oil - see http://en.wikipedia.org/wiki/Fuel_efficiency.
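For anyone who wants to check that arithmetic, here it is spelled out. The 1 Watt figure is my own measurement; the ~10p/kWh electricity price is an assumption:

```python
# Checking the standby arithmetic above. The 1 W figure is a measurement;
# the electricity price (~10p/kWh) is an assumed round number.

standby_watts = 1.0
hours_per_year = 24 * 365           # 8,760 hours

kwh_per_year = standby_watts * hours_per_year / 1000
cost_per_year = kwh_per_year * 0.10  # in pounds, at an assumed 10p/kWh

seconds_per_year = 3600 * hours_per_year
megajoules = standby_watts * seconds_per_year / 1e6

print(f"{kwh_per_year} kWh/year, about £{cost_per_year:.2f}/year")
print(f"{megajoules} MJ/year")  # petrol is roughly 32-34 MJ per litre
```

So a 1 W standby load really does come to 8.76 kWh (well under £1 a year at that price) and about 31.5 MJ, a bit less than a litre of petrol.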
However, having said all of the above, I am slightly tempted to buy one for its other claimed use - the USB sensor for use with my laptop. I've discovered my laptop transformer seems to use 15W even when the laptop is switched off and its battery fully charged - and when my laptop is powered down, I don't mind the other sockets for the docking station and printer being powered down too. I could of course simply switch them off at the wall ;-)
Clearly my "what the heck is a dibbler" question was a little too obscure, as nobody came very close.