On November 1, Energy Star will officially launch its revised specification for TVs, version 3.0, which promises to significantly reduce power consumption. After that date, TVs must meet the new spec to carry the Energy Star logo.
It may come as a surprise that prior to the new spec, TVs were only tested in standby mode (plugged in but turned off) to comply with Energy Star. The TVs were never turned on for the test, and the only thing that qualified them for the logo, since 2005, was the ability to draw less than a watt when turned off.
Standby testing is important, of course--TVs, even in America, spend more than 80 percent of the time turned off--but by early this year the majority of TVs on the market already had standby draws of less than a watt, which is insignificant compared with how much power they draw when turned on.
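A quick back-of-the-envelope comparison shows why a sub-1-watt standby draw barely registers next to power-on consumption. The wattage, viewing hours, and electricity figures below are illustrative assumptions, not numbers from the spec:

```python
def annual_kwh(watts, hours_per_day):
    """Energy used per year, in kilowatt-hours."""
    return watts * hours_per_day * 365 / 1000

# Assume the TV is on 4 hours a day and in standby the other 20 (assumed split).
standby = annual_kwh(1, 20)      # worst-case sub-1 W standby draw
powered_on = annual_kwh(300, 4)  # a typical large flat panel (assumed wattage)

print(standby, powered_on)  # 7.3 kWh vs. 438.0 kWh per year
```

Even at a full watt of standby draw, the turned-off TV accounts for only a few percent of the set's total annual energy use.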
Version 3.0 finally institutes standards for "power on" certification, and judging from the extensive list of supporting documents at Energy Star's official site, settling on a spec was a long and contentious battle. But now that the spec is in effect, TV power consumption in "power on" mode will likely fall across the board. The key, as usual, is in the details of the spec.
A close read of the final draft of the Energy Star 3.0 TV spec (PDF) reveals that power-on testing must be conducted in the default picture mode of the television.
In CNET's own extensive testing, which encompasses every TV we've reviewed since 2006 (and was cited by Energy Star itself (PDF) as a primary data source during development of the spec), we've found that picture mode is one of the most important factors, along with screen size and technology type (e.g. plasma vs. LCD), in how much juice a TV sucks. As a result of Energy Star's new methodology, the default picture modes of compliant TVs will change significantly.
Many HDTVs now have a menu during initial setup that asks whether you'll be using the TV at "home" or in the "store" (sometimes labeled "retail"). Choosing "home" automatically puts the TV into the "standard" picture mode, which is dimmer and draws less power than previous default modes, which had names like "dynamic" and "vivid." Choosing "store," on the other hand, first prompts a confirmation to make sure you actually do want store mode. If you confirm, the TV engages the brightest default mode, such as "vivid" or "dynamic."
In some cases the differences between the two modes are drastic.
The flip side, of course, is that standard mode is often dimmer than what we'd consider ideal, an issue that varies from model to model. The Panasonic plasma, for example, is so dim in standard that we consider it unwatchable. To achieve a light output we consider ideal on that TV, it consumes 289 watts ($89.11 per year), which is still a major savings over vivid.
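For reference, a dollar figure like the one above follows from simple arithmetic: watts, times hours of use, times the price of electricity. The viewing hours and rate below are illustrative assumptions that happen to reproduce the $89.11 figure; they are not CNET's published methodology:

```python
def annual_cost(watts, hours_per_day=8, rate_per_kwh=0.1056):
    """Estimate yearly electricity cost for a TV.

    hours_per_day and rate_per_kwh are assumed values
    (8 hours of daily viewing at ~10.6 cents/kWh).
    """
    kwh_per_year = watts * hours_per_day * 365 / 1000
    return kwh_per_year * rate_per_kwh

print(round(annual_cost(289), 2))  # → 89.11
```

Plugging in your own local electricity rate and viewing habits gives a more personally meaningful estimate.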
Concurrent with the new Energy Star standards, we at CNET have modified our own testing methodology. Check out our updated Quick Guide to TV power, which now includes a chart comparing the power consumption of 128 HDTVs. As always, individual HDTV reviews will report on power use at the end of the review, including power consumption readings in the ideal, post-calibration picture settings as well as in the TV's most efficient "power saver" mode.
What's your take? Will power consumption factor into your next HDTV purchase? Would you sacrifice some picture quality for a more energy-efficient model? Let us know in the comments.