This is the year of the 4K high-definition television, as prices drop and 4K content starts getting pushed down the pipe. But in an era when almost everything is getting more efficient, TVs are headed in the other direction. According to a study by the Natural Resources Defense Council, the new sets use about 30 percent more electricity than the previous generation, which could add up to an additional 8 billion kWh of consumption, as much power as three cities the size of San Francisco use. That increase pretty much erases all the gains made over the last few years as people converted to high-definition flat screens.

Big screens need big power. (Photo: NRDC)

The reasons for the increase in power consumption are complex:

People are buying bigger screens.

50 is the new 34 as the most popular screen sizes go up. There used to be a rule of thumb relating screen size to viewing distance, because the picture would break down into visible dots if you sat too close; these days, with nearly 4,000 pixels across, it’s just BIGGER is BETTER. That means a larger area that needs to be backlit.
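How much larger? As a rough illustration, using the 34-inch and 50-inch sizes above and assuming standard 16:9 screens, the area the backlight has to cover grows with the square of the diagonal:

```python
# Back-of-envelope: how much more area a bigger screen has to light.
# Assumes standard 16:9 screens; the sizes are the diagonals mentioned above.

def screen_area(diagonal_inches, aspect=(16, 9)):
    """Return the screen area in square inches for a given diagonal."""
    w, h = aspect
    scale = diagonal_inches / (w ** 2 + h ** 2) ** 0.5
    return (w * scale) * (h * scale)

old, new = screen_area(34), screen_area(50)
print(f"34-inch screen: {old:.0f} sq in")
print(f"50-inch screen: {new:.0f} sq in ({new / old:.1f}x the area to light)")
```

That works out to more than double the area to illuminate, before resolution or HDR even enter the picture.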

4K TVs take more power to operate:

How much? As much as 30 percent more. The study authors note that “the primary reason is that their backlights need to be brighter to deliver comparable or higher luminance and more vivid colours through a larger number of pixels.”

Comparison of screen resolutions (Photo: NRDC)
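To put that 30 percent into household terms, here's a minimal sketch. Only the 30 percent increase comes from the NRDC; the wattage and viewing hours below are illustrative assumptions, not figures from the study.

```python
# Rough annual-energy sketch for the 30 percent figure above.
# ASSUMPTIONS (not from the study): a comparable HD set draws about 100 W
# while on, and the TV is watched about 5 hours a day.

hd_watts = 100                 # assumed draw of a comparable HD set
uhd_watts = hd_watts * 1.3     # NRDC: new 4K sets use about 30 percent more
hours_per_day = 5              # assumed daily viewing time

extra_kwh_per_year = (uhd_watts - hd_watts) * hours_per_day * 365 / 1000
print(f"Extra energy per set: about {extra_kwh_per_year:.0f} kWh per year")
```

Multiply even a modest number like that across the tens of millions of sets being sold, and the study's billions of kWh stop looking far-fetched.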

4K content takes more power to run:

Just processing and presenting 4K content takes about 10 percent more energy than running regular content. So as Netflix and other streaming companies take more business away from cable and deliver more ultra-high-definition programming, the sets will consume even more energy.

One new feature, HDR, uses a LOT more energy

HDR stands for High Dynamic Range, a new feature on some TVs. It delivers blacker blacks and a much brighter picture. The study authors make it sound like something to look forward to:

HDR-encoded material can provide visually striking imagery that is more vivid and wide ranging in tonality than the original itself. Specially designed digital cameras and software capture multiple images per frame, each of which represents a portion of the range of visible light and dark tones. When these are combined, the final, composite image is contrast-enhanced to deliver extreme brightness in the highlights, natural mid tones, and deep shadow detail.

They had to sit through 'Exodus' to do this graph. (Photo: NRDC)

The study tested one of the few films available in this format ("Exodus: Gods and Kings"; it’s a tough job but somebody has to do it) and found that playing it in HDR used a whopping 47 percent more energy than the regular 4K version.
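For a sense of scale, here's a rough per-movie sketch. The 47 percent figure is from the study; the baseline wattage is an illustrative assumption.

```python
# Rough per-movie sketch of the HDR penalty reported above.
# ASSUMPTIONS (not from the study): the set draws about 120 W playing
# regular 4K content; the film runs roughly 2.5 hours.

base_watts = 120               # assumed draw for regular 4K playback
hdr_watts = base_watts * 1.47  # NRDC: the HDR version used 47 percent more
runtime_hours = 2.5            # roughly the length of 'Exodus: Gods and Kings'

regular_wh = base_watts * runtime_hours
hdr_wh = hdr_watts * runtime_hours
print(f"Regular 4K: {regular_wh:.0f} Wh, HDR: {hdr_wh:.0f} Wh "
      f"(about {hdr_wh - regular_wh:.0f} Wh extra per viewing)")
```

Not ruinous for one movie night, but it adds up if HDR becomes the default way everything is mastered and watched.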

What to do?

Big 4K screens are where the market is going, and HDR will also become more popular. Consumers can reduce the energy load through smart purchasing and management:

  • Buy Energy Star-rated models.
  • Make sure Automatic Brightness Control is on. It adjusts the screen's brightness according to how much ambient light there is in the room and can reduce consumption by up to 50 percent (see the rough numbers after this list).
  • Disable the quick-start mode to reduce standby power losses. It won’t kill you to wait a minute.
  • It wouldn't hurt to watch in moderation, either.
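To see what the biggest of those tips can be worth, here's a minimal sketch. The "up to 50 percent" figure comes from the tip above; the wattage and viewing hours are illustrative assumptions, and real savings depend on how bright the room is.

```python
# Rough sketch of what Automatic Brightness Control (ABC) can be worth.
# ASSUMPTIONS (not from the study): the set draws about 150 W with ABC off
# and is watched about 5 hours a day. The "up to 50 percent" best case is
# taken from the tip above; actual savings depend on room lighting.

watts_abc_off = 150
hours_per_day = 5
abc_savings = 0.5              # best case from the tip above

kwh_off = watts_abc_off * hours_per_day * 365 / 1000
kwh_on = kwh_off * (1 - abc_savings)
print(f"ABC off: about {kwh_off:.0f} kWh per year")
print(f"ABC on (best case): about {kwh_on:.0f} kWh per year")
```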

If you find one of these under your tree, you're in for a treat; the jump in quality from the 1080p of current flat screens to the new 4K sets is really amazing. They are sharp and bright, even without HDR. But set it up right; some of them use as much electricity as your fridge. Think of your footprint, and your energy bill.

You can download the whole study here from the NRDC.

Lloyd Alter (@lloydalter) writes about smart (and dumb) tech with a side of design and a dash of boomer angst.