• LordOfTheChia@lemmy.world · 9 months ago

      I would check two things:

      1. Is it a QLED TV? Those are very efficient with backlight power. QLED TVs have only a blue LED backlight, and the “quantum dots” in a layer between the backlight and the LCD panel absorb some of that blue light and re-emit it as the red and green that, together with the remaining blue, cover the full color spectrum.

      2. How many nits of brightness does it produce? I’d check the specific model on RTINGS. An efficient TV won’t help OP much if it’s so dim that it’s unusable in their case.

      Reflectivity also matters for usable brightness in a bright room: the less reflective (more matte) the screen, the less brightness the TV needs to overcome distracting light sources reflecting off it.

      Edit: Had to look it up to be sure: conventional LED-backlit LCDs put red, green, and blue color filters over a white backlight. Each subpixel’s filter passes only its own color, so roughly two thirds of the backlight’s light is filtered away, hence the energy inefficiency versus QLED, which converts the blue backlight’s energy into the other colors instead of discarding it (rough numbers sketched at the end of this comment).

      Interestingly, some DLP projectors use alternating red, green, and blue light sources that strobe onto the DLP chip, which modulates the intensity of each color in turn. Less efficient (and less bright) DLPs use a single white light source and a color wheel (a rotating color filter).
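
      A rough back-of-the-envelope sketch of why the filter losses matter. Both transmission figures below are illustrative assumptions, not measured values for any particular panel:

      ```python
      # Toy comparison: backlight power needed for the same light output when the
      # color-generation stage throws away different fractions of the light.
      FILTER_LCD_TRANSMISSION = 1 / 3   # RGB filters each pass roughly a third of white light
      QD_LCD_TRANSMISSION = 2 / 3       # assumed: quantum-dot conversion wastes far less

      def backlight_watts(target_light_watts: float, transmission: float) -> float:
          """Required backlight power scales as 1 / transmission (other losses ignored)."""
          return target_light_watts / transmission

      target = 10.0  # made-up figure for the light actually leaving the screen, in watts
      print(f"Filter LCD backlight: ~{backlight_watts(target, FILTER_LCD_TRANSMISSION):.0f} W")
      print(f"QD LCD backlight:     ~{backlight_watts(target, QD_LCD_TRANSMISSION):.0f} W")
      ```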

    • givesomefucks@lemmy.world · 9 months ago

      It’s also like how cars calculate mileage.

      Brightness turned down as far as it goes and the volume basically muted.

      So could it be rated that low? Sure.

      But would it actually only use that much? Nope.

      • lurch (he/him)@sh.itjust.works · 9 months ago

        Mine warns me when I turn the brightness up, etc., that it will use more power, and I have to click OK on the warning. That was pretty annoying in the first few days while I was getting the settings right.

  • Mr_Blott@lemmy.world · 9 months ago

    I’m guessing it’s an EU model. They have all sorts of “eco” modes to pass environmental laws, but you wouldn’t use them IRL.

    So yes, it could, but fuck that, stick it on dynamic HDR and drive your eco-friendly-ish car to compensate lol

    • fuckwit_mcbumcrumble@lemmy.world · 9 months ago

      That was my first guess as well. By default the TV is in an eco mode and would only use around 50 watts, but as soon as you make the TV actually usable, the power draw will roughly double.

      OP, if you want a worst-case estimate of your TV’s power consumption, just look at the power supply. If it’s an external brick, look at the DC output rating on the brick and multiply the voltage by the amperage. If you’re running it off a battery-powered inverter, then the power factor and the efficiency of the brick come into play, but it shouldn’t be much worse than the absolute maximum the brick is rated to output.
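
      For example (the label values below are placeholders, not OP’s actual TV):

      ```python
      # Worst-case estimate from the ratings printed on an external power brick.
      brick_voltage_v = 19.5   # DC output voltage on the label (example value)
      brick_current_a = 4.6    # maximum DC output current on the label (example value)

      max_dc_watts = brick_voltage_v * brick_current_a
      print(f"Worst-case DC draw: ~{max_dc_watts:.0f} W")

      # Fed from an inverter, the wall-side draw is a bit higher because the brick
      # itself isn't 100% efficient; 85-90% is an assumed, typical figure.
      brick_efficiency = 0.87
      print(f"Rough wall-side worst case: ~{max_dc_watts / brick_efficiency:.0f} W")
      ```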

    • counselwolf@lemmy.world (OP) · 9 months ago

      Is it possible that the local version of Energy Star for my TV used the Eco mode setting for the tests?

      • AggressivelyPassive@feddit.de · 9 months ago

        They usually test whatever the manufacturer says is the default, and that most likely happens to be the lowest-power mode, which barely resembles reasonable usage.

  • lemmefixdat4u@lemmy.world · 9 months ago

    It could, probably at the lowest brightness setting. If it were an OLED TV, it could use under 10 watts while displaying a black picture. An LCD TV would be measured at its lowest backlight brightness. So YMMV, depending on how dim a picture you’ll settle for.

    TV tech has come a long way though. My old 25" CRT TV choked down 240W. The 70" LCD currently on the wall does about 90W. And the 27" TV in my office setup sips 15W.

    • Buffalox@lemmy.world · 9 months ago

      By your numbers it should be possible: a 55" screen has a little less than 4 times the area of the 27" you say can run at 15 watts, and 4 × 15 W is 60 W, so the rated figure is absolutely within a reasonable range.
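
      Sanity-checking that scaling (assuming the same aspect ratio, so area goes with the square of the diagonal):

      ```python
      # Area ratio of two screens with the same aspect ratio, and the power you'd
      # expect if backlight power scaled roughly with screen area.
      small_diag_in, large_diag_in = 27, 55
      small_power_w = 15  # the 27" TV mentioned above

      area_ratio = (large_diag_in / small_diag_in) ** 2
      print(f"Area ratio: ~{area_ratio:.1f}x")                     # ~4.1x
      print(f"Scaled power: ~{small_power_w * area_ratio:.0f} W")  # ~62 W
      ```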

  • Montagge@kbin.earth · 9 months ago

    That’s about 0.5 A on 120 VAC, so I would believe it for a modern low-end TV. It sounds a little generous, but not grossly so.
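
    Working that out (just P = V × I rearranged):

    ```python
    # Mains current for a 50 W rating at 120 VAC, ignoring power factor.
    rated_watts, mains_volts = 50, 120
    print(f"~{rated_watts / mains_volts:.2f} A")  # ~0.42 A, i.e. roughly half an amp
    ```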

  • binomialchicken@lemmy.blahaj.zone · 9 months ago

    For about $12 USD, you can buy an electricity usage monitor and see in real time how much power it is using. Kill-A-Watt brand used to be cheap, but the clones are just as good now.
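
    Once you have a reading, turning it into a monthly energy or cost figure is simple arithmetic. The wattage, viewing hours, and electricity price below are made-up example numbers:

    ```python
    # Convert a power-meter reading into monthly energy use and cost.
    measured_watts = 65     # example reading from the plug-in meter
    hours_per_day = 5       # example daily viewing time
    price_per_kwh = 0.15    # example electricity price, USD

    kwh_per_month = measured_watts * hours_per_day * 30 / 1000
    print(f"~{kwh_per_month:.1f} kWh/month, ~${kwh_per_month * price_per_kwh:.2f}/month")
    ```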

  • Buffalox@lemmy.world · 9 months ago

    I just checked here, and the best I can find quickly are rated at 64 watts in what they call SDR mode, which stands for standard dynamic range. They use almost twice that in HDR (high dynamic range) mode.
    These are rated E on the European energy label. If you can find a TV rated A to C, it will be better.
    The 50 watts may be similar, just measured by a different standard.

  • DeltaTangoLima@reddrefuge.com · 9 months ago

    My 2021 Samsung QLED 65" consumes a little over 100 W; the manufacturer’s spec says 117 W typical. My 2018 Sony LCD 65" consumes around 115 W; the manufacturer’s spec says 171 W. I’m on 230 VAC.