Hello. I know this isn’t completely related to Linux, but I was still curious about it.

I’ve been looking at Linux laptops, and one that caught my eye from Tuxedo had 13 hours of battery life at idle, or 9 hours of web browsing. The thing is, that device had a 3k display.

My question is, as someone used to 1080p who always tries to squeeze the maximum battery life out of a laptop, would running the display at a lower resolution help? And if so, is it even worth it, or are the benefits too small to notice?

  • bloodfart@lemmy.ml · edited · 2 months ago

    Short answer: no.

    Long answer: also no, but in some specific circumstances yes.

    Your display uses energy to do two things: change the colors you see, and make them brighter or dimmer. Honestly speaking, it also has a little processor in it, but that sucker is so tiny and energy efficient that it’s not affecting things much, and you can’t affect it anyway.

    There are two ways to do the things your display does. One way is to have a layer of tiny shutters in front of a light source that open up when energized and let light through their red, green or blue tinted windows. In this case you can use two techniques to reduce energy consumption: open fewer shutters, or reduce the intensity of the light source. Opening fewer shutters sounds like it would be part of lowering the resolution, but when you lower the resolution you just get more shutters open for each logical “pixel” in the framebuffer (more on that later).
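
    If it helps, here’s a toy model of that (made-up numbers, purely to show the shape of it): the backlight term depends on brightness, and the panel’s resolution never shows up in the formula at all.

        # Toy LCD power model (illustrative numbers, not measurements).
        # Note that resolution never appears anywhere in here.
        def lcd_power_watts(brightness, backlight_max_w=4.0, logic_w=0.5):
            """brightness is 0.0-1.0; returns estimated panel draw in watts."""
            return backlight_max_w * brightness + logic_w

        # Same panel, same brightness -> same power, whether it's being
        # fed a 1080p signal or its native 3k signal.
        print(lcd_power_watts(0.8))  # 3.7 (W) at 80% brightness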

    Another way to do what your display does is to have a variable light source behind each tinted window and send more or less luminance through each one. In this case there is really only one technique you can use to reduce the energy consumption of the display, and that’s turning down the brightness. Lowering the resolution changes nothing here either, for the same reason as before. It’s worth noting that a “darker” displayed image will consume less energy in this case, so if you have an oled display, consider using a dark theme!
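
    A quick sketch of why dark themes help there, assuming per-pixel draw roughly tracks luminance (a simplification, real panels vary per subpixel and color, but the trend holds):

        # Toy oled model: relative power ~ average pixel luminance (0.0-1.0).
        def oled_relative_power(frame):
            pixels = [lum for row in frame for lum in row]
            return sum(pixels) / len(pixels)

        light_theme = [[0.9, 0.9], [0.9, 0.1]]  # mostly white page
        dark_theme  = [[0.1, 0.1], [0.1, 0.9]]  # mostly black page

        print(oled_relative_power(light_theme))  # 0.7 -> hungrier
        print(oled_relative_power(dark_theme))   # 0.3 -> thriftier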

    So the display itself shouldn’t save energy with a lowered resolution.

    Your gpu has a framebuffer, which is some memory that corresponds to the display frame. If the display is running at a lower resolution, the framebuffer will be smaller, and if it’s running at a higher resolution it’ll be bigger. Memory is pretty energy efficient nowadays, so the effect of a larger framebuffer on energy consumption is negligible.
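
    For a sense of scale (assuming 4 bytes per pixel, and assuming 2880×1620 for “3k”, which varies by panel):

        # Framebuffer size at 4 bytes per pixel (e.g. RGBA8888).
        def framebuffer_bytes(width, height, bytes_per_pixel=4):
            return width * height * bytes_per_pixel

        mib = 1024 * 1024
        print(framebuffer_bytes(1920, 1080) / mib)  # ~7.9 MiB at 1080p
        print(framebuffer_bytes(2880, 1620) / mib)  # ~17.8 MiB at "3k"

    Ten extra mebibytes of mostly-idle memory is nothing next to a backlight.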

    Depending on your refresh rate, the framebuffer gets updated some number of times a second. But the gpu doesn’t completely wipe, rewrite and resend the framebuffer every time, it just changes the stuff that needs it. So when you move your mouse at superhuman speeds exactly one cursor width to the left in one sixtieth of a second, only two cursor-sized areas of the framebuffer get updated: the place the cursor was gets redrawn with whatever was underneath, and the place the cursor is gets a cursor drawn on it.
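
    That “only change what needs it” idea is usually called damage tracking. A toy illustration of it (nothing like how any real compositor is implemented):

        # Toy damage tracking: only rectangles marked dirty get rewritten.
        def apply_damage(framebuffer, damage_rects, draw_pixel):
            for (x, y, w, h) in damage_rects:
                for row in range(y, y + h):
                    for col in range(x, x + w):
                        framebuffer[row][col] = draw_pixel(row, col)

        fb = [[0] * 1920 for _ in range(1080)]
        old_spot = (100, 100, 16, 16)  # redraw what was under the cursor
        new_spot = (116, 100, 16, 16)  # draw the cursor in its new place
        apply_damage(fb, [old_spot, new_spot], lambda r, c: 1)
        # 2 * 16 * 16 = 512 pixels touched, out of about 2 million.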

    Okay but what if I’m doing something that changes the whole screen at my refresh rate? In that case the whole framebuffer gets updated!

    But that doesn’t often happen…

    Let’s say you’re watching a movie. It’s 60fps source material, so wouldn’t the framebuffer be updating 60 times a second? No! Not only is the video itself encoded so that parts whose colors don’t change from frame to frame don’t need to be touched, the thing decoding it is actively looking for even more ways to avoid doing the work of changing parts of the framebuffer.
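
    A toy version of that idea: compare each block to the previous frame and skip the ones that didn’t change. (Real codecs mark skipped blocks in the bitstream, so the decoder doesn’t even have to compare.)

        # Toy inter-frame skipping: blocks identical to the previous frame
        # never touch the framebuffer.
        def update_changed_blocks(prev_frame, next_frame, block=8):
            writes = 0
            for i in range(0, len(next_frame), block):
                if prev_frame[i:i + block] != next_frame[i:i + block]:
                    prev_frame[i:i + block] = next_frame[i:i + block]
                    writes += 1
            return writes  # number of blocks actually rewritten

        a = [0] * 64
        b = [0] * 64
        b[5] = 9  # one pixel changed
        print(update_changed_blocks(a, b))  # 1 -> only one block rewritten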

    So the effect of a larger framebuffer on battery is minimized while playing movies, even when the framebuffer is huge!

    But actually decoding a 3k movie is much more cpu intensive than a 1080p one. So maybe watch in 1080p, but that’s not about your display or its resolution, it’s about the resolution of the source material.
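
    Back-of-envelope on why (assuming 2880×1620 for “3k” again): the per-pixel work scales with pixel count, and the same ratio applies later when a game has to render a frame.

        # Pixel-count ratio: a rough proxy for decode (and render) work.
        ratio = (2880 * 1620) / (1920 * 1080)
        print(ratio)  # 2.25 -> more than twice the pixels per frame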

    Okay, but what about games? Games use the framebuffer too, but because they aren’t pre-encoded, they can’t take advantage of someone having already done the work of figuring out what parts are gonna change and what parts aren’t. So you pop into e1m1 and the only way the computer can avoid updating the whole framebuffer is when the stuff chocolate doom sends it doesn’t change the whole framebuffer, like those imps marching in place.

    But chocolate doom still renders the whole scene, using computer resources to calculate and draw the frame and send it to the framebuffer, which looks up and says “you did all this work just to show me imp arms swinging over a one inch square portion of screen area?”

    But once again, chocolate doom takes more computer resources to render a 3k e1m1 than a 1080p one (that same 2.25× pixel ratio from above), so maybe turn down your game resolution to save that energy.

    Hold on, what about that little processor on the display? Well, it can do lots of stuff, but most of the time it’s doing scaling calculations, so that when you run chocolate doom full screen at 1080p the image is scaled as accurately and as nicely as possible across the whole screen instead of stuck at the top left or in the middle or something. So in that case you could actually make that little sucker do less work and use less energy by running the display at its “native” resolution than if you were at 1080p.
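
    Conceptually, that scaler is doing something like this for every frame (nearest-neighbour here for simplicity; real scaler chips filter much more nicely):

        # Nearest-neighbour upscale: what the panel's scaler does in spirit
        # when it's fed a 1080p signal on a 3k panel.
        def upscale(src, src_w, src_h, dst_w, dst_h):
            return [[src[y * src_h // dst_h][x * src_w // dst_w]
                     for x in range(dst_w)] for y in range(dst_h)]

        small = [[1, 2], [3, 4]]           # a tiny 2x2 "frame"
        print(upscale(small, 2, 2, 4, 4))  # stretched to fill 4x4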

    So when jigsaw traps you in his airport terminal shaped funhouse and you wake up with the exploder on your neck and a note in front of you that says “kill carmack” and no charger in your bag, yes, you will save energy running at a lower resolution.

    E: running chocolate doom at a lower resolution, not the display.

    • averyminya@beehaw.org · 2 months ago

      Color change, eh? Sounds like B+W makes displays more energy efficient, that should be significant!

      (\s)