• AMillionNames@sh.itjust.works · 7 months ago

    The 4090 isn’t that much better than the 4080, and the 4080 is much more energy efficient. I guess it all depends on what you consider “blowing previous gens out of the water”, but that’s mostly marketing nonsense.

    If you have a 3090 Ti you aren’t going to be missing out on anything in current-gen games, and current-gen developers aren’t in a hurry to develop exclusively for hardware that most of their potential customers won’t be running. So basically, it’s just “blowing previous gens out of the water” on flair at the moment.

    • Sentau@feddit.de · 7 months ago

      The 4090 isn’t that much better than the 4080, and the 4080 is much more energy efficient

      The 4090 often beats the 4080 by a margin of 25% or more, and its 1% low figures are often better than the average frame rates the 4080 delivers. If that is not enough for it to be considered much better, I don’t know what is.

      Regarding your comments on efficiency: the high end is generally less efficient than other GPUs because those cards are pushed beyond the ideal point of the voltage-frequency curve to extract the most performance. Even so, in GPU-bound scenarios the 4090 does well: it uses 25-35% more power to deliver 20-30% higher fps, which puts it on average around 10-15% behind in efficiency, and that is not enough in my opinion to claim that the 4080 is much more efficient.
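      A quick sanity check on that math. Only the percentage deltas (25-35% more power for 20-30% more fps) come from the comment above; the helper function is just for illustration:

```python
def relative_efficiency(fps_gain: float, power_gain: float) -> float:
    """Perf-per-watt of the faster card relative to the slower one,
    with both gains given as fractions (0.25 == +25%)."""
    return (1 + fps_gain) / (1 + power_gain)

# Worst case for the 4090: +20% fps for +35% power
worst = relative_efficiency(0.20, 0.35)  # ~0.89, i.e. ~11% less efficient
# Best case for the 4090: +30% fps for +25% power
best = relative_efficiency(0.30, 0.25)   # ~1.04, i.e. slightly more efficient

print(f"4090 vs 4080 perf-per-watt ratio: {worst:.2f} to {best:.2f}")
```

      So even at the unfavourable end of those ranges the efficiency gap stays close to 10%, which is the point being made.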

      Article based on Hardware Unboxed’s review: https://www.techspot.com/review/2569-nvidia-geforce-rtx-4080/

      • AMillionNames@sh.itjust.works · 7 months ago

        I guess I just don’t see 25% as that much better when I’m pleased with my 4K setup and am already getting 120+ fps in most games, and the card I’m using isn’t even current gen.

        That 25-35% has been enough to cause a significant number of 4090 failures from loose contacts due to micro-debris in the power connectors. You aren’t just paying around 50% more for that 25%: you are paying for the added risk, the higher power consumption, and the higher-capacity PSU to match, and as the PC ages it will wear out sooner as well. All for what is really an RTX selling point that barely any game developer uses, in a generation that all major coverage has criticized as particularly expensive.
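        To put a number on the cost side of that argument, using only the figures from this comment (the ~50% price premium and the ~25% performance gap; neither is an actual MSRP):

```python
price_premium = 0.50  # "around 50% more", per the comment (not an actual MSRP)
perf_premium = 0.25   # the ~25% performance gap under discussion

# Cost per unit of performance, 4090 relative to 4080
cost_per_perf = (1 + price_premium) / (1 + perf_premium)
print(f"~{cost_per_perf:.2f}x the cost per frame delivered")
```

        Under those assumptions you pay roughly 20% more per frame, before counting the PSU upgrade or the failure risk.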

        I frankly still believe that if there was ever a generation to skip at launch, it’s this one. Intel is slowly but surely joining the GPU market, if only at the low end; AMD cards are competing with NVIDIA where it matters; and for all the threats NVIDIA has made about dropping out because of its AI nest egg, it knows it will need to keep the reins on the 5000 series if it doesn’t want to get sidelined for good, and it will do so with the experience of how overpricing this generation may have hurt its sales significantly.

    • PrincessEli@reddthat.com · 7 months ago

      So basically, it’s just “blowing previous gens out of the water” on flair at the moment.

      Barring major changes in how things are done, like the shift from rasterization to ray tracing, the top end of the GPU market has always been about flair as far as gaming goes, no?