• Viri4thus@feddit.org · 10 days ago

    Getting ready to “motivate” people to get the 5xxx series because the current cards “have issues now”. The more you buy the more you save!

    • Lemminary@lemmy.world · 9 days ago

      I thought I was happy going AMD until my card started running its fans at full speed for no reason a month after the warranty ran out. I had to manually reseat the card in the PCIe slot for it to stop, because nothing else would, not even restarting the PC. And then one day it heated up so badly it stopped working. I think they gave me a defective card on purpose because people are less likely to return items when they’re buying from outside the US.

      I’ve since gone back to Nvidia and my current card hasn’t given me any issues. What a nightmare that was.

      • OrderedChaos@lemmy.world · 8 days ago

        I swear that in my 20+ years of computer work, everyone has a story like this for every brand out there. It seems to literally be bad luck. That being said, some companies just have abysmal and evil support ethics. And these days it seems all of them are trying to dial in the device failure to happen right after the warranty expires.

          • OrderedChaos@lemmy.world · 8 days ago

            I think that can be true in many situations. I’ve had genuine failures that on the surface sound like incompetence. It’s possible for things to fail so spectacularly that it sounds like fiction.

  • caut_R@lemmy.world · 10 days ago

    That’s certainly something they’re gonna want to fix. I hope DF and GN pick up on this; seems like free views, and I’d love to hear what they’ve got to say on the matter.

    Edit: Also wondering if it’s the app itself or if the performance hit disappears when you disable the overlay. I only skimmed the article to see which games are affected and how badly, so my bad if that’s mentioned.

    Edit 2:

    HUB’s Tim tested it and found that it’s the overlay, or rather the game filter portion of the overlay, causing the performance hit. You can disable this part of the overlay in the app’s settings, or disable the overlay altogether.

    He also found that this feature wasn’t impacting performance on GeForce Experience, so it’s very likely a bug that’s gonna be fixed.

    To clarify: actively using game filters can impact performance in either app, but right now they cause a performance hit even when you’re not actively using them, just by the functionality being enabled; that’s the bug.

    The only outlier where just having the app installed hit performance was the Harry Potter game.
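
    For anyone who wants to verify the hit on their own machine, here’s a rough sketch (mine, not from HUB’s testing) that compares two frametime captures, one with the game filters enabled and one with them disabled. It assumes PresentMon-style CSV logs with a MsBetweenPresents column; the file names are placeholders.

    ```typescript
    // Rough sketch: compare average frametimes from two PresentMon-style CSV captures.
    // Assumes a "MsBetweenPresents" column; the file names below are placeholders.
    import { readFileSync } from "node:fs";

    function averageFrametimeMs(csvPath: string): number {
      const lines = readFileSync(csvPath, "utf8").trim().split("\n");
      const header = lines[0].split(",");
      const col = header.indexOf("MsBetweenPresents");
      if (col === -1) throw new Error(`no MsBetweenPresents column in ${csvPath}`);
      const frametimes = lines.slice(1).map((line) => Number(line.split(",")[col]));
      return frametimes.reduce((sum, ms) => sum + ms, 0) / frametimes.length;
    }

    const filtersOn = averageFrametimeMs("filters_on.csv");   // game filters enabled
    const filtersOff = averageFrametimeMs("filters_off.csv"); // game filters disabled in the app
    console.log(`Filters on:  ${(1000 / filtersOn).toFixed(1)} fps`);
    console.log(`Filters off: ${(1000 / filtersOff).toFixed(1)} fps`);
    ```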

    • eramseth@lemmy.world · 10 days ago

      Yeah they didn’t test that. Nor did they test having the app installed but not running. Crummy article tbh.

      • ArbiterXero@lemmy.world · 10 days ago

        Disagree, and I don’t think that’s the point.

        As an average user, why am I paying a performance hit for Nvidia’s own “recommended parameters”?

        That’s trash and a terrible experience, and they should be called out for it.

        • eramseth@lemmy.world · 10 days ago

          I don’t think you’re understanding. The testing they did was presumably fine, and the performance hit is probably unacceptable. But mentioning, yet not testing, the scenarios of

          • app installed but not running
          • app installed and running but overlay turned off

          is kinda mailing it in.

          • ArbiterXero@lemmy.world · 10 days ago

            I’ll give you that, yep, sure.

            But that doesn’t invalidate the data they did get; it’s just not the full picture.

    • Vik@lemmy.world · 10 days ago

      Yep, it uses CEF, though many popular desktop apps do the same without much of a perf impact.

      • rdri@lemmy.world · 10 days ago

        It’s not CEF itself that causes most of the impact; it’s the content web devs make it load and process. And web devs generally not being very good at optimization is just a sad reality.

        • merthyr1831@lemmy.ml · 8 days ago

          Web devs aren’t ignorant of optimization, but the kinds of interfaces used on the web are very different from those on the desktop. Cross-platform technologies can work, but anything built on top of web engines is going to be a little dogshit on native platforms.

          Web tech was designed around the asynchronous and comparatively slow nature of the network. Now those same layout and rendering engines are being shoehorned into an environment where the “server” is your local disk, so they’re suddenly doing a bunch of work that was intended to be done iteratively.
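
          To make that concrete, here’s a tiny sketch (mine, with hypothetical file names): the web-engine way treats even a local asset as an asynchronous request/response, while a native app just reads the file off disk.

          ```typescript
          // Sketch only: the same "load a local config" step, web-engine style vs. native style.
          import { readFileSync } from "node:fs";

          // Web-engine style: even local content goes through the async fetch/parse
          // machinery that was designed for slow, remote servers.
          async function loadConfigWebStyle(): Promise<unknown> {
            const response = await fetch("./config.json"); // relative to the page the embedded browser loaded
            return await response.json();
          }

          // Native style: the "server" is just the local disk, so a direct read is enough.
          function loadConfigNativeStyle(): unknown {
            return JSON.parse(readFileSync("config.json", "utf8"));
          }
          ```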

          Same goes the other way, of course. Software designed for “native first” experiences, like Flutter, isn’t as popular in web dev because it works on that same assumption, but reversed: the local disk is your source.

          It would be like wondering why physical game disks aren’t popular on PC - it’s a fundamentally different technology for fundamentally different expectations and needs.

          • rdri@lemmy.world · 8 days ago

            > but anything built on top of web engines is going to be a little dogshit on native platforms.

            Hard disagree on “little”.

            > Software designed for “native first” experiences, like Flutter, isn’t as popular in web dev because it works on that same assumption, but reversed: the local disk is your source.

            Popularity should not be dictated by what web devs prefer. As long as they build for desktop, I won’t pardon excessive resource usage. And I’m not talking about Flutter; better performance-oriented frameworks exist, see sciter.

  • Katana314@lemmy.world · 10 days ago

    I used to only use this for game recording, but ever since I upgraded my monitor it’s had a glitch where games record with a red tint. Thankfully, every single gaming helper app seems to feature recording now, so I just switched to another.

  • merthyr1831@lemmy.ml · 8 days ago

    I know people complain about Nvidia and Linux, but one of the best parts of my experience with it was never having to deal with GFE. Just a bunch of project managers trying to make themselves useful by shovelling needless slop into your GPU driver.