• Night Monkey
    61 • 7 months ago

    I’m so sick of Nvidia’s bullshit. My next system will be AMD just out of spite. That goes for processors as well.

    • @kureta@lemmy.ml
      16 • 7 months ago

      The only thing keeping me on Nvidia is CUDA, and there’s no replacement for it. I know AMD has I-forgot-what-it’s-called, but it’s not a realistic option for many machine learning tasks.

    • Dojan
      14 • 7 months ago

      I went with an AM5 and an Intel Arc GPU. Quite satisfied, the GPU is doing great and didn’t cost an arm and a leg.

      • Nanomerce
        5 • 7 months ago

        How is the stability in modern games? I know the drivers are way better now, but more samples are always great.

        • Dojan
          6 • 7 months ago

          Like, new releases? I don’t really play many new games.

          Had Baldur’s Gate III crash once, and that’s the newest title I’ve played.

          Other than that I play Final Fantasy XIV, Guild Wars 2, The Sims and Elden Ring, never had any issues.

    • @Vinny_93@lemmy.world
      6 • 7 months ago

      Considering the price of a 4070 vs the 7800XT, the 4070 makes a lot more sense where I live.

      But yes, the way AMD keeps their software open to use (FSR, FreeSync) and puts DisplayPort 2.1 on their cards creates a lot of goodwill with me.

    • @Cagi@lemmy.ca
      3 • edited • 7 months ago

      The only thing giving me pause about ATI cards is that their ray tracing is allegedly visibly worse. They say next gen will be much better, but we shall see. I love my current non-ray-tracing card, an RX 590, but she’s getting a bit long in the tooth for some games.

        • @Cagi@lemmy.ca
          21 • 7 months ago

          Not since, oh before most of Lemmy was born. I’m old enough to remember when Nvidia were the anti-monopoly good guys fighting the evil Voodoo stranglehold on the industry. You either die a hero or you live long enough to see yourself become the villain.

          • @PenguinTD@lemmy.ca
            4 • 7 months ago

            Yeah, that’s pretty much why I stopped buying Nvidia after the GTX 1080. CUDA was bad in terms of their practices, but not that impactful, since OpenCL etc. could still be tuned to work with similar performance; software developers and researchers just love free support/R&D/money to advance their goals. They’re willing to be the minions, and I can’t ask them not to take the free money.

            But RTX and then the tensor cores are where I draw the line, since their patents and implementations do actual harm in the computer graphics and AI research space. I guess it was a bit too late, though. We’re already seeing the results, and Nvidia is making bank with that advantage. They’re essentially applying the Intel playbook, just slightly differently: instead of buying the OEM vendors, they “invest in” software developers and researchers to use their closed tech. Now everyone pays the premium if you buy RTX/AI chips from Nvidia, and the capital boom from AI will make the gap hard for AMD to close. After all, R&D requires lots of money.

        • be_excellent_to_each_other
          6 • 7 months ago

          I have to admit I still tend to call them that, too. Old-timers, I guess.

          The first GPU I remember being excited to pop into my computer and run was a Matrox G400 Max. Damn I’m old.

          • @Cagi@lemmy.ca
            7 • 7 months ago

            I would have been so jealous. Being able to click “3D acceleration” felt so good when I finally upgraded. But I was 12, so my dad was in charge of PC parts. Luckily he was kind of techy, so we got there. Being able to run Jedi Knight: Dark Forces II at max settings is a day I’ll never forget for some reason, lol.