I have this question. I see people, with some frequency, sugar-coating the marriage between Nvidia GPUs and Linux. I get that if you already have an Nvidia GPU, or you need CUDA, or you work with AI and want to use Linux, that is possible. Nevertheless, this is still a very questionable relationship.

Shouldn’t we be raising awareness for anyone planning to play titles that use DX12? I mean, a 15% to 30% performance loss on Nvidia compared to Windows, versus 5% to 15% (and sometimes equal or better performance) on AMD, isn’t that something worth alerting others to?

I know we wanna get more people on Linux, and NVIDIA’s getting better, but don’t we need some real talk about this? Or is there some secret plan to scare people away from Linux that I missed?

Am I misinformed? Is there some strong reason to buy an Nvidia GPU if your focus is gaming on Linux?

Edit: I’m adding some links about the issue in question, because I see some comments claiming that Nvidia works flawlessly:

https://forums.developer.nvidia.com/t/directx12-performance-is-terrible-on-linux/303207

https://www.reddit.com/r/linux_gaming/comments/1nr4tva/does_the_nvidia_dx12_bug_20ish_performance_loss/

Please let me know if this is already fixed on Nvidia GPUs for gaming on Linux.

  • daggermoon@lemmy.world · 2 days ago

    If you want to use Linux, please choose AMD. I helped install CachyOS on my sister’s RTX 5080 system and it’s horrible. 40% performance loss. She’s going back to Windows.

    • Xirup@lemmy.dbzer0.com · 23 hours ago

      I guess this is because the 5080 drivers are still fresh? I used to play on a 1650S, and although the Wayland environment was the worst shit I had ever experienced, gaming in general on X11 was normal.

      • daggermoon@lemmy.world · 21 hours ago

        Gaming works fine now. The main issue is Plasma related. More accurately, Plasma + NVIDIA related.

    • daggermoon@lemmy.world · 2 days ago

      Never mind, she’s sticking with Linux. Tinkering with it actually fixed most of the major issues.

  • LeFantome@programming.dev · 3 days ago

    Two pretty massive facts for anybody trying to answer this question:

    1. Since driver version 555, explicit sync has been supported. This makes a massive difference to the experience on Wayland. Most of the problems people report are for drivers earlier than this (e.g. black screens and flicker).

    2. Since driver version 580, NVIDIA uses Open Source modules to interact with the kernel. These are not Open Source drivers. They are the proprietary drivers from NVIDIA that should now “just work” across kernel upgrades (like AMD has forever). This solves perhaps the biggest hassle of dealing with NVIDIA on Linux.

    Whether you get to enjoy these significant improvements depends on how long it takes stuff to make it to your distribution. If you are on Arch, you have this stuff today. If you are on Debian, you are still waiting (even on Debian 13).

    This is not an endorsement of either distro. They are simply examples of the two extremes regarding how current the software versions are in those distros. Most other distros fall somewhere in the middle.

    All this stuff will make it to all Linux users eventually. They are solved problems. Just not solved for everyone.
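
    To see which situation your own system is in, a rough Python sketch like the one below (not from the comment above) can report the driver version and whether the open kernel modules are loaded. It assumes nvidia-smi is on the PATH and that the driver exposes /proc/driver/nvidia/version; the exact wording of that file can vary between driver releases.

    ```python
    # Rough sketch: report the NVIDIA driver version and whether the open kernel
    # modules are in use. Assumes nvidia-smi is on the PATH and that
    # /proc/driver/nvidia/version exists; its wording can differ between releases.
    import subprocess
    from pathlib import Path

    def driver_version() -> str:
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=driver_version", "--format=csv,noheader"],
            capture_output=True, text=True, check=True,
        )
        return out.stdout.strip().splitlines()[0]

    def kernel_module_flavor() -> str:
        # The open kernel modules describe themselves as an "Open Kernel Module" here,
        # while the classic proprietary build just says "Kernel Module".
        text = Path("/proc/driver/nvidia/version").read_text()
        return "open" if "Open Kernel Module" in text else "proprietary"

    version = driver_version()
    major = int(version.split(".")[0])
    print(f"Driver {version} ({kernel_module_flavor()} kernel modules)")
    print("Explicit sync era (>= 555):", major >= 555)
    print("Open-modules-by-default era (>= 580):", major >= 580)
    ```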

  • Ardens@lemmy.ml · 4 days ago

    I use AMD wherever possible, simply because they support Linux. There’s really no other reason needed. I don’t care about CUDA or anything else like that; it’s barely relevant to me. I’d rather drive a medium car that gives me freedom than a high-end car that ties me down.

  • melfie@lemy.lol · 3 days ago

    NVIDIA definitely dominates for specialized workloads. Look at these Blender rendering benchmarks and notice that AMD doesn’t appear until page 3. I wish there were an alternative to NVIDIA OptiX that was as fast for path tracing, but unfortunately there is not. Buy an AMD card if you’re just gaming, but you’re unfortunately stuck with NVIDIA if you want to do path-traced rendering cost-effectively:

    https://opendata.blender.org/benchmarks/query/?compute_type=OPTIX&compute_type=CUDA&compute_type=HIP&compute_type=METAL&compute_type=ONEAPI&group_by=device_name&blender_version=4.5.0

    Edit:

    Here’s hoping AMD makes it to the first page with next generation hardware like Radiance Cores:

    https://wccftech.com/amd-unveils-radiance-cores-neural-arrays-universal-compression-next-gen-rdna-gpu-architecture/
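
    As a footnote to the OptiX/HIP comparison above: if you want to reproduce these numbers locally, Cycles lets you switch backends from Blender’s own Python console. A minimal sketch, assuming Blender 4.x with the Cycles add-on enabled (“OPTIX” needs an NVIDIA card, “HIP” an AMD one):

    ```python
    # Minimal sketch: switch the Cycles compute backend from Blender's Python console.
    # "OPTIX" needs an NVIDIA GPU, "HIP" an AMD GPU; "CUDA" is the older NVIDIA path.
    import bpy

    prefs = bpy.context.preferences.addons["cycles"].preferences
    prefs.compute_device_type = "OPTIX"      # swap to "HIP" on an AMD card
    prefs.get_devices()                      # refresh the detected device list
    for device in prefs.devices:
        device.use = (device.type != "CPU")  # render on the GPUs only

    bpy.context.scene.cycles.device = "GPU"
    print("Cycles backend:", prefs.compute_device_type)
    ```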

  • mybuttnolie@sopuli.xyz · 4 days ago

    Yes, HDMI 2.1. If you use a TV as a monitor, you won’t get 4K120 with AMD cards on Linux, because the HDMI Forum are assholes.

    • wonderfulvoltaire@lemmy.world · 4 days ago

      I have a 6900 XT and it has output for 4K120, and I never had issues with it on multiple distros. Lately Bazzite has been behaving as expected, so I don’t know where this information is coming from, besides the argument that HDMI is closed source as opposed to DisplayPort.

      • mybuttnolie@sopuli.xyz · 4 days ago

        HDMI 2.0 doesn’t have the bandwidth for 4K120; DisplayPort and HDMI 2.1 do. AMD’s drivers don’t have an HDMI 2.1 driver, because the HDMI Forum didn’t allow AMD to implement it in their open source Linux driver. You still get 4K120 with DP, and even on HDMI if you use a limited colorspace.
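
        Some rough numbers behind that claim (my own back-of-the-envelope math, ignoring blanking intervals and link encoding overhead, so real requirements are a bit higher):

        ```python
        # Back-of-the-envelope pixel data rates for 4K120 (ignores blanking and link overhead).
        def gbit_per_s(width, height, fps, bits_per_pixel):
            return width * height * fps * bits_per_pixel / 1e9

        full_rgb = gbit_per_s(3840, 2160, 120, 24)  # 8-bit RGB 4:4:4 -> ~23.9 Gbit/s
        limited  = gbit_per_s(3840, 2160, 120, 12)  # 8-bit 4:2:0     -> ~11.9 Gbit/s

        # Approximate usable link rates: HDMI 2.0 ~14.4 Gbit/s, DisplayPort 1.4 ~25.9 Gbit/s,
        # HDMI 2.1 ~42 Gbit/s.
        print(f"4K120 full RGB: {full_rgb:.1f} Gbit/s (does not fit in HDMI 2.0)")
        print(f"4K120 4:2:0:    {limited:.1f} Gbit/s (fits in HDMI 2.0)")
        ```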

  • megopie@lemmy.blahaj.zone · 4 days ago

    I’d say in general, the advantages of Nvidia cards are fairly niche even on Windows. Like, multi-frame generation (fake frames) and upscaling are kind of questionable in terms of value added most of the time, and most people probably aren’t going to be doing any ML stuff on their computer.

    AMD in general offers better performance for the money, and that’s doubly so with Nvidia’s lackluster Linux support. AMD has put the work in to get their hardware running well on Linux, both in terms of work from their own team and in collaborating with the open source community.

    I can see why some people would choose Nvidia cards, but I think, even on Windows, a lot of people who buy them would probably have been better off with AMD. And outside of some fringe edge cases, there is no good reason to choose them when building or buying a computer you intend to mainly run Linux on.

    • filister@lemmy.world · 4 days ago

      Even though I hate Nvidia, they have a couple of advantages:

      • CUDA
      • Productivity
      • Their cards retain higher resale values

      So if you need the card for productivity and not only gaming, Nvidia is probably better; if you buy second-hand or strictly for gaming, AMD is better.

      • megopie@lemmy.blahaj.zone · 4 days ago

        It depends on the type of productivity TBH. Like, sure some productivity use cases need CUDA, but a lot of productivity use cases are just using the cards as graphics cards. The places where you need CUDA are real, but not ubiquitous.

        And “this is my personal computer I play games on, but also the computer I do work on, and that work needs CUDA specifically” is very much an edge case.

        • filister@lemmy.world · 4 days ago

          As far as I am aware, they are also better at video encoding, and for Blender or similar software. Yes, it is niche, but it’s a credible consideration. As always, it really depends on the use case.

          • reliv3@lemmy.world · 4 days ago

            Blender can be CUDA-accelerated, which does give Nvidia an edge over AMD. In terms of video encoding, both Nvidia and AMD cards are AV1 capable, so they are on par; unless a program does not support AV1, in which case the proprietary Nvidia video encoders are better.
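
            If you want to check what your own card and ffmpeg build actually expose, a quick sketch like this lists the hardware encoders; encoder names such as av1_nvenc, av1_vaapi, or av1_amf depend on how ffmpeg was compiled, so treat them as examples rather than guarantees:

            ```python
            # Quick sketch: list hardware video encoders exposed by the local ffmpeg build.
            # Names like av1_nvenc, av1_vaapi or av1_amf depend on how ffmpeg was compiled.
            import subprocess

            encoders = subprocess.run(
                ["ffmpeg", "-hide_banner", "-encoders"],
                capture_output=True, text=True, check=True,
            ).stdout

            for line in encoders.splitlines():
                if any(tag in line for tag in ("nvenc", "vaapi", "amf", "qsv")):
                    print(line.strip())
            ```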

    • Mihies@programming.dev · 4 days ago

      From a pure hardware perspective, Nvidia cards are more energy efficient.

      Edit: I stand corrected, the 9070 series is much more energy efficient.

      • iopq@lemmy.world · 4 days ago

        That’s not quite true. AMD cards just come clocked higher from the factory. So when a 9070 XT beats a 5070 by an average of 17%, you can easily cap the power limit to match the performance. That’s with more VRAM, which of course increases the power requirements.
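
        For what it’s worth, on the amdgpu driver the power cap is exposed through the kernel’s hwmon interface, so capping it doesn’t need vendor tools. A hedged sketch, assuming a single AMD card at card0 and root privileges for writing; the exact hwmon path varies per system:

        ```python
        # Hedged sketch: read (and optionally lower) an amdgpu power cap via sysfs hwmon.
        # Assumes an AMD GPU at card0; writing requires root, and paths vary per system.
        from pathlib import Path

        def find_power_cap(card: str = "card0") -> Path:
            for hwmon in Path(f"/sys/class/drm/{card}/device/hwmon").iterdir():
                cap = hwmon / "power1_cap"
                if cap.exists():
                    return cap
            raise FileNotFoundError("no power1_cap found; is this an amdgpu card?")

        cap_file = find_power_cap()
        print(f"Current power cap: {int(cap_file.read_text()) / 1_000_000:.0f} W")

        # To lower the cap (value is in microwatts; stay within power1_cap_min/_max):
        # cap_file.write_text(str(250 * 1_000_000))
        ```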

        The prices don’t quite match up, though, since it sits between the 5070 and the Ti (although in the US it’s often more expensive for some reason).

        The problem is that AMD is selling the chips to OEMs at a price that’s too high for them to sell at MSRP, while giving a discount on small batches of MSRP models. It becomes a lottery where the quickest people can get $600 models by refreshing ever rarer restocks.

        One of the reasons is… tariffs, but I’m not sure how Nvidia got the prices down on its models.

  • just_another_person@lemmy.world · 4 days ago

    AMD will have superior support and better power management out of the box hands down.

    Nvidia may have a minor performance advantage in some areas depending on the card, but not in a way you would care about unless you’re obsessed with the technical specifics of the graphics in AAA games.

    I’ve been on Linux as a dev and daily driver for 20 years, and Nvidia drivers are just problematic unless you know exactly how to fix them when there are issues. That’s an Nvidia problem, not a Linux problem. CUDA on AMD is also a thing if you want to go that route.

    The choice is yours.

    • vinnymac@lemmy.world · 4 days ago

      I’m glad you mentioned knowing how to fix them. My server has hosted Nvidia GPUs for 15 odd years now, working great, and has remained stable through updates by some miracle.

      Getting it set up was a nightmare back then though, do not recommend for the faint of heart.

  • LeFantome@programming.dev · 4 days ago

    I think the answer is yes if you are shooting for the high end. AMD is better on cost/performance, but NVIDIA is still unchallenged for absolute performance if budget is not a consideration.

    And if you need CUDA…

    • typhoon@lemmy.world (OP) · 3 days ago

      I agree with that, because there is no offering from AMD that competes with Nvidia’s absolute high-end GPUs.

  • Lukemaster69@lemmy.ca · 4 days ago

    It is better to go with AMD because the AMD drivers are built into the ISO, so it’s less headache for gaming.

  • Admetus@sopuli.xyz · 4 days ago

    I only play older games, open source games (like Pioneer Space Sim and Luanti), and mostly emulate PS2 (could do PS3/PS4, you bet), so AMD is fine for my use case and works out of the box. I know Nvidia’s Linux support has improved, which means the latest graphics cards also pretty much work out of the box too. But on principle, I support AMD for the work they put into Linux.

  • reagansrottencorpse@lemmy.ml · 3 days ago

    I’m putting together my new Nvidia PC build tonight. I was planning on putting Bazzite on it; should I just use Windows then?

    • prole@lemmy.blahaj.zone · 3 days ago

      No, you should at least try Bazzite first. I’ve seen people recently talking about how they have no issues with Nvidia and Linux.

    • typhoon@lemmy.world (OP) · 3 days ago

      I’d go with Linux no matter what, but this seems exactly why I feel we should be more clear. People may be building PCs out there to use Linux for gaming and buying Nvidia because others keep saying that everything is smooth sailing with Nvidia. A lot of it is working now, but there are some downsides, and the recommendation is to go with AMD if you can.

    • Turtle@aussie.zone · 3 days ago

      Nvidia cards work just fine on Linux; old issues are parroted around by people who don’t know any better.

      • cevn@lemmy.world · 2 days ago

        My Nvidia 1080 just failed driver upgrades on the most recent edition of Fedora. Can’t parrot myself…

      • Z3k3@lemmy.world · 3 days ago

        I had to roll back my kernel when Debian pushed out the latest one, due to drivers. I don’t think they’ve updated them yet, as I haven’t noticed them in updates.

        • LeFantome@programming.dev · 3 days ago

          For anybody trying to make sense of who has trouble on NVIDIA, keep in mind that Debian uses ancient drivers.

          Thankfully Debian Stable updated recently, so it has gotten a lot better, but the last I checked, the Debian drivers still did not support explicit sync. This could lead to problems on Wayland.

          Remember that “stable” in Debian means that your system will not change much. That is often a good thing, but it can mean progress comes to Debian much later than to other distros.

          Never use Debian as a benchmark for what works on Linux.

          • Z3k3@lemmy.world · 2 days ago

            Heh, you described my problem exactly. Wayland did get busted in my attempt to get it working again. I was hoping that, given the kernel update broke Nvidia, they would push a fix.

            Tbh, I have been experimenting for a while to see what I need to keep Windows for, and I’m happy enough to make Linux my big partition now. Given the Nvidia driver issue on the new kernel, I’m tempted to look elsewhere.