• brisk@aussie.zone · +133/-1 · 10 months ago

    “You may not reverse engineer, decompile or disassemble any portion of the output generated using SDK elements for the purpose of translating such output artifacts to target a non-NVIDIA platform.”

    This is literally a protected right in multiple countries, so um…

    🖕😎🖕

    • TechNom (nobody)@programming.dev · +44 · 10 months ago

      Interesting that they started dictating what you can and can’t do with YOUR program! Consumer rights are a joke to these quasi-monopolies.

      • nilloc@discuss.tchncs.de · +14 · 10 months ago

        If the EU can fine Apple (formerly the most valuable company in the world), they’d better fine Nvidia (recently the most valuable company in the world). Let’s see if they’re really valuable, or just propped up by AI hype after the crypto crash.

  • ninjan@lemmy.mildgrim.com · +80/-2 · 10 months ago

    Jesus, they really are one of the most egregiously lock-in-focused and monopolistic companies around. It saddens me deeply that consumers (gamers) just don’t give a flying fuck about this and continue to pay a premium for Nvidia cards. 90% market dominance in gaming, and probably at least that in GPGPU workloads.

    All the while, AMD tries to sell their cards by supporting and creating open standards like FreeSync, FSR and Vulkan. But because they don’t have CUDA (since it’s proprietary), they virtually can’t be bought by prosumers who want to do some GPGPU stuff as a hobby, and gamers buy Nvidia for brand recognition, for ray tracing, which Nvidia is stronger in (but which I’d argue isn’t really all that, outside a few notable exceptions like Alan Wake 2), and for DLSS being ahead of FSR. But look at non-RT $/FPS and AMD wins easily at all price points, and they don’t shaft the people who bought their cards by withholding the new version of DLSS the way Nvidia does. It’s just sad.

    Vote with your wallet, they scream, while everyone votes for the alternative that openly wants to squeeze every penny out of them because it’s slightly better…

    • popcar2@programming.dev · +14/-1 · 10 months ago

      It saddens me deeply that consumers (gamers) just don’t give a flying fuck about this and continue to pay a premium for Nvidia cards.

      It doesn’t help that AMD isn’t competing that much price-wise. Their only saving grace is higher VRAM, and while that is nice, raw performance is becoming less relevant. FSR also does not compete with DLSS; it’s strictly worse in every way. AMD also barely exists in the laptop market: I was just considering buying a new gaming laptop, and my options are an RTX 4060 or paying more for the one laptop with a weaker AMD GPU.

      I would argue Intel is shaping up to be the real competitor to Nvidia. They had a rough start, but their GPUs are very price-competitive. Their newer integrated GPUs are also currently the best, they’re good for gen AI, their ray-tracing performance trumps AMD’s, and XeSS is a lot better than FSR. If I were in the market for a new GPU, I’d probably grab the Intel A770. I’m looking forward to their next generation of cards.

      • ninjan@lemmy.mildgrim.com · +12/-1 · 10 months ago

        Since XeSS can run on AMD cards, I feel that point is a bit moot. Further, the best Intel can offer (in discrete GPUs) is miles and miles behind even AMD. As for price/performance, the 6600 XT is neck and neck with the Arc A770 at basically the same price, depending on the card and the day. Where I’m at, the 6600 XT is generally the cheaper one. And that’s not even talking about the 7600 XT, which demolishes the Arc A770 at the same price point…

        Nothing, even rumor-wise, indicates Intel will bring anything to the table to challenge the 4070 or up.

        To sum it up, in my opinion it’s really only the Arc A380 that has impressed me: a very cheap card with excellent server performance for stuff like Jellyfin. But for gaming? No, AMD is by far the better option from a value perspective.

        As for laptops, it’s not that AMD doesn’t make the chips; it’s that the laptop makers know consumers want the Nvidia part.

    • ShadowRam@kbin.social · +4/-4 · 10 months ago

      Vote with your wallet

      I did.

      I bought my 3080 back in 2020, because I knew AI was the future of graphics, based on all the R&D and white papers nVidia was pumping out up to that point.

      No regrets.

      Not my problem nVidia was the only one to invest in the tech, while AMD relied solely on TSMC to shrink their dies.

      It has nothing to do with brand loyalty or fanboys or any of that shit.

      It’s just straight up better tech.

      • ninjan@lemmy.mildgrim.com · +3 · 10 months ago

        If you train AI models, then you probably rely on CUDA, and you’re really left without any meaningful choice. It also wouldn’t matter if AMD had jumped 100% on AI even 5 years ago, because CUDA has been so thoroughly adopted by the industry that AMD would need to do something completely novel and extremely impressive to make any meaningful dent in just 5 years’ time.

        As such, I don’t really blame you, as I said in my post above. I blame the gamers: the people who don’t use CUDA and just play video games, the people complaining about how expensive GPUs have become while still fucking buying Nvidia cards. The fact that AMD can deliver a product that costs less at the same performance point (without RT) is pretty impressive given their minuscule volumes compared to Nvidia.

  • tryptaminev 🇵🇸 🇺🇦 🇪🇺@feddit.de · +28/-1 · 10 months ago

    As AMD, Intel, Tenstorrent, and other companies develop better hardware, more software developers will be inclined to design for these platforms, and Nvidia’s CUDA dominance could ease over time. Furthermore, programs specifically developed and compiled for particular processors will inevitably work better than software run via translation layers, which means better competitive positioning for AMD, Intel, Tenstorrent, and others against Nvidia — if they can get software developers on board. GPGPU remains an important and highly competitive arena, and we’ll be keeping an eye on how the situation progresses in the future.

    I hope it plays out like this. Compete on the hardware and provide open-source access to acceleration. If you provide the best value for the hardware, people will continue to buy it over your competitors.

    • TechNom (nobody)@programming.dev · +9/-1 · 10 months ago

      Peter Thiel is insolent enough to say out loud what these companies practice: ‘competition is for losers’. These quasi-monopolies aren’t here to provide the best value - quite the opposite. They want to kill all competition by any dirty tactic and then use the diminished choice to wring every penny out of their customers. They want to extract maximum revenue by making sure that their inferior solution is the only option customers have.

      This problem isn’t solvable by market regulation alone. The world has enough a*****es who will climb to the top of successful companies and find ways around the regulations, being as bad as they can while skirting the limits of what’s illegal. My main gripe is with the engineers, programmers, technicians and all the other technical creators who enable these scumbags. It’s not hard to see that supporting a proprietary solution amounts to yielding the consumers’ bargaining power to a monopoly. Despite that, they keep making these choices. For example, it’s not uncommon to hear senior engineering managers or technical-lead-level employees saying, “I know that Chrome is spyware and I want to quit it. But this <stupid-webservice-at-office> works only on Chrome”. I feel like screaming at them that if they’re too incompetent to demand a change at the level they’re at, they’re in the wrong profession.

      If you’re a technical creator, your choices matter. They affect a lot more people than you alone. But more often than not, I see such creators surrendering principles in exchange for convenience. They hold as much responsibility as the market abusers for making the world the way it is now.

  • RonSijm@programming.dev · +18 · 10 months ago

    and, perhaps more critically, some Chinese GPU makers from utilizing CUDA code with translation layers.

    Like that ever deterred China from violating copyrights or trademarks. Maybe if they’re huge companies that want to export, but if they’re just making in-country chips, especially chips useful for the Chinese government, these companies are not going to change anything based on some license warning.

    • MyNamesNotRobert@lemmynsfw.com · +6 · 10 months ago

      See, I don’t want China to win, I just want evil corporations to lose. I hope they ramp up the “fuck your copyright” shenanigans even more, tbh. Really stick it to them as hard as possible. Is an evil corporation in one country stealing technology from a different evil corporation in another country really the best way to fight this? No. But it still causes bad people to lose money, which in the end is what matters.

  • mindbleach@sh.itjust.works · +17 · 10 months ago

    CUDA was always nakedly anti-competitive posturing - like literally everything else Nvidia chucked into their GPUs - and now they’re saying the quiet part real fuckin’ loud.

    Hey, assholes! Turing completeness doesn’t give a shit about hardware. Computing is computing! You literally cannot tell people how to run your code. Congratulations on making your proprietary horseshit the de facto standard. By all means, enjoy the mountains of cash you’ve extracted via that abuse. But the rest of us have shit to do, and we don’t remember asking your permission.

    • TechNom (nobody)@programming.dev · +33 · 10 months ago

      CUDA is an API for running high-performance compute code on Nvidia GPUs. CUDA is proprietary, so CUDA programs run only on Nvidia GPUs. Open alternatives like Vulkan compute and OpenCL aren’t as popular as CUDA.
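
      To make that concrete, here is a minimal sketch of what a CUDA program looks like - the classic vector-add example, purely illustrative:

      // Minimal CUDA sketch: each GPU thread adds one element of two arrays.
      // __global__ and the <<<blocks, threads>>> launch syntax are Nvidia
      // extensions to C++, which is what ties code like this to Nvidia GPUs.
      #include <cuda_runtime.h>
      #include <cstdio>

      __global__ void vecAdd(const float* a, const float* b, float* c, int n) {
          int i = blockIdx.x * blockDim.x + threadIdx.x;  // global thread index
          if (i < n) c[i] = a[i] + b[i];
      }

      int main() {
          const int n = 1024;
          float h[n], out[n];
          for (int i = 0; i < n; i++) h[i] = float(i);

          float *d_a, *d_b, *d_c;  // device (GPU) buffers
          cudaMalloc(&d_a, n * sizeof(float));
          cudaMalloc(&d_b, n * sizeof(float));
          cudaMalloc(&d_c, n * sizeof(float));
          cudaMemcpy(d_a, h, n * sizeof(float), cudaMemcpyHostToDevice);
          cudaMemcpy(d_b, h, n * sizeof(float), cudaMemcpyHostToDevice);

          vecAdd<<<(n + 255) / 256, 256>>>(d_a, d_b, d_c, n);  // 4 blocks of 256 threads
          cudaMemcpy(out, d_c, n * sizeof(float), cudaMemcpyDeviceToHost);

          printf("out[2] = %f\n", out[2]);  // 2 + 2 = 4
          cudaFree(d_a); cudaFree(d_b); cudaFree(d_c);
          return 0;
      }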

      Translation layers are interface software that allow CUDA programs to run on non-Nvidia GPUs. Creating such a layer requires a bit of reverse engineering of CUDA programs, and that is exactly what Nvidia is now prohibiting. They want to ensure that every CUDA program in the world is limited to Nvidia GPUs alone - classic vendor lock-in via EULA.
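
      The idea behind such a layer is simple, even if real implementations (e.g. ZLUDA) are far more involved. A toy sketch, illustrative only: re-implement the CUDA runtime’s entry points on top of another vendor’s API (here AMD’s HIP), so that a program written against the CUDA API runs on non-Nvidia hardware.

      // Toy translation-layer sketch (illustrative only): provide our own
      // implementations of a few CUDA runtime symbols and forward them to
      // AMD's HIP runtime. Loaded in place of Nvidia's libcudart, CUDA API
      // calls would land on AMD hardware instead.
      #include <hip/hip_runtime.h>

      extern "C" int cudaMalloc(void** devPtr, size_t size) {
          return (int)hipMalloc(devPtr, size);  // forward the allocation to HIP
      }

      extern "C" int cudaMemcpy(void* dst, const void* src, size_t count, int kind) {
          // The common cudaMemcpyKind values (0-3) match hipMemcpyKind,
          // so the copy direction can be forwarded unchanged.
          return (int)hipMemcpy(dst, src, count, (hipMemcpyKind)kind);
      }

      extern "C" int cudaDeviceSynchronize(void) {
          return (int)hipDeviceSynchronize();  // wait for queued GPU work to finish
      }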

      • Varyk@sh.itjust.works · +11 · 10 months ago

        Thank you, that’s simple enough that I can understand what you’re saying, but complex enough that all of my questions are answered.

        Great answer

    • floofloof@lemmy.ca · +12 · 10 months ago

      CUDA is a system for programming GPUs (Graphics Processing Units), and it can be used to do far more computations in parallel than regular CPU programming could. In particular, it’s widely used in AI programming for machine learning. NVIDIA has quite a hold on this industry right now because CUDA has become a de facto standard, and as a result NVIDIA can price its graphics cards very high. Intel and AMD also make powerful GPUs that tend to be cheaper than NVIDIA’s, but they don’t natively support CUDA, which is proprietary to NVIDIA. A translation layer is a piece of software that interprets CUDA commands and translates them into commands for the underlying platform such as an AMD graphics card. So translation layers allow people to run CUDA software, such as machine learning software, on non-NVIDIA systems. NVIDIA has just changed its licence to prohibit this, so anyone using CUDA has to use a natively CUDA-capable machine, which means an NVIDIA one.

      • Varyk@sh.itjust.works · +3 · 10 months ago

        Thank you, these are really great entry-level answers, so now I can understand what the heck is going on.

    • MxM111@kbin.social · +1 · 10 months ago

      You can’t use CUDA and then insert a translation layer that converts the calls meant for NVIDIA hardware into calls to non-NVIDIA hardware - in other words, you can’t use non-NVIDIA hardware with CUDA.

      • Darkrai@kbin.social · +2 · 10 months ago

        Do you think this is something the EU will say is anti-competitive or something? I don’t think current late-stage-capitalism America will do anything.

        • 520@kbin.social · +2 · 10 months ago

          Oh, the EU will definitely call this anticompetitive, especially when nVidia have a monopoly in the AI segment as it is.

  • SomeGuy69@lemmy.world · +12/-3 · 10 months ago

    I hate so much that I need CUDA so badly. Also, for gaming, ray tracing is my jam; I pay a lot of money for high-end components. Fuck Nvidia with a burning stick, but I’m also not going to be at the forefront; AMD has to deliver.

    • DacoTaco@lemmy.world · +10 · 10 months ago

      I hate to say it to ya, but people like you are why they can get away with it. I feel your pain though, I couldn’t imagine what would happen if VLC were to go this low. I’d have no replacement I like as much as VLC, hehe.

      • SomeGuy69@lemmy.world · +4 · 10 months ago

        Nvidia makes most of its money from professional GPUs sold to companies by now. Consumer sales of the RTX cards are only a small percentage for them, so a boycott would actually do very little. I expect the next line of consumer GPUs to be even more expensive.

  • Transporter Room 3@startrek.website · +7 · 10 months ago

    So I haven’t been keeping up with computer hardware stuff in quite some time, and I’ve actually been looking into getting a laptop for gaming. Yes, I know, desktops are superior in every way except the one that matters most to me: portability.

    So really, what are my choices for non-Nvidia devices? It seems like every laptop I see is geared toward an Intel/Nvidia CPU/GPU combo, with only a couple offering AMD/Nvidia instead.

    What are some good places to look for things other than Intel and Nvidia?

      • Transporter Room 3@startrek.website · +3 · 10 months ago

        Well that’s pretty awesome!

        Unfortunately it’s also about $1000 more than I was looking to spend, but still, awesome anyway. Do they ever go on sale?

        I hope they’re still around next time I want to upgrade!

    • klangcola@reddthat.com · +3 · 10 months ago

      I know your pain! (Cries in Nvidia laptop.) When I bought mine, I literally couldn’t find a laptop with AMD graphics in my region.

      There is some hope these days. In addition to the previously mentioned Framework laptop, there’s also the TUXEDO Sirius 16 - Gen1 (TUXEDO is a German company specializing in Linux-compatible computers). It might not be exactly what you’re looking for, but AMD graphics laptops are so few and far between I thought I should put it out there.

      • Transporter Room 3@startrek.website · +2 · 10 months ago

        cries in 2015 acer aspire with GeForce 940m

        The price wasn’t as bad as I was expecting, even with the exchange rate.

        As much as I’m all for the “lol, get Linux” memes, I don’t know the first thing about Linux and prefer to debloat Windows. It sounds kind of dumb and circuitous, but literally everything I do is geared to Windows. I wouldn’t even know where to start moving everything over to a Linux/Win11 VM setup (which I did notice they’ll preload for an extra 150, which is neat).

    • DarkGamer@kbin.social · +1/-1 · 10 months ago

      If I understand correctly, this would only affect you if you have non-Nvidia hardware and want to use their software with it.

      • insomniac_lemon@kbin.social · +2 · 10 months ago

        I would say the point is not wanting to buy from a company that’s clearly anti-consumer… particularly with CUDA not being new, and when you compare it to something open and hardware-agnostic like FSR, this headline also looks petty.

  • thesmokingman@programming.dev · +4 · 10 months ago

    I feel like the Chinese government is probably the best defense here. If the project they’re supposedly sponsoring continues in spite of this, NVIDIA won’t do shit, because they won’t want to lose that market. As long as that project remains available to others, it’s a perfect sidestep.