• gnomesaiyan@lemmy.world

    I love how no one in the comments specifically mentions his name, like he’s fucking Voldemort or something.

    Jensen Huang!

    gasps

    • Zwiebel@feddit.org

      CEO of a company that makes the computer part that is most important for gamers. He has a net worth of about 100 billion USD.

      Their products are currently better than the competition and they make full use of their position.

      They are known for very high prices, scummy marketing tricks, and abusing their small business partners.

      The company briefly overtook Apple and Microsoft, becoming the most valuable company at around 3 trillion USD.

      • Tux@lemmy.world (OP)

        The company briefly overtook Apple and Microsoft, becoming the most valuable company at around 3 trillion USD.

        Because of AI hype

        • ShadowRam@fedia.io

          It’s not all hype.

          nVidia has put some SERIOUS R&D into AI over the past 10 years.

          But using AI in the graphics space… upscaling, downscaling, faking lighting, faking physics… this is all very useful in making video games.

          Then there was a leap in the way AI image generation was done with that hardware. And that opened up a whole new growing field.

          It’s just that some people took basic language models that have been around for 30 years and scaled them up on that hardware. And it was neat, and some of the stuff an LLM would output was surprising. But not reliable.

          And then suddenly a lot of laymen got their hands on the LLMs and thought it was the 2nd coming of Jesus, and started throwing big money at it… it will be a surprise to no one who knows how these AIs work that that big money isn’t going anywhere.

          But those first two are no hype. They’re real, viable use cases for AI, and money will be made there.

          • DacoTaco@lemmy.world

            You’re missing a lot of events in that timeline tbh :p
            Nvidia forcing developers to use CUDA-enabled hardware, hard-locking their tech to their hardware, the crypto booms of 2016 and 2020, …

            They’ve done a lot of stuff to gamers and datacenters over the past years that made them as powerful as they were when GPT-3 hit the public eye.

            Me? I’m steering far, far away from them. I don’t support that business at all.

            • ShadowRam@fedia.io

              I’m not defending nVidia’s business practices at all.

              My point is the ‘AI’ hype isn’t hype.

              There’s real, value-added AI work being done outside of the whole ChatGPT LLM thing.

              • chaitae3@lemmy.world

                That’s fine, but the money flows almost exclusively to the latter, making it an economic bubble that will burst soon.

              • DacoTaco@lemmy.world

                I’m not saying you’re defending them, just that your comment is missing a lot of what happened around AI and Nvidia, and whatever genAI we have now isn’t all because of Nvidia. That it’s locked to Nvidia is because of what they did before AI hit the public eye.
                Current genAI also doesn’t have much to do with Nvidia, beyond programs being based on CUDA, which uses Nvidia’s tensor cores for neural processing. From a technical standpoint, nothing about AI requires Nvidia; they just played it smart (and unfair).
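
                (If you’re curious what “uses Nvidia’s tensor cores” looks like at the CUDA level, here’s a rough toy sketch using the WMMA API: one warp multiplying a single 16x16 half-precision tile. Illustrative only, not production code.)

                ```cuda
                #include <cuda_fp16.h>
                #include <mma.h>
                using namespace nvcuda;

                // Toy kernel: one warp multiplies a single 16x16 half-precision tile
                // on the tensor cores, accumulating into 32-bit floats.
                // Needs a tensor-core GPU (compile with -arch=sm_70 or newer).
                __global__ void tiny_mma(const half *a, const half *b, float *c) {
                    wmma::fragment<wmma::matrix_a, 16, 16, 16, half, wmma::row_major> fa;
                    wmma::fragment<wmma::matrix_b, 16, 16, 16, half, wmma::col_major> fb;
                    wmma::fragment<wmma::accumulator, 16, 16, 16, float> fc;

                    wmma::fill_fragment(fc, 0.0f);        // C = 0
                    wmma::load_matrix_sync(fa, a, 16);    // load the A tile (leading dimension 16)
                    wmma::load_matrix_sync(fb, b, 16);    // load the B tile
                    wmma::mma_sync(fc, fa, fb, fc);       // C += A * B, executed on tensor cores
                    wmma::store_matrix_sync(c, fc, 16, wmma::mem_row_major);
                }
                ```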

              • jj4211@lemmy.world

                Yeah, but only a tiny sliver of their valuation is attributable to the durable, real value of “AI” approaches.

          • RogueBanana@lemmy.zip

            But the insane growth is because of hype. That doesn’t mean it’s useless or invalid, but they would be nowhere near this big if it weren’t for the AI gold rush going on, with all of their data centre cards selling out immediately despite 50x profit margins and such.

      • marcos@lemmy.world

        Their products are currently better than the competition

        If you want just a GPU, no, they aren’t really better.

        They have some different strengths, so they may fit some use-cases better, but they aren’t out-right better.

    • AwkwardLookMonkeyPuppet@lemmy.world

      Not as weird as Reddit. Go look at the front page these days. There are posts like “what’s your name?” with thousands of upvotes. Completely worthless drivel is getting driven to the front page.

    • emeralddawn45@discuss.tchncs.de

      On lemmy.world I would fully believe anyone who said there are 60+ bots boosting any given post (obviously not every post). That’s not a lot when you think about it, but it’s enough to give certain posts traction and ensure they stay at the top for days.

    • x_pikl_x@lemmy.world

      Definitely losing interest fast in this “reddit alternative”. The politics subs are just as bad as, if not worse than, Facebook and Twitter. The communities are just clones of subreddits from after the 3rd-party/mod purge. The dumbass comment chains that stopped being funny 5 years ago. Clearly bots being used to influence social issues… I could go on, but it’s just more wasted bandwidth.

      • GHiLA@sh.itjust.works

        I just see Lemmy as “more Reddit”

        I read Reddit, I read Lemmy.

        Lemmy usually has 60-80 active commenters on any post that makes it to the front few pages of the entire network, which is more than enough cannon fodder for general discourse and discussion.

        I don’t care if two or three ass-hats are using bots. They’re obvious and easy to spot, and everyone downvotes them anyway; and if they don’t, who cares? Move on. This site isn’t governed by any one person or interest, so it’s going to be influenced by everyone equally, and that includes cheap shit like bots, because they’re wrapped up in the crowd just the same as everybody else.

    • TheGrandNagus@lemmy.world

      People buy Nvidia no matter what. Even when they aren’t the best choice. Then those same people complain about Nvidia doing the anticompetitive things they do.

      The best is when people cheer for AMD making something great, only so they can buy an Nvidia card cheaper, as if the only reason AMD exists is to subsidise their Nvidia purchase!

      Nvidia’s greatest asset is the mindshare they have.

      • 9point6@lemmy.world

        Well, that and CUDA still means a load of professionals in various fields are stuck using Nvidia whether they like it or not. This means data centers are incentivised to go with Nvidia if they want those customers, which ultimately means that if you’re going to work on code/tools that run in those data centers, you want the same architecture on your local machine for development and testing (rough sketch of the lock-in below).
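
        To make that concrete, here’s a minimal toy sketch of CUDA-only code (illustrative, not from any real codebase). The __global__ qualifier and the <<<...>>> launch syntax only compile with Nvidia’s toolchain, so porting it means rewriting against HIP, OpenCL, or similar:

        ```cuda
        #include <cstdio>
        #include <cuda_runtime.h>

        // Toy kernel: each thread scales one array element in place.
        __global__ void scale(float *data, float factor, int n) {
            int i = blockIdx.x * blockDim.x + threadIdx.x;
            if (i < n) data[i] *= factor;
        }

        int main() {
            const int n = 1024;
            float *d = nullptr;
            cudaMalloc(&d, n * sizeof(float));            // device-side buffer
            scale<<<(n + 255) / 256, 256>>>(d, 2.0f, n);  // CUDA-specific launch syntax
            cudaDeviceSynchronize();                      // wait for the kernel to finish
            cudaFree(d);
            printf("done\n");
            return 0;
        }
        ```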

        It’s getting better, but the gap is still real. Hopefully the folks working on SCALE can actually get it working on the CDNA GPUs one day, since data centers are where a lot of the CUDA is running; or perhaps the UDNA stuff AMD just announced will enable this.

        The fact that this is all hinging on a third party developing SCALE should highlight that AMD still doesn’t seem to be playing the same game as Nvidia, which is why we’re still in this position.

        • TheGrandNagus@lemmy.world

          Definitely. CUDA has had a long head start, and Nvidia were very clever in getting it entrenched early on, particularly in universities and such. It also just… generally does the job.

          My above comment was purely about the gaming side.

      • atro_city@fedia.io

        100%

        “I want change!”

        *Doesn’t do anything to change*

        “Why hasn’t anything changed?”

      • Lasherz12@lemmy.world

        I would have much preferred giving AMD my money instead, but even at AMD’s best, the lack of DLSS performance was a meaningful gap back when everyone thought Cyberpunk was the new standard of graphical fidelity, in the 6000/3000-series era.

    • NeilBrü@lemmy.world

      The linear-algebra computations performed on their GPUs’ tensor cores (since the Turing era), combined with their CUDA and cuDNN software stack, deliver the fastest performance for training deep neural networks.

      That may not last forever, but it’s the best in terms of dollars per TOPS that an average DNN developer like myself has access to currently.
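
      (For a back-of-the-envelope sense of the dollars-per-TOPS comparison, here’s a toy sketch. The card names, prices, and TOPS figures are made-up placeholders, not real specs:)

      ```cuda
      #include <cstdio>

      int main() {
          // Hypothetical cards -- placeholder numbers, not real benchmarks.
          struct Card { const char *name; double price_usd; double tops; };
          const Card cards[] = {
              {"Card A", 1600.0, 1300.0},  // 1600 / 1300 ~= 1.23 USD per TOPS
              {"Card B",  999.0,  650.0},  //  999 /  650 ~= 1.54 USD per TOPS
          };
          for (const Card &c : cards)
              printf("%s: %.2f USD per TOPS\n", c.name, c.price_usd / c.tops);
          return 0;
      }
      ```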

      • Hellmo_luciferrari@lemm.ee

        Not OP, but he’s a billionaire. There is no such thing as an ethical billionaire. No amount of work a person has done could earn them billions.

              • Hellmo_luciferrari@lemm.ee

                I didn’t say I hate billionaires. “They shouldn’t exist” != “I hate them”. You’re conflating two opinions that can coexist as if they were the same thing.

                My stance on billionaires will not change because you thought you hit me with a “gotcha” moment.

                Billionaires should not exist, because no one gets to be a billionaire without stepping on others to get there. Billionaires aren’t inherently worth more than anyone else.

                If billionaires exist at the same time as people are suffering (homelessness, medical debt and the inability to pay for treatment, starvation, etc, the list goes on), then they shouldn’t exist. That amount of wealth is for sociopaths; that amount of wealth is amassed by those lacking empathy and moral values.

                So take your argumentative behavior elsewhere.

                Go back to reddit, ya troll.

  • Zip2@feddit.uk

    And Linux users

    So he’s not all bad, and let’s be honest, it doesn’t take much. Did he suggest using light mode, or tabs instead of spaces?

    • redhorsejacket@lemmy.world

      If those kids weren’t busy troubleshooting driver issues, they’d be very upset.

      I don’t know a damn thing about Linux, but that seems like something that would happen.

      • GreenKnight23@lemmy.world

        as a daily Linux user, this is pretty funny.

        especially since I’m also a daily Windows user who fights against Windows Update downgrading drivers.