• shalol@alien.topB

      Yeah, their pricing already reflects investors' ambitions of selling pickaxes during the gold rush. IMO, it's never about current success but about the potential behind a company.

    • wakIII@alien.topB

      Hyperscalers don't pay anywhere close to MSRP; expect more like $5-8k per chip.

  • Shoshke@alien.topB

    And this is part of the reason why Nvidia can price their consumer GPUs the way they do.

    They're going full throttle on B2B, and any capacity that might've been dedicated to the retail market instead frees up production for the AI business, which is absolutely, ridiculously profitable for them ATM.

    • XYHopGuy@alien.topB

      Data center products are bottlenecked by CoWoS (chip-on-wafer-on-substrate) packaging because they use chiplets.

      Gaming GPUs are not; Nvidia has pricing power on consumer GPUs because they're a generation ahead of competitors.

  • skyflex@alien.topB

    I got to enable a special instance tier for one of our engineering teams in AWS the other day. The instances come with 6 or 7 of these GPUs. I had to coordinate with our TAMs because they're basically bare-metal hosts and cost so, so much - one of them told me even internal AWS folks aren't allowed to play with them because of the cost and demand. Crazy.

  • watduhdamhell@alien.topB

    AMD and their MI300 should gain similar traction and propel AMD to much greater heights, market-cap-wise. Now might be an excellent time to buy…

    I mean, who else makes this level of AI accelerator? Nobody. Nobody but AMD and Nvidia can do this right now. Seems to me they are both going to be much, much larger companies in the next 10 years than anyone thought they might be.

    • Sirisian@alien.topB

      "I mean, who else makes this level of AI accelerator?"

      Google, Microsoft, and Amazon are all building AI accelerators for their own datacenters (Google's TPUs, Amazon's Trainium and Inferentia, Microsoft's Maia). Some are for training and others are for inference (running the trained models for services).

      I'd be somewhat hesitant until the full benchmarks are out. AMD's higher memory capacity sounded neat, I think, but that advantage could prove very fleeting.

      • watduhdamhell@alien.topB

        None of those companies are real chip makers. While they may be able to produce a nice custom solution for their own applications, they will never compete with the actual chip makers; AMD and Nvidia will continue to supply an ever-greater share of AI superchips, eventually displacing most custom solutions.

  • Acceptable-Truck3803@alien.topB

    Yeah, I'm in this industry, and to get an H100/A100 card you're looking at lead times of around 40-60 weeks, or at least 4 months. It's not just Meta/Facebook that use these cards…