Text: Headline: ChatGPT in Trouble: OpenAI may go bankrupt by 2024, AI bot costs company $700,000 every day. Subhead: OpenAI spends about $700,000 a day just to keep ChatGPT going. The cost does not include other AI products like GPT-4 and DALL-E 2. Right now, it is pulling through only because of Microsoft’s $10 billion funding.

Sorry, folks, pull harder, you’re obviously not putting EVERYTHING YOU HAVE into creating me.

  • corbin@awful.systems · 1 year ago

    Rumor is that GPT-4 is also underpriced; in general, rumors are that OpenAI loses money on all of its products individually. It’s sneerworthy, but I don’t know what it means for the future; few things are more dangerous than a cornered wild startup who is starving and afraid.

    • kuna@awful.systems · 1 year ago

      OpenAI loses money on all of its products individually

      Maybe they can make up for it in volume.
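The joke, of course, is that a negative per-unit margin only compounds with scale. A toy sketch with entirely made-up numbers (the per-query figures are hypothetical, not from the article):

```python
# Toy unit economics with made-up numbers: if each query costs more to
# serve than it earns, volume multiplies the loss rather than erasing it.
revenue_per_query = 0.002   # hypothetical dollars earned per query
cost_per_query = 0.003      # hypothetical dollars spent per query
margin = revenue_per_query - cost_per_query  # negative

for volume in (1_000, 1_000_000, 1_000_000_000):
    print(f"{volume:>13,} queries -> net ${margin * volume:,.2f}")
```

More volume, bigger loss; "making it up in volume" only works once the margin crosses zero.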

  • maol@awful.systems · 1 year ago

    Same business model as every tech startup of the last 20 years: use investor cash to pull through years of making no money until you either crash and burn or eliminate all traditional competition and can start charging customers bullshit prices.

    • naevaTheRat@lemmy.dbzer0.com · 1 year ago

      Imagine having so much fucking wealth that you can piss away a few million on ridiculous thought bubble companies on the off chance one becomes huge and your staggering wealth can become mind boggling nation destroying wealth. Just… because?

  • self@awful.systems · 1 year ago (edited)

    from the BI article:

    In an effort to reduce the cost of running generative AI models, Microsoft is developing an AI chip it calls Athena, The Information first reported. The project, which started in 2019, comes years after Microsoft made a $1 billion deal with OpenAI which required OpenAI to run its models exclusively on Microsoft’s Azure cloud servers.

    oh man I can’t wait for this grift tech to strain our global IC manufacturing capability and create chip shortages and eventually a mountain of e-waste when it switches to being hosted on ASIC-powered nodes that can’t be used for anything but this specific grift tech nobody asked for (and of course as long as the grift keeps going, they’ll come up with “better” models that need bigger ASICs, creating more e-waste…)

    what fucking year is it? I’m having deja vu

    e: not to mention, making this shit ASIC-reliant means it’s incredibly easy to gatekeep who’s able to run these models via patents, licensing, and hostile pricing. a lot of the promptfans who think we’re one hardware accelerator away from running these on mobile devices aren’t just ignoring engineering — they’re arguing in the exact opposite direction of the cloud enshittification the industry has actually been pushing towards for years

    • bitofhope@awful.systems · 1 year ago

      I just wanna buy computer hardware for a reasonable amount of money, man. Can’t I try a video game from the current decade for once in my life?

      If promptfans love ASICs so much, why not go commit corporate espionage at Broadcom and leak their networking chips’ firmware sources (in Uplink the video game).

      • self@awful.systems · 1 year ago (edited)

        this being just an ARM server should hopefully limit its potential as e-waste, assuming the cores aren’t absolute dogs for non-LLM workloads? but of course this being attached to AI hype will ensure its price tag is at least 10x what’d typically be considered reasonable for 144 ARM cores on a fast interconnect

        e: some more details on the chip from before they went all-in on AI hype. it’s a nice datacenter chip that’ll host a whole lot of vCPUs, but it’s not architecturally innovative in any way I can see other than number going up in terms of performance

        • Audbol@lemmy.world · 1 year ago

          I dunno, ARM is fairly e-waste at the moment. Apple and Samsung just had to pump a bunch of money into ARM because its value is deflating so badly it could affect them directly. ARM is liquidating and trying to sell off as much as it can right now. With RISC-V taking over, ARM is just a flash in the pan; even Apple and Samsung themselves are investing in RISC-V, hell, even Qualcomm is switching to RISC-V development.

    • jonhendry@awful.systems · 1 year ago

      I kinda doubt it’d be a rigid single-purpose ASIC like a miner. AI people like to fiddle with things too often for that, I think.

  • swlabr@awful.systems · 1 year ago

    Now I’m thinking about how we can defeat roko’s basilisk by watching disney’s Meet The Robinsons (2007)

  • umbrella@lemmy.ml · 1 year ago

    They are trying to profit from a system that should be distributed. Of course it won’t work.