There were a number of exciting announcements from Apple at WWDC 2024, from macOS Sequoia to Apple Intelligence. A subtler addition, however, came in Xcode 16 — the development environment for Apple platforms like iOS and macOS — a feature called Predictive Code Completion. Unfortunately, if you bought into Apple’s claim that 8GB of unified memory was enough for base-model Apple silicon Macs, you won’t be able to use it. Predictive Code Completion in Xcode 16 has a memory requirement, and it’s the closest thing we’ll get from Apple to an admission that 8GB of memory isn’t really enough for a new Mac in 2024.

  • rottingleaf@lemmy.zip · 6 months ago

    I still can’t fully accept that 1GB is not normal, 2GB is not very good, and 4GB is not all you’re ever gonna need.

    If only it got bloated for some good reasons.

    • Honytawk@lemmy.zip · 6 months ago

      The moment you use a file that is bigger than 1GB, that computer will explode.

      Some of us do more than just browse Lemmy.

      • rottingleaf@lemmy.zip · 6 months ago

        Wow. Have you ever considered how people worked with files bigger than their total RAM back in the normal days of computing?

        So in your opinion, if you have a 2GB+ log file, editing it should occupy 2GB of RAM?

        I just have no words, the ignorance.
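The point about editing files bigger than available RAM comes down to streaming. A minimal Python sketch (synthetic file and sizes chosen for illustration) that processes a log in fixed-size chunks, so resident memory stays constant no matter how large the file is:

```python
import os
import tempfile

CHUNK = 64 * 1024  # only 64 KiB resident at a time, regardless of file size

def count_lines(path):
    """Count newlines while never holding more than CHUNK bytes of the file."""
    lines = 0
    with open(path, "rb") as f:
        while True:
            chunk = f.read(CHUNK)
            if not chunk:
                break
            lines += chunk.count(b"\n")
    return lines

# Demo on a small synthetic "log"; the same code handles a 2GB+ file
# with the same 64 KiB of working memory.
with tempfile.NamedTemporaryFile("wb", delete=False) as tmp:
    tmp.write(b"line\n" * 10_000)
print(count_lines(tmp.name))  # 10000
os.remove(tmp.name)
```

Editors that handle huge files well use the same idea (or memory-map the file and let the OS page it in on demand) rather than loading everything up front.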

    • Aux@lemmy.world · 6 months ago

      High quality content is the reason. Sit in a terminal and your memory usage will be low.

      • lastweakness@lemmy.world · 6 months ago

        So we’re just going to ignore stuff like Electron, unoptimized assets, etc… Basically every other known problem… Yeah let’s just ignore all that

        • Aux@lemmy.world · 6 months ago

          Is Electron that bad? Really? I have Slack open right now with two servers and it takes around 350MB of RAM. Not that bad, considering that every other colleague thinks that posting dumb shit GIFs into work chats is cool. That’s definitely nowhere close to Firefox, Chrome and WebStorm eating multiple gigs each.

          • lastweakness@lemmy.world · 6 months ago

            Yes, it really is that bad. 350MB of RAM for something that could otherwise have taken less than 100? That isn’t bad to you? And it’s not just RAM. It’s every resource, including CPU, which is especially bad with Electron.

            I don’t really mind Electron myself because I have enough resources. But pretending the lack of optimization isn’t a real problem is just not right.

            • Aux@lemmy.world · 6 months ago

              First of all, 350MB is a drop in a bucket. But what’s more important is performance, because it affects things like power consumption, carbon emissions, etc. I’d rather see Slack “eating” one gig of RAM and running smoothly on a single E core below boost clocks with pretty much zero CPU use. That’s the whole point of having fast memory - so you can cache and pre-render as much as possible and leave it rest statically in memory.

              • lastweakness@lemmy.world · 6 months ago

                CPU usage is famously terrible with Electron, which I also pointed out in the comment you’re replying to. But yes, having a separate Chromium instance running for each “app” is terrible.

                  • Shadywack@lemmy.world · 6 months ago

                    Yes it is.

                    “iT’S oNLy a FeW hUnDrED MB oF LiBRAriES and BiNAriES pEr aPp, iT’S oNLy dOuBLe oR tRiPLe tHe RAM, DiSk, anD cpU uSAgE”

                    Then we have the fucking shit show of 6-8GB of RAM used just by booting the fucking machine. Chromium/Webkit is practically an OS by itself for all the I/O, media handling, and built in libraries upon libraries of shit. Let’s run that whole entire stack for all these electron apps, and then fragment each one independent of each other (hello Discord, who used Electron 12 for WAY too long) then say “bUt iT’s pORtaBLe!”.

                    Yes, it isn’t just terrible, it’s fucking obnoxiously and horrendously terrible, like we grabbed defeat from the jaws of victory terrible, and moronically insipid. Optimization in the fucking trash can and a fire hydrant in all our fucking assholes, terrible. That’s HOW terrible it actually is, so you’re wrong.

              • jas0n@lemmy.world · 6 months ago (edited)

                Just wanted to point out that the number 1 performance blocker in the CPU is memory. In the general case, if you’re wasting memory, you’re wasting CPU. These two things really cannot be talked about in isolation.

                • Aux@lemmy.world · 6 months ago

                  No, that’s the other way round. You either have high CPU load and low memory, or low CPU load and high memory.

                  • jas0n@lemmy.world · 6 months ago

                    I’m not sure what metric you’re using to determine this. The bottom line is, if you’re trying to make the CPU really fly, using memory efficiently is just as important (if not more so) than the actual instructions you send to it. The reason is the high latency of going out to external memory. This is performance 101.
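The latency gap being described can be put in rough numbers. These are typical published ballpark figures for a modern desktop CPU, not measurements of any specific chip:

```python
# Back-of-envelope arithmetic behind "performance 101" (illustrative,
# typical figures; exact values vary by microarchitecture):
l1_hit_ns     = 1    # L1 cache hit latency, roughly a nanosecond
dram_ns       = 100  # main-memory (DRAM) access after a full cache miss
cycles_per_ns = 4    # a ~4 GHz core completes ~4 cycles per nanosecond

# One cache miss can stall the core for roughly this many cycles:
miss_cycles = dram_ns * cycles_per_ns
print(miss_cycles)  # 400
```

In ~400 stalled cycles a superscalar core could otherwise have retired hundreds of instructions, which is why memory access patterns, not raw instruction counts, so often dominate real-world performance.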

              • Verat@sh.itjust.works · 6 months ago (edited)

                When (according to about:unloads) my average Firefox tab is 70-230MB depending on what it is and how old the tab is (YouTube tabs, for example, bloat up the longer they are open), a chat app using over 350 is a pretty big deal.

                Just checked: my Firefox is using 4.5GB of RAM, while Telegram is using 2.3GB while minimized to the system tray. Granted, Telegram doesn’t use Electron, but this is a trend across lots of programs, and Electron is a big enough offender that I avoid apps using it. When I get off shift I can launch Discord and check it too, but it is usually bad enough that I close it entirely when not in use.

          • Jakeroxs@sh.itjust.works · 6 months ago

            What’s wrong with using GIFs in work chat, lmao. Gotta laugh or smile while hating your job like the rest of us.

      • rottingleaf@lemmy.zip · 6 months ago

        256MB or 512MB was fine for high-quality content in 2002, so what was that, then?

        Suppose the amount of pixels and everything quadrupled - OK, then 2GB it is.

        But 4GB being not enough? Do you realize what 4GB is?

        • lastweakness@lemmy.world · 6 months ago

          They didn’t just quadruple. They’re orders of magnitude higher these days. So content is a real thing.

          But that’s not what’s actually being discussed here, memory usage these days is much more of a problem caused by bad practices rather than just content.

          • rottingleaf@lemmy.zip · 6 months ago

            I know. BTW, if something is done an order of magnitude less efficiently than it could be, one might consider it the result of an intentional policy aimed at neutering development. Just not clear whose. There are fewer corporations affecting this than big governments, and those are capable of reaching consensus from time to time. So not a conspiracy theory.

        • Aux@lemmy.world · 6 months ago

          One frame for a 4K monitor takes 33MB of memory. You need three of them for the triple buffering used back in 2002, so a third of your 256MB went to simply displaying a bloody UI. But there’s more! Today we’re using viewport composition, so the more apps you run, the more memory you need just to display the UI. And that’s only what the OS uses to render the final result; your app will use additional memory for high-res icons, fonts, photos, videos, etc. 4GB today is nothing.
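The 33MB figure checks out. As a quick sanity check of the arithmetic above, assuming a standard 4K frame at 4 bytes per pixel (8-bit RGBA):

```python
# Sanity-checking the framebuffer math: 4K resolution, 4 bytes/pixel.
width, height, bytes_per_pixel = 3840, 2160, 4
frame_bytes = width * height * bytes_per_pixel
print(frame_bytes)                 # 33177600 (~33 MB per frame)
print(round(frame_bytes / 10**6))  # 33

# Triple buffering keeps three such frames resident at once:
print(3 * frame_bytes / 10**6)     # 99.5328 (MB)
```

So three 4K buffers alone would consume nearly 100MB — around 40% of a 256MB machine, though of course no 2002 machine was driving a 4K display.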

          I can tell you an anecdote. My partner was making a set of photo collages, about 7 art works to be printed in large format (think 5m+ per side). So 7 photo collages with source material saved on an external drive took 500 gigs. Tell me more about 256MB, lol.

          • rottingleaf@lemmy.zip · 6 months ago (edited)

            Yes, you wouldn’t have 4K in 2002.

            4GB today is nothing.

            My normal usage would be kinda strained with it, but possible.

            $ free -h
                           total        used        free      shared  buff/cache   available
            Mem:            17Gi       3,1Gi        11Gi       322Mi       3,0Gi        14Gi
            Swap:          2,0Gi          0B       2,0Gi
            $
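For reference, the numbers `free` prints come from /proc/meminfo on Linux. A small self-contained sketch that parses a captured sample of that format (the values below are illustrative, hard-coded so the example runs anywhere, and are not taken from the output above):

```python
# Parse /proc/meminfo-style text into a {field: kilobytes} dict.
# The sample values here are made up for illustration.
sample = """\
MemTotal:       17000000 kB
MemAvailable:   14000000 kB
SwapTotal:       2097152 kB
"""

def parse_meminfo(text):
    """Return {field_name: value_in_kB} from /proc/meminfo-style lines."""
    info = {}
    for line in text.splitlines():
        key, _, rest = line.partition(":")
        info[key] = int(rest.split()[0])  # first token after the colon is the kB value
    return info

mem = parse_meminfo(sample)
print(mem["MemAvailable"] // 1024)  # 13671 (MiB available)
```

On a real Linux box you would read the live data with `open("/proc/meminfo").read()` instead of the hard-coded sample; `free` itself is essentially a pretty-printer over the same file.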