There were a number of exciting announcements from Apple at WWDC 2024, from macOS Sequoia to Apple Intelligence. However, a subtle addition to Xcode 16 — the development environment for Apple platforms, like iOS and macOS — is a feature called Predictive Code Completion. Unfortunately, if you bought into Apple’s claim that 8GB of unified memory was enough for base-model Apple silicon Macs, you won’t be able to use it. There’s a memory requirement for Predictive Code Completion in Xcode 16, and it’s the closest thing we’ll get from Apple to an admission that 8GB of memory isn’t really enough for a new Mac in 2024.

  • _number8_@lemmy.world · 4 days ago

    imagine showing this post to someone in 1995

    shit has gotten too bloated these days. i mean even in my head 8GB still sounds like ‘a lot’ of RAM and 16GB feels extravagant

    • rottingleaf@lemmy.zip · 4 days ago

      I still can’t fully accept that 1GB is not normal, 2GB is not very good, and 4GB is not all you’re ever gonna need.

      If only it got bloated for some good reasons.

      • DefederateLemmyMl@feddit.nl · 2 days ago (edited)

        I remember when I got my first computer with 1GB of RAM, where my previous computer had 64MB, later upgraded to 192MB. And there were only like 3 or 4 years in between them.

        It was like: holy shit, now I can put all the things in RAM. I will never run out.

      • Honytawk@lemmy.zip · 2 days ago

        The moment you use a file that is bigger than 1GB, that computer will explode.

        Some of us do more than just browse Lemmy.

        • rottingleaf@lemmy.zip · 2 days ago

          Wow. Have you ever considered how people worked with files bigger than the total RAM they had, back in the normal days of computing?

          So in your opinion, if you have a 2GB+ log file, editing it should occupy 2GB of RAM?

          I just have no words, the ignorance.
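The point about handling files larger than RAM holds because tools stream (or memory-map) the file instead of loading it whole. A minimal sketch in Python, with an invented marker string and a tiny stand-in file for the demo:

```python
import os
import tempfile

# Stream a huge log file line by line: memory stays roughly constant
# regardless of file size, because only the current line is in memory.
def count_errors(path, marker="ERROR"):
    errors = 0
    with open(path, "r", errors="replace") as f:
        for line in f:          # the file object is a lazy line iterator
            if marker in line:
                errors += 1
    return errors

# Tiny stand-in for a multi-gigabyte log (contents invented for the demo).
fd, log_path = tempfile.mkstemp(suffix=".log")
with os.fdopen(fd, "w") as f:
    f.write("INFO start\nERROR one\nINFO mid\nERROR two\nINFO end\n")

n = count_errors(log_path)
print(n)   # 2
os.remove(log_path)
```

Editors that open multi-gigabyte files comfortably typically go further and memory-map the file, paging in only the regions actually being viewed.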

      • Aux@lemmy.world · 3 days ago

        High quality content is the reason. Sit in a terminal and your memory usage will be low.

        • lastweakness@lemmy.world · 3 days ago

          So we’re just going to ignore stuff like Electron, unoptimized assets, etc… Basically every other known problem… Yeah let’s just ignore all that

          • Aux@lemmy.world · 3 days ago

            Is Electron that bad? Really? I have Slack open right now with two servers and it takes around 350MB of RAM. Not that bad, considering that every other colleague thinks that posting dumb shit GIFs into work chats is cool. That’s definitely nowhere close to Firefox, Chrome and WebStorm eating multiple gigs each.

            • lastweakness@lemmy.world · 3 days ago

              Yes, it really is that bad. 350 MBs of RAM for something that could otherwise have taken less than 100? That isn’t bad to you? And also, it’s not just RAM. It’s every resource, including CPU, which is especially bad with Electron.

              I don’t really mind Electron myself because I have enough resources. But pretending the lack of optimization isn’t a real problem is just not right.

              • Aux@lemmy.world · 3 days ago

                First of all, 350MB is a drop in a bucket. But what’s more important is performance, because it affects things like power consumption, carbon emissions, etc. I’d rather see Slack “eating” one gig of RAM and running smoothly on a single E core below boost clocks with pretty much zero CPU use. That’s the whole point of having fast memory - so you can cache and pre-render as much as possible and leave it rest statically in memory.

                • jas0n@lemmy.world · 2 days ago (edited)

                  Just wanted to point out that the number 1 performance blocker in the CPU is memory. In the general case, if you’re wasting memory, you’re wasting CPU. These two things really cannot be talked about in isolation.
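This memory-CPU link can be glimpsed even from a high-level language: summing the same contiguous buffer is faster when the indices walk it sequentially than when they jump around, because sequential access lets the hardware prefetcher keep the CPU fed. A rough sketch (sizes arbitrary; CPython's interpreter overhead shrinks the gap a C version would show):

```python
import array
import random
import time

N = 1_000_000
data = array.array("d", range(N))   # one contiguous buffer of doubles

seq_order = list(range(N))          # walk the buffer front to back
rnd_order = seq_order[:]
random.shuffle(rnd_order)           # same indices, cache-hostile order

def timed_sum(order):
    start = time.perf_counter()
    total = sum(data[i] for i in order)
    return total, time.perf_counter() - start

s_seq, t_seq = timed_sum(seq_order)
s_rnd, t_rnd = timed_sum(rnd_order)
assert s_seq == s_rnd               # identical work, different access pattern
print(f"sequential {t_seq:.3f}s vs shuffled {t_rnd:.3f}s")
```

On most machines the shuffled pass comes out slower; the effect is far more dramatic in compiled code, where cache misses are not hidden behind interpreter overhead.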

                  • Aux@lemmy.world · 2 days ago

                    No, that’s the other way round. You either have high CPU load and low memory, or low CPU load and high memory.

                • lastweakness@lemmy.world · 3 days ago

                  CPU usage is famously terrible with Electron, which I also pointed out in the comment you’re replying to. But yes, having multiple Chromium instances running for each “app” is terrible.

                • nossaquesapao@lemmy.eco.br · 3 days ago

                  First of all, 350MB is a drop in a bucket

                  People don’t run just a single app on their machines. If we triple the RAM usage of several apps, it results in a massive increase. That’s how bloat happens: it’s a cumulative increase in everything. If we analyze single cases, we could say they’re not that bad individually, but the end result is the necessity for a constant and fast increase in hardware resources.

                  • Aux@lemmy.world · 3 days ago

                    People don’t run just a single app in their machines

                    That’s not bloat, that’s people running more apps than ever.

                    the end result is the necessity for a constant and fast increase in hardware resources.

                    That’s not true. 8 to 16GB RAM machines became common in the early 2010s and barely anyone is using 32 gigs today. Even if we look at the most recent Steam Hardware & Software Survey, we will see that even gamers are pretty much stuck with 16 gigs. 32 gigs are installed on less than 30% of machines, and more than that is barely 4%. Ten years ago 8 gigs was the most common option, with 12+ gigs (Steam didn’t have a 16-gig category in 2014) being the third option. The switch to 16 gigs being number one happened in December 2019, so we’re five years in with 16 gigs being the most common option, and more RAM is not getting anywhere close to replacing it (47.08% for 16 gigs and 28.72% for 32 gigs as of May 2024).

                    Now if you look at the late ’90s and 2000s you will see that RAM was doubling pretty much every 2-3 years. We can look at Steam data once again. Back in 2008 (the earliest data available on archive.org) 2 gigs was the most common option. The next year the 3-gig option got very close and sat in 2nd place. In 2010, 2GB, 3GB and 4GB were splitting hairs. The 4GB option became the most common in 2011, with the 3GB variant a very close 2nd. The 5GB option became king in 2012. And the very next year 8 gigs became the norm.

                    So, 2 gigs in 2008, 4 gigs in 2011 and 8 gigs in 2013. You can check historical data yourself here https://web.archive.org/web/20130915000000*/http://store.steampowered.com/hwsurvey/

                • Verat@sh.itjust.works · 3 days ago (edited)

                  When (according to about:unloads) my average Firefox tab is 70-230MB depending on what it is and how old the tab is (YouTube tabs, for example, bloat up the longer they are open), a chat app using over 350 is a pretty big deal.

                  Just checked: my Firefox is using 4.5GB of RAM, while Telegram is using 2.3GB while minimized to the system tray. Granted, Telegram doesn’t use Electron, but this is a trend across lots of programs, and Electron is a big enough offender that I avoid apps using it. When I get off shift I can launch Discord and check it too, but it is usually bad enough that I close it entirely when not in use.

                  • Aux@lemmy.world · 3 days ago

                    Telegram is using only 66 megs here. Again - it’s about content.

            • nossaquesapao@lemmy.eco.br · 3 days ago

              It sure is. I’m running ferdium at this very moment with 3 chat apps open, and it consumes almost a gigabyte for something that could take just a few megabytes.

            • Jakeroxs@sh.itjust.works · 3 days ago

              What’s wrong with using Gifs in work chat lmao, can laugh or smile while hating your job like the rest of us.

        • rottingleaf@lemmy.zip · 3 days ago

          256MB or 512MB was fine for high-quality content in 2002, so what was that then?

          Suppose the amount of pixels and everything quadrupled - OK, then 2GB it is.

          But 4GB being not enough? Do you realize what 4GB is?

          • lastweakness@lemmy.world · 3 days ago

            They didn’t just quadruple. They’re orders of magnitude higher these days. So content is a real thing.

            But that’s not what’s actually being discussed here, memory usage these days is much more of a problem caused by bad practices rather than just content.

            • rottingleaf@lemmy.zip · 3 days ago

              I know. BTW, if something is done in a way that’s an order of magnitude less efficient than it could be, one might consider it the result of an intentional policy aimed at neutering development. Just not clear whose. There are fewer corporations affecting this than big governments, and those are capable of reaching consensus from time to time. So not a conspiracy theory.

          • Aux@lemmy.world · 3 days ago

            One frame for a 4K monitor takes 33MB of memory. You need three of them for the triple buffering used back in 2002, so half of your 256MB went to simply displaying a bloody UI. But there’s more! Today we’re using viewport composition, so the more apps you run, the more memory you need just to display the UI. Now this is what the OS will use to render the final result, but your app will use additional memory for high-res icons, fonts, photos, videos, etc. 4GB today is nothing.

            I can tell you an anecdote. My partner was making a set of photo collages, about 7 art works to be printed in large format (think 5m+ per side). So 7 photo collages with source material saved on an external drive took 500 gigs. Tell me more about 256MB, lol.
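The framebuffer figure above checks out. A quick sanity check of the arithmetic, assuming 4 bytes per pixel (32-bit RGBA):

```python
# One 3840x2160 frame at 4 bytes per pixel (32-bit RGBA).
width, height, bytes_per_pixel = 3840, 2160, 4

frame_bytes = width * height * bytes_per_pixel
print(frame_bytes)        # 33177600 bytes, i.e. ~33 MB

# Triple buffering holds three such frames alive at once.
print(3 * frame_bytes)    # 99532800 bytes, i.e. ~100 MB
```

Three frames come to roughly 100MB, so on a 256MB machine that is a bit under half of total memory, matching the order of magnitude claimed above.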

            • rottingleaf@lemmy.zip · 3 days ago (edited)

              Yes, you wouldn’t have 4K in 2002.

              4GB today is nothing.

              My normal usage would be kinda strained with it, but possible.

              $ free -h
                             total        used        free      shared  buff/cache   available
              Mem:            17Gi       3,1Gi        11Gi       322Mi       3,0Gi        14Gi
              Swap:          2,0Gi          0B       2,0Gi
              $ 
              
    • mycodesucks@lemmy.world · 3 days ago

      Absolutely.

      Bad, rushed software that wires together 200 different giant libraries just to use a fraction of them, and then runs in a sandboxed container with three daemons it needs for some reason, doesn’t mean “8GB isn’t enough”; it means write tighter, better software.

      • cheesepotatoes@lemmy.world · 3 days ago

        That ship has long sailed unfortunately. The industry gave up on optimization in favour of praying that hardware advancements can keep up with the bloat.

    • yeehaw@lemmy.ca · 4 days ago

      I chalk it up to lazy rushed development. Good code is art.

      • Aux@lemmy.world · 3 days ago

        That’s not true at all. The code doesn’t take much space. The content does. Your high quality high res photos, 4K HDR videos, lossless 96kHz audio, etc.

        • yeehaw@lemmy.ca · 3 days ago (edited)

          But there are lots of shortcuts now. Asset packs and coding environments that come bundled with all kinds of things you don’t need. People import packages that consume a lot of space to use one tiny piece of it.

          To be clear, I’m not talking about videos and images. You’d have these either way.

          • Aux@lemmy.world · 3 days ago

            All these packages don’t take much memory. Also, tree shaking is a thing. For example, one of the projects I currently work on has over 5 gigs of dependencies, but once I compile it for production, the whole code base is a mere 3 megs, and that’s including inlined styles and icons. The code itself is pretty much non-existent.

            On the other hand I have 100KB of text translations just for the English language alone. Because there’s shit loads of text. And over 100MB of images, which are part of the build. And then there’s a remote storage with gigabytes of documents.

            Even if I double the code base by copy pasting it will be a drop in a bucket.

    • Bjornir@programming.dev · 4 days ago

      I have a VPS that uses 1GB of RAM. It has 6-7 apps running in Docker containers, which isn’t the most RAM-efficient way of running apps.

      A light OS really helps. Plus, the most-used app that consumes a lot of RAM, the web browser, actually reduces its consumption when needed, but uses more when memory is free. On one computer I have Chrome running with some hundreds of MB used, instead of the usual GBs, because RAM is running out.

      So it appears that memory is full, but you can actually have a bit more memory available that is “hidden”.
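This “hidden” headroom is what Linux reports as MemAvailable: reclaimable page cache that looks used but is handed back to applications on demand. A sketch that parses a /proc/meminfo-style snapshot (the sample numbers below are made up for illustration):

```python
# Sample /proc/meminfo-style text; values invented for illustration.
SAMPLE = """\
MemTotal:       16384000 kB
MemFree:          512000 kB
MemAvailable:   11264000 kB
Buffers:          204800 kB
Cached:          9830400 kB
"""

def parse_meminfo(text):
    """Map each field name to its size in kB."""
    info = {}
    for line in text.strip().splitlines():
        key, rest = line.split(":")
        info[key.strip()] = int(rest.split()[0])  # values are in kB
    return info

mem = parse_meminfo(SAMPLE)
# Memory that looks "used" but is reclaimable cache:
hidden_kb = mem["MemAvailable"] - mem["MemFree"]
print(hidden_kb)   # 10752000 kB of headroom beyond what MemFree shows
```

On a real system you would feed it the contents of /proc/meminfo instead of SAMPLE; tools like free derive their “available” column from the same field.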

      • derpgon@programming.dev · 4 days ago

        Same here. When idle, the apps consume basically nothing. If they are just a web server that calls some PHP script, it takes basically no RAM at all when idle, and some RAM when actually used.

        Websites and phone apps are such unoptimized pieces of garbage that they are the sole reason for high RAM requirements. Also lots of background bloatware.

      • Specal@lemmy.world · 4 days ago

        This is resource reservation, and it happens at the OS level. If Chrome is using what appears to be a lot of RAM, it will be freed up once either the OS or another application requires it.

        It just exists so that an application knows that if it needs that resource it can use X amount for now.

    • stoly@lemmy.world · 3 days ago

      You just have to watch your favorite tablet get slower year after year to understand that a lot of this is artificial. They could make applications that don’t need those resources but would never do so.

    • jas0n@lemmy.world · 3 days ago (edited)

      Guy from '95: “I bet it’s lightning fast though…”

      No dude. It peaks pretty soon. In my time, Microsoft is touting a chat program that starts in under 10 seconds. And they’re genuinely proud of it.

    • qqq@programming.dev · 3 days ago (edited)

      I once went for lower-CAS-timing 2x 128MB RAM sticks (256MB) instead of 2x 256s with slower speeds, because I thought 512MB was insane overkill. I realized how wrong I was when trying to play the Star Wars Galaxies MMORPG: when a lot of people were on the screen, it started swapping to disk. Look up the specs for an IBM Aptiva, the first computer my parents bought, and you’ll understand how 512MB can seem like a lot.

      Now my current computer has 64GB, while most gaming computers went for 32GB at the time I built it. My workstation at work has 128GB, which really isn’t even enough for some workloads we have that use a lot of in-memory cache… And large servers can have multiple TB of RAM. My mind has been blown multiple times.

    • Shadywack@lemmy.world · 3 days ago

      We measure success by how many GBs we have consumed when the only keys pressed from power-on to desktop are our password. This shit right here is the real issue.

    • Aux@lemmy.world · 3 days ago

      You can always switch to a text-based terminal and free up your memory. Just don’t complain that YouTube doesn’t play 4K videos anymore.

        • Aux@lemmy.world · 2 days ago

          MPV doesn’t work in a terminal (well, technically it does, but what’s the point of 4K HDR video in ASCII mode?). Please don’t confuse a terminal emulator in GUI mode with a real text-mode terminal.

          • DefederateLemmyMl@feddit.nl · 2 days ago (edited)

            The point is that your example use case of “YouTube 4k videos” doesn’t need a browser full of bloated js garbage.

              • DefederateLemmyMl@feddit.nl · 2 days ago (edited)

                Actually a lot less than the browser. Under 300MB, I just checked, and that’s mostly just the network buffer, which is 150MB by default.

                • Aux@lemmy.world · 2 days ago

                  That’s about what my Slack is using, while being written in Electron, lol. Oh, you people…

          • uis@lemm.ee · 2 days ago (edited)

            KMSDRM is terminal enough for me. Fbcon too.

            EDIT: obviously not a dumb terminal over UART or the like.