• disguy_ovahea@lemmy.world

    And it’s not RAM, it’s UM for an SoC. The way memory is used changed with the introduction of Apple Silicon.

    • Billiam@lemmy.world

      “Unified” only means there’s not a discrete block for the CPU and a discrete block for the GPU to use. But it’s still RAM: specifically, LPDDR4x (for M1), LPDDR5 (for M2), or LPDDR5X (for M3).

      Besides, low-end PCs with integrated graphics have been using unified memory for decades; no one ever said “They don’t have RAM, they have UM!”
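
      For what it’s worth, here’s a minimal sketch of that idea in Swift/Metal (the buffer size and contents are just illustrative): a shared-storage buffer is a single allocation of plain system RAM that both the CPU and the GPU can touch.

      ```swift
      import Metal

      // A .storageModeShared buffer lives in ordinary system RAM and is visible
      // to both the CPU and the GPU without an explicit copy: "unified", but still RAM.
      guard let device = MTLCreateSystemDefaultDevice() else {
          fatalError("No Metal device found")
      }
      print("Unified memory:", device.hasUnifiedMemory) // true on Apple Silicon

      let count = 1024
      let buffer = device.makeBuffer(length: count * MemoryLayout<Float>.stride,
                                     options: .storageModeShared)!

      // The CPU writes straight into the same allocation a GPU kernel would read.
      let values = buffer.contents().bindMemory(to: Float.self, capacity: count)
      for i in 0..<count { values[i] = Float(i) }
      ```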

      • disguy_ovahea@lemmy.world

        Yes, that’s true, but it’s still an indicator of an uninformed reporter.

        Apple Silicon chips pass data from one dedicated core directly to another without needing to go through memory, hence the smaller processor cache. There are between 18 and 58 cores in the M3 (model dependent). The architecture works very differently from the conventional CPU/GPU/RAM model.

        I can run FCP and Logic Pro and have memory to spare with 16GB of UM. The only thing that pushes me into swap is Chrome. lol
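
        If you want to check that on your own machine, here’s a rough sketch that just reads what the OS reports (installed memory and current swap use) via ProcessInfo and the `vm.swapusage` sysctl; it says nothing about any particular app:

        ```swift
        import Foundation

        // Rough sketch: installed memory and current swap use, straight from the OS.
        let installedGB = Double(ProcessInfo.processInfo.physicalMemory) / 1_073_741_824
        print(String(format: "Installed memory: %.0f GB", installedGB))

        var swap = xsw_usage()
        var size = MemoryLayout<xsw_usage>.stride
        if sysctlbyname("vm.swapusage", &swap, &size, nil, 0) == 0 {
            print(String(format: "Swap in use: %.1f MB", Double(swap.xsu_used) / 1_048_576))
        }
        ```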

        • BearOfaTime@lemm.ee

          It’s a pointless distinction.

          And in this case, it makes 8gig look even worse.

          • disguy_ovahea@lemmy.world

            Maybe you’re not familiar with the apps I’m referring to. Final Cut Pro and Logic Pro are professional video and audio workstations.

            If I tried to master an export from Adobe Premiere Pro in Pro Tools on a PC, I’d need 32GB of RAM to prevent stutter. I only use ~12GB of 16GB doing the same on Apple Silicon.

            8GB of UM is not for someone running two pro apps at once. It’s for grandma to do her online banking and check her email and Facebook.

        • Billiam@lemmy.world

          it’s still an indicator of an uninformed reporter.

          My dude, you’re literally in here arguing that because Apple uses one blob of memory for both the CPU and the GPU, that blob is somehow “not RAM.” Apple’s design might give fantastic performance, but that’s irrelevant to the fact that the memory on the chip is RAM of known and established standards.

    • BearOfaTime@lemm.ee

      Like has been done on laptops with on-board video cards since, well, forever?

    • n3m37h@lemmy.dbzer0.com

      Dude, it’s just GDDR#, the same stuff consoles use. PCs have had this ability for over a decade now, mate; Apple is just good at marketing.
      What’s next? When VRAM overflows it gets dumped into regular RAM? Oh wait, PCs can do that too…

      • disguy_ovahea@lemmy.world

        With independent CPU and GPU, sure. There’s no SoC that performs anywhere near Apple Silicon.

        • n3m37h@lemmy.dbzer0.com

          According to benchmarks, the 8700G is on average 22% slower than the M3 in single-core and 31% faster in multi-core; its FP32 performance is 41% higher than the M3’s, its AI performance is 54% slower, and it also uses 54% more energy.

          What about those stats says AMD can’t compete? The 8700G is an APU, just as the M3 is.
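
          To be concrete about what those percentages mean when you flip them around, here’s a tiny arithmetic sketch (the baseline score of 100 is made up; only the quoted deltas are real):

          ```swift
          import Foundation

          // Illustrative only: a made-up baseline score of 100 for the M3;
          // the only real numbers here are the quoted percentages.
          let m3Single = 100.0
          let amdSingle = m3Single * (1 - 0.22)   // "22% slower" single-core => 78
          let m3Lead = (m3Single / amdSingle - 1) * 100
          print(String(format: "M3 single-core lead over 8700G: ~%.0f%%", m3Lead)) // ~28%

          let m3Multi = 100.0
          let amdMulti = m3Multi * (1 + 0.31)     // "31% faster" multi-core => 131
          print("8700G multi-core score relative to M3:", amdMulti)
          ```

          In other words, “22% slower” for the 8700G is the same data point as “roughly 28% faster” for the M3.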

          • disguy_ovahea@lemmy.world

            I’m talking about practical-use performance. I understand your world; you don’t understand mine. I’ve been taking apart and upgrading PCs since the 286. I understand benchmarks. What you don’t understand is how macOS uses the SoC in ways where benchmarks =/= real-world performance. I’ve used pro apps on powerful PCs and powerful Macs, and I’m speaking from experience. We can agree to disagree.

            • n3m37h@lemmy.dbzer0.com

              I grew up with a Tandy 1000 and was always getting yelled at for taking it apart, along with just about every PC we owned after that, too.

              Benchmarks are indicative of real-world performance for the most part. If they were useless we wouldn’t use them, kinda like userbenchmark.

              The one benefit Apple does have is owning its own ecosystem, where they can modify the silicon/OS/software to work better with each other.

              That does not mean the M3 is the best there is and can’t be touched; claiming so is just misleading.

              The 8700G is gonna stomp the M3 using Maxon’s software suite, just as the M3 will stomp the 8700G using Apple’s software suite.

              Then, on top of that, the process node used to manufacture said silicon is different (3nm vs 4nm); that alone allows for roughly a 20% performance difference, give or take, just like every process node change in the past decade or so.

              I’ll take the loss on the experience part, as the only Apple product I own is an Apple TV 4K, but there are many nuances you’ve obviously glossed over.

              Is the M3 a good piece of silicon? Yes.
              Is it the best at EVERYTHING? Of course not.
              Should Apple give up because they are not the best? Fuck no.

              • disguy_ovahea@lemmy.world

                Man, you’re kinda off the point. This is about how much UM is appropriate for a base model. I’m simply saying that the architecture of an SoC uses UM exclusively as a storage liaison, since the CPU and GPU are cores of the same chip. It simply does not mean the same thing as 8GB of RAM in a standard architecture. As a pro app user, 16GB is enough. 8GB is plenty for grandma to check her Facebook and do her online banking.

                • n3m37h@lemmy.dbzer0.com

                  There’s no SoC that performs anywhere near Apple Silicon.

                  Am I missing the point really? UM is not a new concept. Specifically look at the PS5/X:SX

                  https://www.pcgamer.com/this-amd-mini-pc-kit-is-likely-made-out-of-b0rked-ps5-chips/

                  Notice the soldered RAM and lack of video card? Kinda like what the M series does.

                  And when all is said and done, 8GB is not nearly enough, and Apple should be chastised for it, just like Nvidia when they first decided to make 5 different variations of the 1060, making sure 4 of those variations would become e-waste in a few short years, and again with the 3050 6GB vs the 3050 8GB.

                  • disguy_ovahea@lemmy.world

                    They both have independent CPU and GPU. On an SoC system, UM is not used to pass data from the CPU to the GPU; it’s exclusively a storage liaison. Therefore it’s used far less than in non-SoC applications.

                    The CPU and GPU are one chip. Learn about Apple Silicon SoC rather than trying to find a comparison. You won’t find one anywhere yet.