
> macOS 15 uses ~5GB on startup without any app open

Sort of? macOS very aggressively caches things into RAM. It should be using all of your RAM on startup. That's why they've changed Activity Monitor to say "memory pressure" instead of something like "memory usage."

I'm typing this on an 8 GB MacBook Air and it works just fine. I've got ChatGPT, VSCode, Xcode, Blender, and PrusaSlicer minimized and I'm not feeling any lag. If I open any of them it'll take half a second or so as they're loaded from swap, but when they're not in the foreground they're not using up any memory.




Indeed, as I used to tell my ops colleagues when they pointed to RAM utilization graphs, "we paid for all of that RAM, why aren't we using it?"

"Uneaten food" <CHOMP CHOMP> "is wasted food." <CHOMP CHOMP>

Because OOM errors are oh so fun.

I write algorithms that operate on predictable amounts of data. It's very easy to work out the maximum amount of things we need to have and then allocate it all in fixed size arrays. If you allocate all your memory at startup you can never OOM at runtime. Some containers need over 100GB but like the parent comment said we've already bought the RAM.
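The pattern described above can be sketched in a few lines (Python here for brevity; the sizes and field layout are made up for illustration):

```python
from array import array

# Work out the worst case up front (hypothetical numbers).
MAX_RECORDS = 1_000_000

# Allocate everything at startup. If these succeed, the steady-state
# loop below can never OOM, because it only writes into existing slots.
values = array("d", bytes(8 * MAX_RECORDS))  # fixed-size array of doubles
counts = array("q", bytes(8 * MAX_RECORDS))  # fixed-size array of int64

def process(batch):
    # Operate in place on the preallocated buffers; no new allocations.
    for i, v in enumerate(batch):
        values[i] = v
        counts[i] += 1

process([1.5, 2.5, 3.5])
```

The trade-off is the one named in the comment: you pay for the worst case permanently, which is fine if you've already bought the RAM.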

I write algorithms that operate on less predictable amounts of data.

If you operate over all of your data every time it's a lot more predictable ;)

Caches are automatically released by the OS when demand for memory increases.

You eventually run out of caches to evict.

That is completely irrelevant to this discussion about using the RAM you’ve paid for.

At that point you can still fall back onto swap on NVME.

Doesn’t Apple use pretty damn quick NVMe? I wonder how much of a performance drop it actually is. Certainly not as bad as running a swap file on a 5400 rpm HDD…

Isn't that NVMe also very expensive to replace because it's tied to hardware identifiers? If you keep swapping all the time, surely the NVMe would be the first part to fail.

This was heavily debated in the 11.4 timeframe because there was a risk that this version of the OS could excessively wear the NVMe storage.

https://appleinsider.com/articles/21/06/04/apple-resolves-m1...

The issue was subsequently resolved but the consensus was with modern wear leveling this isn't so much a thing.

I have a 2021 MacBook Pro with the original drive. I use it heavily for development practically every day and just dumped the SMART data.

Model Number: APPLE SSD AP1024R

=== START OF SMART DATA SECTION ===

SMART overall-health self-assessment test result: PASSED

Available Spare: 100%

Available Spare Threshold: 99%

As always, YMMV


How often are OOMs caused by lack of RAM rather than programming errors?

> How often are OOMs caused by lack of RAM rather than programming errors?

You're right, but in a production deployment, that extra RAM might mean the difference between a close call that you patch the next day and an all-hands emergency that calls in devops and engineers together during peak usage.

source: been there


we're still talking about the MacBook, right?

> we're still talking about the MacBook, right?

na, this is just PTSD talking


I don't think macOS OOMs the way Linux does

(and to be honest, the way Linux acts on OOMs is quite debatable)


macOS can OOM, ish.

If you don't have any more disk space for swap, or memory pressure gets too high, you get the "You've run out of application memory" dialog box with a list of applications you can force quit, and macOS leaves it up to the user to decide what to kill instead of the system choosing automatically.


do you also say that about hdd space? about money in the bank?

It’s counterintuitive but I learned this best by playing RTS games. If you don’t spend money your opponent can outdo you on the map by simply spending their money. But the principle extends, everything you have doing nothing (buildings units etc) is losing. The most efficient process is to have all your resources working for you at all times.

If you don't have savings to spend for a potential change of tactics, larger players, groups or players with different strategies can easily overtake you as your perfectly efficient economy collapses.

Going to also echo the comment that this isn't an RTS


> this isn't an RTS

Yep. RTS is a context where the principles are more true.

In real life you aren’t in a 1-1 matchup with competitive success criteria.


> It’s counterintuitive but I learned this best by playing RTS games. If you don’t spend money your opponent can outdo you on the map by simply spending their money.

OK, hear me out over here:

We are not in an RTS.

Edit: in real-world settings lacking redundancy tends to make systems incredibly fragile, in a way that just rarely matters in an RTS. Which we are _not in_.


Agreed. Real life is not an RTS. Optimizing computer or business resources - kind of like one.

It's why I wake up at 3am to make sure my agents aren't waiting on me :D

Why wouldn't he say it about HDD space? Do you buy HDDs to keep them empty?

And as for the money analogy, what's the idea there, that memory grows interest? Or that it's better to put your money in the bank and leave it there, as opposed to buy assets or stocks, and of course, pay for food, rent, and stuff you enjoy?


Money analogy could better be put as one of:

1. Store your money in a 0% interest account—leave RAM totally unused—or put it in an account that actually generates some interest—fill the RAM with something, anything that might be useful.

2. Store your money buried in your backyard or put it in a bank account? If you want to actually use your money, it's already loaded into the bank.

Imperfect analogies because money is fungible. In either case though, money getting spent day-to-day (e.g. the memory being used by running programs) is separate.


Then why do you have any hard drive space available at this moment?

HDD is for storage. RAM is for speed?

Why do you even need RAM when you could run everything from your HDD at a much cheaper cost per MB?


Isn't it obvious?

Because wanting to utilize something as much as you can to get your money's worth, and wanting to fully exhaust it as a resource are two different things.


It's the same in all three cases: using HDD space, money and RAM for good purposes (disk cache) is useful, wasting it (Electron) is bad.

(Weird side question: are you by any chance the Jason Farnon who wrote IBFT?)


> about money in the bank?

Yes, generally. That's the entire idea behind the stock market.


> do you also say that about hdd space?

For slightly different reasons. My game drive is using about 900 GB out of 953 GB usable space - because while I have a fast connection, it's nicer to just have stuff available.

Same for some projects where we need to interface with cloud APIs to fetch data - even though the services are available and we could pull some of the data on demand, sometimes it's nicer to just have a 10 TB drive and to pull larger datasets (like satellite imagery) locally, just so that if you need to do something with it in a few weeks, you won't have to wait for an hour.


because memory access performance is not O(1) but depends on the size of what's in memory (https://www.ilikebigbits.com/2014_04_21_myth_of_ram_1.html). Every byte used makes the whole thing slower.

I am not following; isn't this just a graph showing that how fast operations happen is largely dependent on the odds that the data is in cache at various levels (CPU/RAM/disk)?

The memory operation itself is O(1), around 100 ns, and at a certain point we are doing full RAM fetches each time because the odds of the data being in CPU cache are low?
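Both effects can be seen from a quick, hedged Python sketch (timings are noisy and interpreter overhead dominates, so the absolute numbers mean little, but the trend usually survives): per-access cost is roughly constant for small tables and creeps up as the working set outgrows the CPU caches.

```python
import random
import time

def ns_per_access(n, accesses=100_000):
    """Average time to chase random indices through a table of n slots."""
    table = list(range(n))
    random.shuffle(table)     # a random permutation => unpredictable walk
    idx = 0
    start = time.perf_counter()
    for _ in range(accesses):
        idx = table[idx]      # each load depends on the previous one
    return (time.perf_counter() - start) / accesses * 1e9

# Each access is "O(1)", but the constant grows once the table no
# longer fits in L1/L2/L3 and loads start going all the way to DRAM.
for n in (1_000, 100_000, 2_000_000):
    print(f"{n:>9} slots: {ns_per_access(n):6.1f} ns/access")
```

The dependent load (`idx = table[idx]`) is important: it defeats prefetching and out-of-order execution, which is what exposes the raw memory latency.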

Typically O notation is an upper bound, and it holds well there.

That said, due to cache hits, the lower bound is much lower than that.

You see similar performance degradation if you iterate over a two-dimensional array with the wrong index order (varying the outer index fastest).
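That traversal-order effect is easy to reproduce; a rough Python sketch (in a compiled language the gap is far larger, since the interpreter adds constant overhead to every access):

```python
import time

N = 1000
matrix = [[0.0] * N for _ in range(N)]

def sum_row_major(m):
    # Walk each row contiguously: cache-friendly order.
    total = 0.0
    for row in m:
        for x in row:
            total += x
    return total

def sum_col_major(m):
    # Walk down columns: strides across rows, cache-hostile;
    # in Python the gap is smaller than in C but usually measurable.
    total = 0.0
    for j in range(N):
        for i in range(N):
            total += m[i][j]
    return total

for f in (sum_row_major, sum_col_major):
    t0 = time.perf_counter()
    f(matrix)
    print(f.__name__, time.perf_counter() - t0)
```

Both functions do the same N² additions; only the access pattern differs.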


O notation is technically meaningless for systems with bounded resources. That said, yes, the performance depends on the probability of cache hits, notably also in the TLB. For large amounts of memory used and random access patterns, assuming logarithmic cost for memory access tends to model reality better.

The author of that post effectively re-defines "memory"/"RAM" as "data", and uses that to say "accessing data in the limit scales to N x sqrt(N) as N increases". Which, like, yeah? Duh, I can't fit 200PB of data into the physical RAM of my computer and the more data I have to access the slower it'll be to access any part of it without working harder at other abstraction layers to bring the time taken down. That's true. It's also unrelated to what people are talking about when they say "memory access is O(1)". When people say "memory access is O(1)" they are talking about cases where their data fits in memory (RAM).

Their experimental results would in fact be a flat line IF they could disable all the CPU caches, even though performance would be slow.


Memory access performance depends on the _maximum size of memory you need to address_. You can clearly see it in the graph of that article where L1, L2, L3 and RAM are no longer enough to fit the linked list. However while the working set fits in them the performance scales much better. So as long as you give priority to the working set, you can fill the rest of the biggest memory with whatever you want without affecting performance.

RAM is always storing something, it’s just sometimes zeros or garbage. Nothing in how DRAM timings work is sensitive to what bits are encoded in each cell.

> Every byte used makes the whole thing slower.

This is an incorrect conclusion to make from the link you posted in the context of this discussion. That post is a very long-winded way of saying that the average speed of addressing N elements depends on N and the size of the caches, which isn't news to anyone. Key word: addressing.


Huh? There is no such thing as "empty memory". There is always something stored in memory; the important thing is whether you care about those specific bits or not.

And no, the article you linked is about caching, not RAM access. Hardware-wise, it doesn't matter what you have in the cells; access latency is the same. There is going to be some degradation with read/write cycles, but that is beside the point.


why is it not O(1)? It has to service within a deadline time, so it is still constant.

In macOS 15 there are two metrics: "Memory used" and "Cached Files"

I'm specifically talking about "Memory used" here.

In fact, on my 16 GB Mac, if I open apps that use ~8 GB of RAM (on top of the 5 GB I mentioned earlier), it starts swapping.


When you open up Activity Monitor, to the immediate left of the "Memory Used" and "Cached Files" that you see, you'll see the Memory Pressure graph that the guy above is talking about.

On my 64 GB M1 MacBook Pro right now, I have 53.41 GB of Memory Used, 10.72 GB of Cached Files, and 6.08 GB of swap, but Memory Pressure is green and extremely low. On my 8 GB M1 MacBook Air I just bought for OpenClaw, I'm at 6.94 GB Memory Used and 1.01 GB of Cached Files with 2.05 GB of Swap Used, and Memory Pressure is medium-high at yellow, probably somewhere around 60-70%.

You can open up the Terminal and run `memory_pressure` to get much more detailed data on what goes into calculating memory pressure - more than just the amount of swap used, it tracks swap I/O and a bunch of page and compressor data to get a more holistic sense of what's going on and how memory starved you're going to feel in practice.

In any case - I've been absolutely mindblown at how fast my 3 8GB M1 MacBook Airs I just bought for ~$350 brand new have been. Even with tons of Chrome tabs, multiple terminal windows, OpenClaw and Claude Code and VS Code all running, and a ton of development and testing, never once have they felt slow. Oftentimes they actually feel faster than my 64 GB M1 MacBook Pro, which kind of blows my mind and makes me wonder wtf is going on on my monster machine.

Moreover, my M1 MacBook Pro drains battery like crazy and uses a ton of charge, whereas the MacBook Airs stay below 10 watts essentially always, and even with Amphetamine keeping them on 24/7, with the display off they'll drop to a single watt of power draw. Truly insane stuff. I've lost all my concern about RAM, to be honest (which is shocking coming from someone who bought a top-of-the-line, maxed-out-RAM primary machine in 2021 specifically because I felt RAM was so important).


> I've been absolutely mindblown at how fast my 3 8GB M1 MacBook Airs I just bought for ~$350 brand new

Wait what? How did you manage that?


MacBook Air M1 was released six years ago. That’s pretty expensive for such an old machine!

They hold value very well. Still a perfectly good machine today, and probably a better deal than the Neo if you find one in good condition.

don't thinkpads from the similar time go for the same amount of money? seems like an alright price for a machine of that vintage, although thinkpad is obviously superior here since it would always be able to run linux or windows (well that one is not guaranteed) without much, if any, trouble

OpenClaw found some sweet deal? /s

Yes, the person you are replying to has explained that.

The old mental model of how ram and swap works doesn't fit neatly to how modern macos manages ram. 8GB is acceptable, although on the lower end for sure.


The old mental model doesn't fit how any OS manages RAM. Every OS plays all sorts of fun guessing games about caching, predicting what resources your program will actually need etc. The OS does a lot of work to ensure that everything just hums along as best as possible.

How do you define "swapping?" Even on Intel Macs, the memory statistics don't map the way one might expect. Be careful when making assumptions about what those metrics actually mean.

I mean at that point (13 GB memory used), the "Swap used" is at several hundred megabytes.

And if I open more apps (or browser tabs), the "Swap used" keeps increasing, and the "memory pressure" graph switches color from green to yellow.

The color of that graph is the indicator I'm using to know that I should close my browser tabs :p


After several days of usage, Activity Monitor usually shows "WindowServer" using 6 GB of RAM.

Yeah, 8 GB RAM does not cut it anymore. At least until Apple starts fixing the memory leaks in macOS.


Unused RAM is wasted RAM. If your machine isn't reporting memory pressure and/or the user isn't experiencing pageouts, then the machine is well-suited to the user's workload.

I'd rather my ram go to my page cache, not have bloated apps hoarding it.

But I thought Electron was the future?

Uptime of 13 days. My WindowServer is at 441 MB. ??? (32 GB RAM, M2 Max MBPro)

Probably a badly behaved app. Run `IOAccelMemory` to check.

Have you tried turning it off and on again?

I remember when Windows Vista had to contend against the same allegations when it was released. It did have a higher memory footprint, but a lot of the ridiculous usage numbers people had published were the SuperFetch just precaching commonly used programs to give better application startup times.

Tbf when Big Sur was released, it was leaking like crazy. It was a daily ritual for me to kill Dock and Finder after they've eaten all my RAM.

Ha, wasn't it Windows Vista that allowed you to plug in an SD card to use for swap space/fake RAM?

ReadyBoost was supported from Vista through Windows 10, and although Windows 11 recognises a ReadyBoost USB drive, I do not know if it uses it. When I put Windows 11 on two i5 8 GB machines and plugged in two ReadyBoost drives, it did not swap much to them, whereas Windows 7, under memory load, would use them - at least until I found CacheMem v2.1, which managed memory much better than Windows ever could.

Windows as far back as Windows/386 2.1 supported swap disks, i.e. fake RAM.


I found Google Chrome makes an M1 MacBook Air with 8GB RAM almost unusable, unless you're really careful to keep only a few tabs open. I'm curious what browser you were using and whether you had a similar experience.

I have heard this before and am curious what kind of sites you open in the tabs?

I have an 8 GB M1 Mac mini and I don't see any issue with browsing in Chrome (right now I have 11 tabs open).


It was my son’s laptop; he’s in high school. General Google Classroom / Google Docs / Gmail / web research stuff. He’s not technical at all. I bought him the 8GB machine thinking it would be fine, but it became a big problem for him.

I do think part of the problem was number of tabs open. It was a little better when I taught him how to manage tabs and I also turned up all the memory saving features in chrome.

But even with all of that, it would still slow down with what looked like a pretty minimal workload.

I spent a few hours with him on it, but he still had these kinds of issues.

It just seems like it requires a decent level of sophistication to work with a small RAM budget if you’re using Google software.


No, it doesn't work just fine with all that, or you aren't doing much.

Your SSD is swapping like crazy and will die really fast.

Just the Rust plugin in VS Code uses 3 GB of RAM.

Add a browser and you're already over 8 GB.


> Your SSD is swapping like crazy and will die really fast.

Just how quickly do you think the SSDs will die? Because there are a lot of 8GB M1 machines out there that have been getting daily use for five years, mostly with 256GB or 512GB storage configs. When do you expect them to fail?


Depends on the usage. I just know that with 48 GB I have a few TB, or tens of TB, written within a couple of hours when playing with homomorphic search.

So you're predicting that 8GB machines will fail prematurely based on extrapolation from an extreme niche use case that doesn't remotely fit on those machines?

Using an M2 8GB Mac Mini, I only ever ran into problems when trying generative fill in Photoshop. There I get insufficient memory errors if the selection is too large.

> I'm typing this on an 8 GB MacBook Air and it works just fine.

Most cool. Is it an M1?


Not the OP, but I have an M1 MBA and it handles light "coding" stuff quite well, though haven't tried VSCode+Zoom+bunch of other stuff, as my work laptop is a M1 MBP.

Same. I've been programming in Go on an M1 for years and perf is spectacular.

Early 2020 i3 MacBook Air


It also compresses memory. Many things in RAM compress really well.
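A quick stdlib check of why that works: zero pages and repetitive structures shrink dramatically, while already-random data doesn't (the sample data below is made up; real memory compressors like macOS's use faster algorithms than zlib, but the compressibility story is the same).

```python
import os
import zlib

# Typical patterns found in process memory: zero pages, repeated
# structures, and truly random bytes (which don't compress at all).
samples = {
    "zero page": bytes(4096),
    "repeated struct": b"\x00\x01\x02\x03deadbeef" * 256,
    "random bytes": os.urandom(4096),
}
for name, data in samples.items():
    ratio = len(zlib.compress(data)) / len(data)
    print(f"{name:15}: compresses to {ratio:.1%} of original size")
```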

Memory compression has been a feature on Windows PCs for years (decades, maybe?); it somehow doesn't prevent people from raising valid complaints about swapping with 8 GB of RAM.

I wonder, why is it physically painful for some Apple owners to admit that 8 GB is not enough? I've been using PCs for years, and I will be the first in line to point out their deficiencies and throw a deserved stone at MS; they never cease to provide reasons. Why is it so different with Apple?


Because 8GB is literally enough? There are multiple 8GB Macs in this house and they are fine. I wouldn't use them for development work but they're completely competent at the basics.

What are the basics? Of course one can always overbuy hardware compared to the tasks, but we are discussing usage fitting the laptop form factor. I would argue that for a laptop the basics are at least some kind of office white-collar work or similar. So most likely at least 2-3 Electron monstrosities, an office package or something along those lines, multiple loaded tabs in a browser (a few of which will be memory-leaking enterprise crap), a few communication apps, etc. Nothing really outlandish, only a handful of apps, but because they are all fat they will eat the 3 GB margin super fast and start swapping.

The storage is fast enough not to be too much of an issue, and the basics would be mostly a web browser; a lot of things can be done with only that. If you need more than a web browser and text editors, you probably want more than the Neo in the first place.

Tons of 8GB users out there who are happy. I'm on 16GB and it's definitely enough for a power user - running multiple coding environments, Docker, IDEs. macOS is really good with caching.

> I wonder, why is it physically painful for some Apple owners

This wasn't necessary. I was just pointing out that 8GB hardware is not the full story. It's also true with windows, as you correctly point out. If you're coming from a slow SSD, or even Linux (it's a relatively new feature to have on by default) you might be pleasantly surprised.

Also, I'm an Apple owner and I have no problem saying it's not enough for anyone on this website. I tried it for a few years as my "second screen" computer, and would bump against it all the time, with glorious screeching as the audio skipped. But, I'm also a developer/power user.

The majority of people aren't power users, and that's the target audience for this. Clearly.

8GB has been completely fine for every non power user I know. Again, the majority of people do everything within a browser, maybe play some music/video at the same time, maybe open an office type app. It's completely acceptable for that, and that should not surprise you, as someone who has an understanding of memory usage and paging, and high bandwidth SSDs, in the slightest.


Gruber said something[0] that parallels your point, and emphasizes the target audience for this Mac:

> If you know the difference between sRGB and P3, the Neo is not the MacBook you want.

Apple has made extensive tradeoffs to make this price point, but they all seem to be reasonable tradeoffs for casual users.

[0]: https://daringfireball.net/2026/03/599_not_a_piece_of_junk_m...


Perhaps because it's enough for a lot of things. I only came up against the 8GB limit when I ran a LLM locally using Ollama. It worked but wasn't workable.

8GB isn't ideal though, and 16GB would've expanded its capacity to do more things. But as soon as I want to do more I shuffle over to my PC with its dedicated GPU and 32 GB of RAM.

I'm guessing Apple cuts capability to the lower end so as not to hurt sales of the higher end. Usage profile is often dependent on context. There are enough non-power users (when mobile) like me that 8GB isn't ideal but it's enough. And if it wasn't enough we could've paid more for the 16GB, but I personally decided it wasn't worth the ridiculous Apple ram price premium.

So these are my reasons for saying 8GB is enough. I'm also using an M1 MacBook Air, so the puniest of the lineup. The next laptop I'm considering is possibly a ThinkPad with Linux, so I'm no macOS fanboi.


Linux does this too. It always uses close to 100% of your RAM, filling otherwise-free memory with cache.

Actually figuring out free RAM is kind of confusing.
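On Linux the confusion is visible directly in /proc/meminfo: "MemFree" is tiny on a healthy box, while "MemAvailable" (free plus reclaimable cache) is the number that actually matters. A small Linux-only sketch:

```python
def meminfo():
    """Parse /proc/meminfo into a dict of kB values (Linux only)."""
    info = {}
    with open("/proc/meminfo") as f:
        for line in f:
            key, _, rest = line.partition(":")
            info[key] = int(rest.split()[0])   # value in kB
    return info

m = meminfo()
# MemFree is usually far smaller than MemAvailable, because the kernel
# keeps "free" RAM filled with page cache it can drop on demand.
print("MemFree:     ", m["MemFree"], "kB")
print("MemAvailable:", m["MemAvailable"], "kB")
print("Cached:      ", m["Cached"], "kB")
```

This is why tools like `free` grew a separate "available" column: naively reading "free" makes a perfectly healthy machine look out of memory.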


Very confusing.

> I've got ChatGPT, VSCode, XCode, Blender, and PrusaSlicer minimized and I'm not feeling any lag.

Obviously, when they're dumped, then there's nothing to notice.

> but when they're not in the foreground they're not using up any memory.

... well ... yes.


What are you slicing?

What do you find compelling with Prusa slicer over orca slicer?


I'm printing a new multi-laptop stand that can accommodate a work laptop I've just received. I've actually never used Orca, PrusaSlicer is the first one I tried and it's done everything I've needed.

There's a lot of different kinds of "using". "Memory pressure" includes some kinds of caching (ie running idle daemons when they could get killed) and not others (file caching). And there are also memory pressure warnings (telling processes to try to use less memory), so there's a lot of feedback mechanisms.

I don't suggest sitting and looking at Activity Monitor all day. I think that is a weird thing to do as a user. If you would like to do that in an office in Cupertino or San Diego instead then you can probably figure out where to apply.


i think the main point that GP was trying to make is that depending on the workload, 8 GB of memory might not be an issue.

the keywords here are "depending on the workload".

edit: i was thinking that it's gonna be interesting to see I/O performance on storage; that might end up determining if those 8 gigabytes are actually decent or not.



