@dschaub love it! If I may share one nitpick: I think this chart would be even better if the lines continued horizontally until the next data point and then went straight up. (In plotly you can achieve this with `line_shape="hv"`.) It more accurately represents what happened, since there was no slow climb from one level to the next but rather a jump.
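For anyone curious what "hv" does under the hood: it holds each y value flat until the next x, then steps vertically. A minimal sketch of that expansion in plain Python (the years and GB values below are hypothetical placeholders, not the chart's actual data):

```python
# Expand (x, y) points into the horizontal-then-vertical ("hv") step path
# that plotly draws with line_shape="hv": each y holds flat until the next x.
def hv_step_path(xs, ys):
    path = []
    for i in range(len(xs) - 1):
        path.append((xs[i], ys[i]))      # original point
        path.append((xs[i + 1], ys[i]))  # hold y flat until the next x
    path.append((xs[-1], ys[-1]))        # final point
    return path

# Hypothetical base-RAM data: year, GB
points = hv_step_path([2012, 2021, 2023], [4, 8, 16])
```

So the line stays level and then jumps straight up at each new data point, rather than sloping between them.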
You are absolutely correct that slopes are sub-optimal.
It would still be wrong, though, as it treats each year as a single atomic point in time.
Next one I build, I'll consider that change.
Welcome to the new normal.
I'm sure in the 2030s we'll be complaining about 16GB not being enough, but that's next decade's problem.
Given that most Apple Intelligence features are supposed to work (if not well) on 8GB Mac models, I think we can trust that that isn't the case?
That's really hard.
At any given time there is a huge range in what consumers can pay for RAM, due to type, functionality, density, speed, latency, etc.
We have no idea what kind of deals Apple gets for its RAM.
The chips are fairly standard, but DDR5X hasn't really been consumer RAM either.
The error bars might be wider than the chart.
@dschaub i mean... median for the time period would at least be sufficient? it was mostly a joke, but now I'm actually curious
@dschaub and/or the upgrade price per mb of ram (for the top end available?). I guess what i'm getting at is the prices charged for ram by apple are high.
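The kind of summary floated above (median price, $/GB for upgrades) only takes a few lines to compute; a sketch with entirely made-up placeholder prices, not actual Apple upgrade pricing:

```python
from statistics import median

# Hypothetical upgrade options: (year, upgrade cost in USD, GB added).
# These numbers are placeholders, not real Apple pricing.
upgrades = [
    (2019, 400, 16),
    (2021, 200, 8),
    (2023, 800, 24),
]

# Normalize each upgrade to dollars per GB, then take the median
# over the period as a single summary figure.
price_per_gb = [cost / gb for _, cost, gb in upgrades]
median_price_per_gb = median(price_per_gb)
```

With real upgrade prices substituted in, the same two lines would give the "median $/GB for the time period" being asked about.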
I roughly answered that in this post:
https://mstdn.social/@dschaub/111813544514647792
... around a TB of memory, which is obviously not needed.
It isn't that the slope needs to continue at that rate, it is just that it can't stay flat forever.