The Apple quandary (minor rant)

@wayne_rowley if you don't need the last gen why not simply buy an M1 Mac Studio?

The base one with an M1 Max comes with 32GB I think.

Maybe get it second hand or refurbished.
 
It’s an option! The prices on the base M1 Studios refurbished are pretty good, and they are still very good computers. The only thing that puts me off is how long Apple will support them with OS updates now that the M3 is out. It’s why I favour an M2 Pro Mini.
 
Apple is of course happy to take our money, like any other company. But this is not a conspiracy. They decided to integrate the memory to get massive bandwidth and to share it with the GPU.

Intel CPUs only go up to 94GB/s.

IIRC the AMD Epyc CPUs can get as high as something like 200GB/s.

The Apple M3 Max goes up to 400GB/s.


Yep. It's never coming back.
Both AMD and Intel have workstation chips with eight channels of DDR5, which is well over 200GB/s.
Epyc has up to 12 channels.
But those parts are feeding 128 cores, and I think AMD is soon jumping to 192 cores.
So for audio tasks on a chip with 16 or so performance cores, I can't see bandwidth being an issue.

The latency of the memory system is also very important, as is how much of that bandwidth is actually available to a single thread in practice.
The M1 really impressed in that regard, I recall, and that is more useful than more bandwidth than you can utilise.
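For what it's worth, the channel-count figures are easy to sanity-check with back-of-envelope arithmetic. A sketch, assuming 64-bit DDR5 channels at 4800 MT/s (real parts vary in speed):

```python
def peak_bandwidth_gbs(channels: int, mt_per_s: int, bus_bytes: int = 8) -> float:
    """Theoretical peak DRAM bandwidth in GB/s: channels x bus width x transfer rate."""
    return channels * bus_bytes * mt_per_s / 1000

# Assumed DDR5-4800 with 64-bit (8-byte) channels
print(peak_bandwidth_gbs(8, 4800))   # 8-channel workstation -> 307.2 GB/s
print(peak_bandwidth_gbs(12, 4800))  # 12-channel Epyc       -> 460.8 GB/s
```

Peak figures only, of course; latency and what a single thread can actually pull still matter more for audio.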
 
The only thing that puts me off is how long Apple will support them with OS updates now that the M3 is out. It’s why I favour an M2 Pro Mini.
Nobody knows but my hunch is around 7-10 years just like the Intel machines.

I'd be surprised if it was less than 7 years which is what iPhones get.

Both AMD and Intel have workstation chips with eight channels of DDR5, which is well over 200GB/s.
Epyc has up to 12 channels.
Another measure is per-core bandwidth, in which, AFAIK, Apple is way ahead of everyone else.

The latest Xeon goes above 1GB/s per core. The M1 from 2020 can get above 100GB/s per core. See the attached images.

Source

Source
 

Attachments: 1700000333639.png, 1700000375461.png
Nobody knows but my hunch is around 7-10 years just like the Intel machines.

I'd be surprised if it was less than 7 years which is what iPhones get.
From when they were released or from today? 7 years from release seems fairly standard for Apple, longer for the Mac Pros and perhaps for the Studios too.

But a refurbished M1 system purchased today might only get supported OS updates for another 3-4 years.
 
From when they were released or from today? 7 years from release seems fairly standard for Apple, longer for the Mac Pros and perhaps for the Studios too.

But a refurbished M1 system purchased today might only get supported OS updates for another 3-4 years.
From release date.

The M1 Mac Studio was released in 2022 though.
 
True, but only to a point. 24GB won’t handle full BBC Pro
I get on just fine, as a fellow hobbyist mind you, with BBCSO Pro on M1 16GB RAM. Of course it depends on how "heavy" you mean: if I wanted several mic signals on every instrument in a large orchestral piece, I would expect to freeze tracks... I just haven't needed to.

As amazing as Apple SoC is, it can't fit 64GB of data on 24GB of RAM. Sure, it can swap it out and stream it 'very quick' but it's not the same and everyone knows that.
I said you could avoid preloading as much in the first place. I didn't say you could fit 64GB of data into 24GB of RAM, and I didn't say anything about swapping. Next time you speak to everyone, could you ask them how they define "the same"?

Really wish people wouldn’t say this. Fast swapping is useful but it’s not a substitute for more ram and unified memory isn’t magic.
I really wish people would read what I wrote, rather than imagining I said anything about swapping. :) Seriously, I appreciate that fast swap is a common explanation for why Apple silicon requires less memory for general applications, but it's absolutely not the argument I was making. Swap should be irrelevant to sampled instruments (and if it is happening, something's probably broken).

Instead, I was saying: don't fill your memory up with samples in the first place. You only need to pre-load enough sample to cover the time between pushing the key and getting the rest of the sample data from disk. If your disk (and intervening machinery) can deliver that data quickly enough on-demand, then that pre-load requirement can be very small indeed.
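To put rough numbers on that, here's a sketch with made-up but plausible figures (24-bit stereo samples at 48 kHz, preload sized to cover the disk's delivery latency):

```python
def preload_bytes(latency_ms: float, sample_rate: int = 48_000,
                  channels: int = 2, bytes_per_sample: int = 3) -> int:
    """Sample bytes needed per zone to bridge the gap until the disk delivers."""
    return round(latency_ms / 1000 * sample_rate * channels * bytes_per_sample)

print(preload_bytes(5))   # ~5 ms NVMe-class latency  -> 1440 bytes
print(preload_bytes(60))  # ~60 ms worst-case latency -> 17280 bytes (~17 KB)
```

Multiply by the thousands of zones in a big template and you can see why the preload setting, not total library size, dominates a sampler's RAM footprint.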
 
Apple is of course happy to take our money, like any other company. But this is not a conspiracy. They decided to integrate the memory to get massive bandwidth and to share it with the GPU.

Intel CPUs only go up to 94GB/s.

IIRC the AMD Epyc CPUs can get as high as something like 200GB/s.

The Apple M3 Max goes up to 400GB/s.


Yep. It's never coming back.
I think your info may be out of date? Also, is the Apple M3 Max actually equivalent to a "desktop" processor from Intel or AMD, or is it more equivalent to, say, a Xeon or Epyc? I think the Xeon is around 300 and Epyc is around 400 now. But maybe those are clusters of processors or something - I don't follow that closely.

Given the costs, I'd pit it more against the high end of both, though I've no doubt Apple is more power efficient. I don't see Apple's lead in any measure except power efficiency being maintained over the medium term, but surprises happen. Of course, efficiency is a wee bit important for laptops, right? Hehe. A laptop with something equivalent by Intel would probably look like the Delorean from Back to the Future at this point until they can shrink the microns further.

Intel is throttling/not optimizing bandwidth in the comparatively low-end processors and Apple is throttling the amount of RAM they ship with. As always, pick yer poison.
 
True, but only to a point. 24GB won’t handle full BBC Pro or a heavy Cinesamples template (my two large orchestral libraries).

Besides, with the Apple pricing ladder an M2 or M3 with 24 GB gets very close to a Pro with 32/36GB.
M2 Max MBP user here with 96 gigs of ram. BBC SO Pro template with all instruments loaded and three mics enabled on each instance takes a little under 20 gigs of total ram. Have preloads set as low as they go. Loads quick and have had no issues. While the ram can't hold more than what you have, lowering the preloads on all the sample players can make a big difference. My OT Berlin template that used to run at roughly 105-ish gigs on my 2013 Mac Pro runs around 60-ish gigs on the MBP.

When I went to upgrade I wanted something portable for a while and just maxed everything out as I always want room to grow. However, knowing what I know now, if I was in a position where I needed to save, I could have been fine with 24-32 gigs for most of the projects I have completed since purchasing.
 
I know you want a simple setup - perfectly understandable. I too despise Apple's price-gouging premium on RAM and limited options.

But, to solve the problem of going Mac without breaking the bank: if you already have an i5 that has, or can support 64G or more, add VEPro and run it as a slave. You would only need a network switch, no additional audio hardware.

Then you can make the most of running your DAW of choice on the Mac, and offload samples to the PC.
I've been doing this for almost a year with DP (previously ran multiple PCs with Nuendo), and love it. I only have a MacBook Pro M1, first gen, with 16G Ram, which I picked up used for $800, and it flies. The slave is an older i7 with 64G. I prefer this to localized sample hosting as it means I never have to wait for projects to load.

I've mixed several films, written scores, working on a pro-audio video series, etc. Not one problem running DP, Logic, Studio One, ProTools or Nuendo in this configuration.

I do suggest a CalDigit dock though for iLok keys, USB/TB interface, 4k/60Hz displays, etc. Not cheap, but they work where most hubs don't.
 
I get on just fine, as a fellow hobbyist mind you, with BBCSO Pro on M1 16GB RAM. Of course it depends on how "heavy" you mean: if I wanted several mic signals on every instrument in a large orchestral piece, I would expect to freeze tracks... I just haven't needed to.
Where do you store your samples? I’m currently using external SATA SSDs over USB.
 
OP?

Also, enabling only tracks you need as you need them can save a ton of RAM.
No, preload buffers are currently default. However I’m running on a 2018 mid range Mac Mini that streams from external SATA drives over USB.
 
M2 Max MBP user here with 96 gigs of ram. BBC SO pro template with all instruments loaded and three mics enabled on each instance takes a little under 20gigs of total ram. Have preloads set as low as they go. Loads quick and have had no issues. While the ram can’t hold more than what you have, lowering the preloads on all the sample players can make a big difference. My OT Berlin template that used to run at roughly 105 ish gigs on my 2013 Mac Pro runs around 60ish gigs on the the MBP.

When I went to upgrade I wanted something portable for a while and just maxed everything out as I always want room to grow. However, knowing what I know now, if I was in a position where I needed to save, I could have been fine with 24-32 gigs for most of the projects I have completed since purchasing.
That’s helpful. Are those low preload buffers dependent on samples being streamed from the internal SSD?
 
That’s helpful. Are those low preload buffers dependent on samples being streamed from the internal SSD?
No. I'm streaming from NVMe Thunderbolt (2 lane) drives, running Kontakt minimum preloads, never a problem. Projects load quicker that way, too. I'm on M1 Mac Studio, DP11.

Any SSD, SATA or TB should serve VI samples from min preload just fine.

No, preload buffers are currently default. However I’m running on a 2018 mid range Mac Mini that streams from external SATA drives over USB.
So, your current RAM requirements are more substantial than they will be with min preload. Try it with your current setup. You may be pleasantly surprised. And you can use the data when calculating what you'll need in your next computer.

When I was running a 5,1 Mac Pro, I used large preloads as I had lots of RAM and wanted to eke as much performance from those aging CPUs as possible. A larger preload theoretically means a lower CPU hit, or at least that was my operating theory at the time...
 
Where do you store your samples? I’m currently using external SATA SSDs over USB.
BBCSO is on external SSD (Samsung T7) which connects as USB 3.1; this is an M1 from 2020, though, so you'll be able to do better today. Preloading 1/4 the Spitfire default works fine, and I've run with lower - especially if using larger rendering buffers. (I do a lot of experiments, so please do corroborate numbers with others in case I've mis-remembered.)

For convenience, I also keep some smaller libraries like Colossus on the internal SSD: those run with even homeopathic amounts of preload. I was surprised how difficult it was to break them. It's not cost-effective to keep samples on internal SSD, but it tells me that a Thunderbolt SSD - considerably more affordable than they were three years ago - might be a smart move if it ever became limiting.
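A rough feel for why even SATA-class drives hold up, sketched with assumed figures (uncompressed 24-bit stereo streams at 48 kHz; real players use compressed formats, so actual demand is lower still):

```python
def stream_mb_per_s(voices: int, sample_rate: int = 48_000,
                    channels: int = 2, bytes_per_sample: int = 3) -> float:
    """Sustained MB/s needed to stream `voices` simultaneous voices from disk."""
    return voices * sample_rate * channels * bytes_per_sample / 1e6

print(stream_mb_per_s(100))  # 100 stereo voices -> 28.8 MB/s
print(stream_mb_per_s(500))  # 500 stereo voices -> 144.0 MB/s
```

Even 500 simultaneous voices sit well under the ~500 MB/s a SATA SSD sustains; it's random-access latency, not throughput, that usually decides how small the preload can go.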
 
Users that need more than 16GB of RAM are probably less than 10% of Mac users.
I think that's an underestimation, as a *lot* of software developers use Macs these days, and 16GB of RAM is nowhere near enough for anything serious. It's still likely a minority, but Apple know that the employers will 'pay up' for the additional specs, and so don't care.

You're correct in the sense that it's supply and demand... personal/home users don't need much, and business/professional users tend to have the 'expenses' to pay the higher premium, as a write-off.

The people who suffer are those of us wanting to use a good machine, but not able to write it off as a business expense.
 
I think that's an underestimation, as a *lot* of software developers use Macs these days, and 16GB of RAM is nowhere near enough for anything serious. It's still likely a minority, but Apple know that the employers will 'pay up' for the additional specs, and so don't care.

You're correct in the sense that it's supply and demand... personal/home users don't need much, and business/professional users tend to have the 'expenses' to pay the higher premium, as a write-off.

The people who suffer are those of us wanting to use a good machine, but not able to write it off as a business expense.
Software developers must be a tiny minority of users surely?
 
Software developers must be a tiny minority of users surely?
Probably... although you'd also include any content creators as well. Any professional use, basically, which was previously mentioned, but it seems (from my perspective) that far more businesses give their employees high-spec macbooks, than I remember 10+ years ago.

Honestly, without true figures from Apple as to demographics and actual cost - which we'll never get - it's all just guesswork anyway.

It's absolutely true that most users of most devices don't care about the specs - they just want the latest shiny that has a web browser and/or the social media apps. Gamers will buy PC or consoles.
Cloud services are all Linux, often hosted by Amazon, who, like the other big tech companies, build their own hardware these days.

The days of the 'personal computer' are moving further away. Sure, we all have phones that are computers now, but they're all moving towards being remote access for cloud services as well.
 
I think that's an underestimation, as a *lot* of software developers use Macs these days, and 16GB of RAM is nowhere near enough for anything serious.
Whoa there :) Not sure what counts as serious or what you work on yourself, but I've worked on big data applications, simulation, signal processing, etc., all on 16GB or less. They might use hundreds of GB in production, but it's rarely good practice for devs to work against full-sized data sets at all, and certainly not on their own machines.

it seems (from my perspective) that far more businesses give their employees high-spec macbooks, than I remember 10+ years ago.
This does happen all the time, but IME that isn't evidence the specs are necessary: it's evidence that most firms haven't the slightest clue what they're doing when issuing IT to employees. (Clueful firms may still issue over-specced hardware, since aggressively standardising usually saves more money than buying cheaper models for some roles and more powerful units for others.)

I think you've put your finger on two important sources of "spec inflation", though, and Apple's "Mn Pro" nomenclature doesn't help. When the M2 line-up was released, several reviewers concluded that the M2 Pro was unnecessary for most pros.

(Of course that didn't stop my inner Gollum from wanting the precious anyway.)
 