If you’ve read that Apple Silicon Macs are great at AI inference because unified memory lets the GPU use all of the system memory, there’s some nuance to that claim. It’s mostly true, but the system reserves a portion of memory for itself. We did some testing here to see how much memory the system would allow for GPU use:
M1 with 16GB: 11GB (71.58%)
M4 Pro with 24GB: 17GB (71.58%)
M4 Max with 48GB: 38GB (80.53%)
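The GB figures above look like the stated percentages of total memory, rounded down to whole gigabytes. A quick arithmetic sketch checks that reading (the rounding-down assumption is mine, not stated in the original):

```python
import math

# (chip, total unified memory in GB, reported GPU-usable percentage)
measurements = [
    ("M1", 16, 71.58),
    ("M4 Pro", 24, 71.58),
    ("M4 Max", 48, 80.53),
]

for chip, total_gb, pct in measurements:
    usable_gb = total_gb * pct / 100
    # e.g. 16 GB * 71.58% = 11.45 GB, which rounds down to the listed 11 GB
    print(f"{chip} {total_gb}GB: {usable_gb:.2f}GB usable, listed as {math.floor(usable_gb)}GB")
```

Note that the listed limits aren’t hard-wired: the fraction the GPU may claim grows with total memory, which is why the 48GB machine fares better proportionally than the 16GB one.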
— marcedwards@mastodon.social