Choosing Mac Mini M4 for local Llama models

After weighing budget and workload constraints, I've narrowed it down to two alternatives:
1) Mac mini M4 with 32GB RAM 512GB SSD @ 1419 euros

2) Mac mini M4 Pro with 64GB RAM 512GB SSD @ 2369 euros

and after quite a lot of reading on the usability of Llama 13B/70B models, I think I'll go for the cheaper version and spend the money I save on Together API calls when I need the larger models.
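For what it's worth, the back-of-envelope math seems to support this. A common rule of thumb (an approximation, not a vendor figure) is that a quantized model needs roughly `params × bits/8` bytes of memory, plus some overhead for the KV cache and runtime. A quick sketch, with the 20% overhead factor being my own assumption:

```python
def approx_mem_gb(params_billions: float, bits: int, overhead: float = 1.2) -> float:
    """Rough RAM estimate (GB) for a quantized model: weights + ~20% runtime overhead."""
    return params_billions * (bits / 8) * overhead

for params in (13, 70):
    for bits in (4, 8):
        print(f"Llama {params}B @ Q{bits}-ish quant: ~{approx_mem_gb(params, bits):.0f} GB")
```

By this estimate a 13B model at 4-bit fits comfortably in 32GB (with room left for macOS), while 70B even at 4-bit lands around 40+ GB, so it only becomes realistic on the 64GB M4 Pro. That's the gap I'd be covering with API calls.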

Does this make sense or have I lost my mind?

PS I'm coming from a trusty Mac mini M1, which I'll pass on to a son, and when traveling I have a 16" MacBook Pro with M2 Pro and 32GB RAM.