Today, I tested Ollama and Locally AI more extensively on my M2 MacBook Air, and both pushed the machine hard. It’s no surprise that a serious local AI setup calls for something like an Apple M5 Pro or M5 Max with at least 16 GB of RAM, ideally 24 GB. My M4 Pro Mac mini has 24 GB, so I could run the models there and use them remotely over an SSH session. This experimentation puts any plans to replace my aging M2 MacBook Air into perspective.
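
For anyone curious, here’s a minimal sketch of what that remote setup can look like, assuming Ollama is running on the Mac mini with its default port and a model already pulled; the hostname, username, and model name are placeholders. The idea is to forward Ollama’s port over SSH, then talk to its HTTP API as if it were local:

```python
# First, forward the Mac mini's Ollama port over SSH (run in a terminal):
#   ssh -L 11434:localhost:11434 user@mac-mini.local
import json
import urllib.request

# Ollama's HTTP API listens on port 11434; with the tunnel up,
# localhost on the laptop now reaches the Mac mini.
req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps({
        "model": "llama3.2",      # placeholder: any model pulled on the Mac mini
        "prompt": "Why is the sky blue?",
        "stream": False,          # return a single JSON object instead of a stream
    }).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])
```

The nice part is that nothing on the laptop has to change: the heavy lifting happens on the Mac mini, and the MacBook Air only handles the tunnel and a small HTTP request.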