Set up Ollama for running self-hosted LLMs on macOS

I needed a self-hosted LLM running on my M4 Mac. This is how I did it.
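In outline, the setup comes down to installing Ollama, starting its server, and pulling a model. A minimal sketch, assuming Homebrew is installed; `llama3.2` is just an example model name, not necessarily the one used here:

```shell
# Install Ollama via Homebrew (the macOS app download works too)
brew install ollama

# Start the Ollama server in the background; it listens on localhost:11434
ollama serve &

# Download an example model (assumption: swap in whichever model you want)
ollama pull llama3.2

# Chat with the model interactively
ollama run llama3.2
```

Once the server is running, any Ollama-compatible client can talk to it at `http://localhost:11434`.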