Run DeepSeek-R1 locally with Ollama on Docker

January 30, 2025 · 2 min read

Would you like to run LLMs locally? Here is how you can do it with Ollama and the new DeepSeek-R1 model that is breaking new ground in AI. 🚀 For those of us passionate about pushing the boundaries of AI, this is a game changer. 💡