If you're looking to run DeepSeek (specifically deepseek-coder) offline on your Ubuntu machine without worrying about GPUs, you're in the right place!
✅ Works even without a GPU
✅ Just 2 simple commands
✅ Completely offline usage after download
🔧 Step 1: Install Ollama
Run this command in your terminal to install Ollama:
curl -fsSL https://ollama.com/install.sh | sh
✅ This installs Ollama and sets it up as a system service.
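If you want to confirm the install worked before moving on, a quick sanity check looks like this (the install script typically registers Ollama as a systemd service named ollama on Ubuntu):

ollama --version          # prints the installed Ollama version
systemctl status ollama   # check that the background service is active (running)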
📌 You may see a message like:
WARNING: No NVIDIA/AMD GPU detected. Ollama will run in CPU-only mode.
No problem! It still works perfectly on CPU-only systems!
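If you're curious whether you really are in CPU-only mode, you can peek at the service logs. This sketch assumes the systemd unit is named ollama, as set up by the install script above:

journalctl -u ollama -e   # recent Ollama logs; the CPU-only warning shows up here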
🧐 Step 2: Run DeepSeek Coder Locally
Now, run the DeepSeek Coder model using:
ollama run deepseek-coder
Ollama downloads the model automatically the first time you run this command (so you'll need an internet connection once); every run after that is fully local and offline.
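If you'd rather separate the download from the chat session (handy when you want to grab the model on a good connection and go offline later), you can pull it explicitly first. The 1.3b tag below is one of the smaller deepseek-coder variants listed in the Ollama library at the time of writing; double-check the available tags for yourself:

ollama pull deepseek-coder:1.3b   # download a smaller variant up front (friendlier on CPU-only machines)
ollama list                       # confirm the model is stored locally
ollama run deepseek-coder:1.3b    # start an interactive session; type /bye to exit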
💻 Features
- Runs entirely offline after the first model download
- Works on CPU or GPU
- Lightweight installation
- Perfect for offline development, coding, or learning AI locally
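Because Ollama also exposes a local HTTP API (on port 11434 by default), you can script against the model from other tools on the same machine, still fully offline. A minimal sketch using the /api/generate endpoint:

curl http://localhost:11434/api/generate -d '{
  "model": "deepseek-coder",
  "prompt": "Write a Python function that reverses a string.",
  "stream": false
}'
# Returns a JSON object; the generated text is in the "response" field.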
🎮 YouTube Tutorial
👉 Watch on YouTube
💬 Questions?
Feel free to comment, connect with me on LinkedIn, or follow more updates at bhuvaneshm.in!