Coding The New Way
I recently dove into Agentic Engineering with a course called “Principled AI Coding” by IndyDevDan. The course was recommended to me by A.D. Slaton of Stillriver Software Solutions and centers on using Aider, a powerful CLI tool for AI-driven coding that typically pairs with APIs from providers like OpenAI and Anthropic.
This got me thinking: could I swap the paid cloud APIs for a self-hosted, open-source LLM running locally?
The answer is yes, and it’s surprisingly simple.
Local LLMs With Ollama & Deepseek
By using Ollama, you can run powerful models on your own machine and pipe them directly into Aider. While several models are available, I’ve had the best results for coding tasks with deepseek-coder-v2.
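If you haven't pulled the model yet, a quick sanity check looks like this (assuming Ollama itself is already installed):

```shell
# Download the model weights (a one-time, multi-gigabyte download)
ollama pull deepseek-coder-v2

# Confirm the model responds before wiring it into Aider
ollama run deepseek-coder-v2 "Write a hello world in Python"
```

Once the model answers in the terminal, you know the local side is working and any remaining issues are in the Aider configuration.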
The Setup
Aider’s official installation docs are solid, so I’ll skip to the configuration. Here’s how you can switch between cloud APIs and a local model.
Standard Cloud API Setup
For the course, you’ll set your API keys in a .env file. Aider will automatically detect them.
OPENAI_API_KEY=<your-openai-key>
ANTHROPIC_API_KEY=<your-anthropic-key>
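With the keys in place, launching Aider from your project directory is all it takes. You can also pick the model per-run with the `--model` flag (the exact model names below are just examples; check Aider's docs for the current aliases):

```shell
# Aider reads the API keys from .env automatically
aider --model gpt-4o    # OpenAI
aider --model sonnet    # Anthropic Claude, via Aider's built-in alias
```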
Local Ollama Setup
To use a local model with Ollama, simply change your .env file to point to your local server and specify the model.
OLLAMA_API_BASE=http://localhost:11434
AIDER_MODEL=ollama/deepseek-coder-v2:latest
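To confirm everything is wired up, you can check that the Ollama server is reachable and then start Aider. This is a sketch of my workflow, not the only way to do it:

```shell
# Verify the local server is up and the model is installed
curl http://localhost:11434/api/tags

# With OLLAMA_API_BASE and AIDER_MODEL set in .env, this is all you need:
aider

# Or point at the model explicitly, overriding the .env value:
aider --model ollama/deepseek-coder-v2:latest
```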
That’s it! Aider now sends its requests to your local Ollama server instead of a cloud API, with no other changes to your workflow.
A Word of Caution: If you plan to follow the “Principled AI Coding” course exactly, I recommend sticking with the suggested cloud models (GPT-4, Claude 3, etc.). In my experience, local models don’t yet match the coding proficiency of their top-tier counterparts, and you may struggle to replicate the instructor’s results.
Have you tried this method? Share your favorite local models or any tips in the comments below!