The Free "Copilot Killer": Setting Up DeepSeek in VS Code
Quick Summary: Key Takeaways
- The Cost: $0.00/month (Goodbye, $10 Copilot subscription).
- The Tool: We use the open-source Continue extension for maximum control.
- The Flexibility: Switch instantly between local (Ollama) and cloud (API) models.
- The Result: Full autocomplete, chat, and refactoring directly in your editor.
You have the model. Now you need the workflow. Running a model in a terminal is fun, but it doesn't help you ship code faster.
To truly replace GitHub Copilot, you need seamless integration into your Integrated Development Environment (IDE). This guide is the practical implementation layer of our comprehensive overview, The DeepSeek Developer Ecosystem: Why Open Weights Are Winning the 2026 Code War.
We are going to use free, open-source tools to bridge the gap between your raw DeepSeek model and your daily coding environment. Let's save you some money.
Step 1: The Secret Weapon (Continue.dev)
Don't use the generic "AI" extensions that lock you into a specific provider. We recommend Continue.
It is the leading open-source autopilot for VS Code that allows you to bring your own model (BYOM).
How to Install It
- Open Visual Studio Code.
- Click on the Extensions icon (Sidebar).
- Search for "Continue" (by Continue.dev).
- Click Install.
- Once installed, you will see a new sidebar icon that looks like a "play" button.
Step 2: Connecting the Brain (DeepSeek)
You have two choices here: Local (Privacy) or API (Power).
Option A: The Local Route (Free & Private)
If you followed our previous guides, you already have Ollama running.
- Open the Continue extension settings file (config.json).
- Look for the "models" section.
- Add DeepSeek as your provider:
{
  "models": [
    {
      "title": "DeepSeek Coder",
      "provider": "ollama",
      "model": "deepseek-coder:6.7b"
    }
  ]
}
Save the file.
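Autocomplete is configured separately from chat. Here is a minimal sketch of the extra block you can add to the same config.json, assuming Continue's tabAutocompleteModel key and the same deepseek-coder:6.7b tag you pulled with Ollama (adjust the tag if you pulled a different size):
{
  "tabAutocompleteModel": {
    "title": "DeepSeek Autocomplete",
    "provider": "ollama",
    "model": "deepseek-coder:6.7b"
  }
}
With both entries in place, the chat sidebar and the inline ghost-text suggestions are served by the same local model.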
Option B: The API Route (Fast & Lightweight)
If your laptop struggles with local inference, use the DeepSeek API. Sign up on the DeepSeek developer portal and generate an API key.
In config.json, set the provider to "deepseek" and paste your key.
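A minimal sketch of what that entry might look like, assuming Continue's "deepseek" provider and its apiKey field, with deepseek-coder used only as an illustrative model name (check the developer portal for the current model list, and never commit your real key):
{
  "models": [
    {
      "title": "DeepSeek API",
      "provider": "deepseek",
      "model": "deepseek-coder",
      "apiKey": "YOUR_DEEPSEEK_API_KEY"
    }
  ]
}
Because this entry can live alongside the Ollama one, you can keep both and switch between them inside the Continue sidebar.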
Step 3: Mastering the Shortcuts
The "Copilot Killer" experience relies on speed. Here are the essential shortcuts to speed up your workflow:
- Autocomplete: Just type. DeepSeek will suggest code in gray ghost text. Hit Tab to accept.
- Chat: Highlight code and press Cmd+L (or Ctrl+L) to pull it into the chat and ask questions like "Explain this bug."
- Edit: Press Cmd+I to instruct the model to rewrite a selection (e.g., "Refactor this function to be async").
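Beyond the built-in shortcuts, Continue lets you script recurring chat prompts as slash commands in the same config.json. A hedged sketch, assuming the customCommands key and its {{{ input }}} placeholder for the highlighted code; the /test name and the prompt wording below are purely illustrative:
{
  "customCommands": [
    {
      "name": "test",
      "description": "Write unit tests for the highlighted code",
      "prompt": "Write a thorough set of unit tests for the following code, covering edge cases: {{{ input }}}"
    }
  ]
}
Highlight a function, open the chat with Cmd+L, and type /test to run it.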
Going Beyond Basic Autocomplete
Standard completion is great for boilerplate. But generic models might struggle with your company's proprietary frameworks or legacy spaghetti code.
Need it to understand your specific repo? Learn to fine-tune your own model in our next guide: Train Your Own Coder: Fine-Tuning DeepSeek on Private Repos.
Conclusion: Zero Cost, Maximum Code
You have now replicated the GitHub Copilot experience for free: your IDE is connected to a state-of-the-art Large Language Model.
You have better privacy, no monthly fees, and the freedom to swap models whenever a better one comes out. Happy coding.
Frequently Asked Questions (FAQ)
How do I use DeepSeek for coding in VS Code?
You need to install an extension that supports custom models. We recommend Continue, but Twinny is another lightweight open-source option for local models.
Is there a truly free alternative to GitHub Copilot?
Yes. By running DeepSeek Coder via Ollama and connecting it to VS Code, you get unlimited, free code completion running entirely on your own hardware.
How do I point Continue at my local DeepSeek model?
You must edit the config.json file in the extension. Set the "provider" to "ollama" and the "model" to "deepseek-coder" (or whichever version you pulled).