Use Local Ollama Devstral Model With Zed

I started playing with Zed, an editor for programming with AI agent support. Since it allows us to run local models via Ollama, I played around with it; here are the steps to reproduce the setup on macOS:

brew install ollama          # install the Ollama CLI and server
brew services start ollama   # run the server in the background
ollama pull devstral         # download the Devstral model
ollama run devstral          # optional: open an interactive chat session
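
To confirm the pull succeeded, you can list the models available locally:

ollama list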

# quick sanity check that the API responds
curl http://localhost:11434/api/generate -d '{
  "model": "devstral",
  "prompt": "Generate a REST API endpoint in Flask",
  "stream": false
}'
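
With "stream": false the API returns a single JSON object, and the generated text sits in its response field. If you have jq installed, you can extract just that part:

curl -s http://localhost:11434/api/generate -d '{
  "model": "devstral",
  "prompt": "Generate a REST API endpoint in Flask",
  "stream": false
}' | jq -r '.response'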

Once in Zed, it auto-detects the running Ollama instance and the models it serves. Here’s the config block it added to its ~/.config/zed/settings.json:

{
  "agent": {
    "model_parameters": [],
    "default_model": {
      "provider": "ollama",
      "model": "devstral:latest"
    },
    "version": "2"
  }
}
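
Zed also lets you tweak the Ollama provider itself, e.g. if the server listens on a non-default address or you want a larger context window. A sketch of what that looks like in the same settings.json, based on the api_url and available_models keys Zed documents for the Ollama provider at the time of writing (treat the max_tokens value as an example, not a recommendation):

{
  "language_models": {
    "ollama": {
      "api_url": "http://localhost:11434",
      "available_models": [
        {
          "name": "devstral:latest",
          "display_name": "Devstral (local)",
          "max_tokens": 32768
        }
      ]
    }
  }
}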

As of June 2025 the agentic part does not work well (e.g. creating files and directories); I’ve seen colleagues get better results with paid LLM backends in Zed. But inline help with code blocks or translating from programming language A to B worked fine. And since I am not willing to send my code to the cloud, I am happy with my current setup.