Solutions for common issues with RagCode MCP server.
| Problem | Quick Solution |
|---|---|
| "Could not connect to Qdrant" | `docker start ragcode-qdrant` |
| "Ollama model not found" | `ollama pull phi3:medium && ollama pull mxbai-embed-large` |
| IDE doesn't see RagCode | Re-run `./ragcode-installer -skip-build` |
| Indexing stuck | Check logs: `tail -f ~/.local/share/ragcode/bin/mcp.log` |
Cause: The `file_path` parameter is missing or points outside a recognized project.
Solution: Provide a valid `file_path` inside your project:

```json
{
  "query": "search query",
  "file_path": "/path/to/your/project/file.go"
}
```
Why this happens: RagCode uses `file_path` to detect which workspace you're working in. Without it, it defaults to `/home`, which is not a valid project.
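The detection step can be pictured roughly like this (a minimal sketch, not RagCode's actual code — the marker file names and the `detect_workspace` helper are assumptions for illustration):

```python
import os
from typing import Optional

# Hypothetical project markers; the real detection logic may differ.
MARKERS = {".git", "go.mod", "package.json", "pyproject.toml"}

def detect_workspace(file_path: str) -> Optional[str]:
    """Walk upward from file_path until a directory with a project marker is found."""
    directory = os.path.dirname(os.path.abspath(file_path))
    while True:
        if any(os.path.exists(os.path.join(directory, m)) for m in MARKERS):
            return directory
        parent = os.path.dirname(directory)
        if parent == directory:  # reached the filesystem root: no workspace found
            return None
        directory = parent
```

A `file_path` with no project marker anywhere above it walks all the way to the root and yields no workspace, which matches the failure described above.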
Cause: Docker is not running or the Qdrant container is stopped.
Solution:

```bash
# Start Docker (Linux)
sudo systemctl start docker

# Start Qdrant container
docker start ragcode-qdrant

# Or restart everything
~/.local/share/ragcode/start.sh
```
Verify:

```bash
docker ps | grep qdrant
# Should show: ragcode-qdrant ... Up ...
```
Cause: Required AI models have not been downloaded.
Solution:

```bash
# Download the embedding model
ollama pull mxbai-embed-large

# Download the LLM model
ollama pull phi3:medium

# Verify
ollama list
```
If using Docker Ollama:

```bash
docker exec ragcode-ollama ollama pull mxbai-embed-large
docker exec ragcode-ollama ollama pull phi3:medium
```
Causes: a heavyweight LLM model, or indexing bulky directories such as `vendor` and `node_modules`.
Solutions: switch to a smaller model and exclude bulky directories:
```yaml
# In config.yaml
llm:
  model: "phi3:mini"  # Instead of phi3:medium

workspace:
  exclude_patterns:
    - "vendor"
    - "node_modules"
    - ".git"
    - "dist"
    - "build"
    - "*.min.js"
    - "*.bundle.js"
```
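These patterns behave like shell globs matched against path segments. A sketch of how such filtering is commonly implemented (illustrative only — RagCode's actual matching semantics may differ, and `is_excluded` is a hypothetical helper):

```python
import fnmatch
import os

EXCLUDE_PATTERNS = ["vendor", "node_modules", ".git", "dist", "build", "*.min.js", "*.bundle.js"]

def is_excluded(rel_path: str, patterns=EXCLUDE_PATTERNS) -> bool:
    """True if any path segment (directory or file name) matches an exclude glob."""
    return any(
        fnmatch.fnmatch(part, pattern)
        for part in rel_path.split(os.sep)
        for pattern in patterns
    )
```

Under this reading, `vendor/lib/util.go` and `web/app.min.js` are skipped while `cmd/main.go` is indexed.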
```bash
tail -f ~/.local/share/ragcode/bin/mcp.log
```

Cause: The MCP configuration file is missing or incorrect.
Solution 1: Re-run the installer:

```bash
~/.local/share/ragcode/bin/ragcode-installer -skip-build -ollama=local -qdrant=docker
```
Solution 2: Manual configuration
Check that your IDE's config file exists and has the correct content:
| IDE | Config Path |
|---|---|
| Windsurf | `~/.codeium/windsurf/mcp_config.json` |
| Cursor | `~/.cursor/mcp.config.json` |
| VS Code | `~/.config/Code/User/globalStorage/mcp-servers.json` |
| Claude Desktop | `~/.config/Claude/mcp-servers.json` |
See IDE-SETUP.md for complete configuration examples.
Cause: The Ollama embedding model is not responding correctly.
Solution:

```bash
# Restart Ollama
docker restart ragcode-ollama
# or
systemctl restart ollama

# Test the embedding model
curl http://localhost:11434/api/embeddings -d '{
  "model": "mxbai-embed-large",
  "prompt": "test"
}'
```
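A healthy response contains an `embedding` array of floats. A quick sanity check on the JSON the curl call returns (a sketch; the 1024-dimension figure for mxbai-embed-large is an assumption — adjust it for your model):

```python
import json

def check_embedding_response(raw: str, expected_dim: int = 1024) -> bool:
    """Return True if an embeddings response carries a numeric vector of the expected size."""
    vector = json.loads(raw).get("embedding")
    return (
        isinstance(vector, list)
        and len(vector) == expected_dim
        and all(isinstance(x, (int, float)) for x in vector)
    )
```

An error payload (e.g. `{"error": "model not found"}`) fails the check, which usually means the model still needs to be pulled.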
Cause: Large models or multiple workspaces indexed.
Solutions:
- Use `phi3:mini` instead of `phi3:medium`
- Use `all-minilm` instead of `mxbai-embed-large`
- Add swap space:

```bash
sudo fallocate -l 8G /swapfile
sudo chmod 600 /swapfile
sudo mkswap /swapfile
sudo swapon /swapfile
```
Cause: User not in docker group.
Solution (Linux):

```bash
sudo usermod -aG docker $USER
# Log out and log back in
```
Solution (macOS): Start Docker Desktop and wait until it reports it is running.
Causes:
Solutions:
```bash
# Look for the collection in Qdrant
curl http://localhost:6333/collections | jq
```
- Re-index with the `index_workspace` tool, passing the `file_path` parameter
- Try `search_code` before `hybrid_search`

Problem: Windows IDE can't find the WSL binary.
Solution: Use a `wsl.exe` wrapper in the config:

```json
{
  "mcpServers": {
    "ragcode": {
      "command": "wsl.exe",
      "args": ["-e", "/home/YOUR_USERNAME/.local/share/ragcode/bin/rag-code-mcp"]
    }
  }
}
```
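A server entry that never shows up is often just invalid JSON or a missing `command` field. A quick check you could run against any of the config files listed earlier (a generic sketch, not a RagCode tool; `validate_mcp_config` is a hypothetical helper name):

```python
import json

def validate_mcp_config(path: str) -> list:
    """Parse an MCP config file and report obvious structural problems."""
    try:
        with open(path) as f:
            config = json.load(f)
    except (OSError, json.JSONDecodeError) as exc:
        return [f"cannot read or parse {path}: {exc}"]
    servers = config.get("mcpServers")
    if not isinstance(servers, dict) or not servers:
        return ['missing or empty "mcpServers" object']
    return [
        f'server "{name}" has no "command"'
        for name, entry in servers.items()
        if not isinstance(entry, dict) or "command" not in entry
    ]
```

An empty list means the file at least parses and every server declares a command; anything else points at the exact entry to fix.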
Problem: "Cannot connect to Docker daemon"
Solution: Start the Docker daemon (on Linux: `sudo systemctl start docker`; on macOS/Windows, launch Docker Desktop), then run the checks below.
```bash
# Check RagCode version
~/.local/share/ragcode/bin/rag-code-mcp --version

# Health check
~/.local/share/ragcode/bin/rag-code-mcp --health

# Check Docker containers
docker ps | grep ragcode

# Check Ollama models
ollama list

# Check Qdrant collections
curl http://localhost:6333/collections | jq

# View logs
tail -100 ~/.local/share/ragcode/bin/mcp.log

# Test Ollama connection
curl http://localhost:11434/api/tags

# Test Qdrant connection
curl http://localhost:6333/collections
```
If your issue isn't listed here:

```bash
tail -f ~/.local/share/ragcode/bin/mcp.log
```