The "free-claude-code" repository provides a proxy that lets users run Claude Code, Anthropic's coding assistant, for free through several interfaces: the terminal (CLI), the VS Code extension, JetBrains ACP, and chat bots (Discord and Telegram). Its primary purpose is to route Claude Code's Anthropic Messages API traffic to a range of alternative model providers, including NVIDIA NIM, OpenRouter, DeepSeek, LM Studio, llama.cpp, and Ollama, allowing users to choose between free, paid, or local models without changing the client-side protocol.
The proxy acts as a drop-in replacement for the Anthropic API, maintaining compatibility with Claude Code's expected endpoints and behaviors. It supports per-model routing, enabling users to send requests for different Claude model tiers (Opus, Sonnet, Haiku) to distinct providers. The proxy exposes the `/v1/models` endpoint, allowing native model selection within Claude Code, and handles advanced features such as streaming responses, tool use, reasoning/thinking block management, and local request optimizations to reduce latency and conserve API quotas.
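Per-tier routing can be pictured as a small lookup from Claude model names to provider endpoints. The sketch below is illustrative only: the tier names match Claude Code's model tiers, but the provider assignments, URLs, and function names are assumptions, not the repository's actual configuration schema.

```python
# Hypothetical per-tier routing table: each Claude tier maps to a
# provider and its OpenAI-compatible base URL. The assignments and
# URLs are illustrative assumptions.
ROUTES = {
    "opus": {"provider": "openrouter", "base_url": "https://openrouter.ai/api/v1"},
    "sonnet": {"provider": "nvidia_nim", "base_url": "https://integrate.api.nvidia.com/v1"},
    "haiku": {"provider": "ollama", "base_url": "http://localhost:11434/v1"},
}

def resolve_route(model: str) -> dict:
    """Pick a provider route by matching a tier substring in the model name."""
    for tier, route in ROUTES.items():
        if tier in model:
            return route
    raise ValueError(f"no route configured for model {model!r}")
```

Matching on a tier substring rather than the full model name keeps the table stable across dated model revisions (e.g. any `*sonnet*` model resolves to the same provider).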
Configuration is straightforward: users clone the repository, set up Python 3.14 and the required dependencies, copy and edit the `.env` file to specify their chosen provider and model, and launch the proxy server. The proxy can be integrated with Claude Code CLI, VS Code, and JetBrains ACP by setting environment variables to point to the local proxy. The repository provides detailed instructions for configuring each provider, including obtaining API keys and setting base URLs, and supports mixing providers by model tier for maximum flexibility.
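Pointing a client at the proxy typically comes down to overriding the Anthropic base URL in the environment. A minimal sketch follows; the port number and key value are assumptions, so check the repository's `.env` example for the actual defaults.

```shell
# Route Claude Code's API calls to the local proxy instead of api.anthropic.com.
export ANTHROPIC_BASE_URL="http://localhost:8082"   # assumed proxy port
export ANTHROPIC_API_KEY="dummy-key"                # the proxy holds the real provider keys
# then launch the CLI as usual:
# claude
```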
Optional integrations include Discord and Telegram bot wrappers, which allow remote coding sessions with streaming progress, conversation branching, and task management. Voice-note transcription is also supported via local Whisper or NVIDIA NIM, enabling voice interactions in chat platforms. The repository offers a comprehensive configuration reference, covering model routing, provider keys and URLs, rate limits, timeouts, security, diagnostics, and web tool settings.
Troubleshooting guidance addresses common issues such as malformed responses, provider errors, and tool call compatibility. The proxy normalizes responses from various providers to match Claude Code's expectations, translating between OpenAI chat streaming and Anthropic SSE where necessary, and optimizing trivial probes locally.
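The translation between OpenAI chat streaming and Anthropic SSE can be sketched for the simplest case, a text delta. This is a simplified illustration of the kind of conversion such a proxy performs, not the repository's actual adapter, which would also handle tool calls, thinking blocks, and the start/stop events that frame a stream.

```python
import json

def openai_chunk_to_anthropic_sse(chunk: dict, index: int = 0) -> str:
    """Translate one OpenAI chat-completion streaming chunk into an
    Anthropic-style content_block_delta SSE event (text deltas only)."""
    # OpenAI streams incremental text under choices[0].delta.content.
    text = chunk["choices"][0]["delta"].get("content", "")
    # Anthropic SSE wraps the same text in a typed delta event.
    event = {
        "type": "content_block_delta",
        "index": index,
        "delta": {"type": "text_delta", "text": text},
    }
    return f"event: content_block_delta\ndata: {json.dumps(event)}\n\n"
```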
The project is structured with modular components: FastAPI routes and service layers, shared protocol helpers, provider transports and rate limiting, messaging adapters, CLI entry points, configuration management, and tests. Development tools and CI checks ensure code quality, and the repository is designed for extensibility, allowing new providers and messaging platforms to be added with minimal changes.
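The extensibility claim suggests a plug-in shape for providers: a small transport interface that each backend implements, plus a registry the routing layer consults. The class, method, and registry names below are illustrative assumptions rather than the repository's actual API.

```python
from abc import ABC, abstractmethod

class ProviderTransport(ABC):
    """Hypothetical minimal interface a model provider backend implements."""

    @abstractmethod
    def complete(self, messages: list[dict], model: str) -> str:
        """Send a chat request to the provider and return the assistant's text."""

class EchoTransport(ProviderTransport):
    """Toy provider used here only to show the plug-in shape."""

    def complete(self, messages: list[dict], model: str) -> str:
        # Echo the last user message, tagged with the requested model.
        return f"[{model}] " + messages[-1]["content"]

# Adding a new provider would mean implementing the interface and
# registering it under a name the routing configuration can reference.
REGISTRY: dict[str, ProviderTransport] = {"echo": EchoTransport()}
```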
Contributors are encouraged to report issues, keep changes focused and tested, and follow the project's guidelines. The repository is MIT-licensed, promoting open and collaborative development. Overall, "free-claude-code" lets users run Claude Code across multiple environments and providers, offering a flexible, cost-effective, and feature-rich alternative to direct Anthropic API usage.