Alishahryar1/free-claude-code

Use Claude Code for free in the terminal, in the VSCode extension, or via Discord (like openclaw)

🤖 Free Claude Code

Use Claude Code CLI & VSCode for free. No Anthropic API key required.

A lightweight proxy that routes Claude Code's Anthropic API calls to NVIDIA NIM (40 req/min free), OpenRouter (hundreds of models), DeepSeek (direct API), LM Studio (fully local), or llama.cpp (local with Anthropic endpoints).
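At its core, a proxy like this accepts Anthropic Messages API request bodies and re-emits them in the OpenAI-style chat-completions shape that NIM, OpenRouter, DeepSeek, and LM Studio all speak. The sketch below shows that translation for the simplest case; it is illustrative only, not the repo's actual code, and the model name is a hypothetical placeholder (the real proxy also handles tools, streaming, and thinking blocks).

```python
def anthropic_to_openai(payload: dict) -> dict:
    """Translate an Anthropic /v1/messages body into an OpenAI-style
    /v1/chat/completions body. Minimal sketch, text-only messages."""
    messages = []
    # Anthropic carries the system prompt as a top-level field;
    # OpenAI-style APIs expect it as the first message.
    if payload.get("system"):
        messages.append({"role": "system", "content": payload["system"]})
    for msg in payload.get("messages", []):
        content = msg["content"]
        # Anthropic content may be a list of typed blocks; flatten the text ones.
        if isinstance(content, list):
            content = "".join(
                b.get("text", "") for b in content if b.get("type") == "text"
            )
        messages.append({"role": msg["role"], "content": content})
    return {
        "model": "deepseek-ai/deepseek-v3",  # hypothetical mapping target
        "messages": messages,
        "max_tokens": payload.get("max_tokens", 1024),
    }
```

Because the translation is a pure function of the request body, it can sit behind any HTTP server that impersonates the Anthropic endpoint Claude Code is pointed at.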

Quick Start · Providers · Discord Bot · Configuration · Development · Contributing

Claude Code running via NVIDIA NIM, completely free

Features

| Feature | Description |
| --- | --- |
| Zero Cost | 40 req/min free on NVIDIA NIM. Free models on OpenRouter. Fully local with LM Studio |
| Drop-in Replacement | Set 2 env vars. No modifications to Claude Code CLI or VSCode extension needed |
| 5 Providers | NVIDIA NIM, OpenRouter, DeepSeek, LM Studio (local), llama.cpp (llama-server) |
| Per-Model Mapping | Route Opus / Sonnet / Haiku to different models and providers. Mix providers freely |
| Thinking Token Support | Parses inline thinking tags and `reasoning_content` into native Claude thinking blocks |
| Heuristic Tool Parser | Models outputting tool calls as text are auto-parsed into structured tool use |
| Request Optimization | 5 categories of trivial API calls intercepted locally, saving quota and latency |
| Smart Rate Limiting | Proactive rolling-window throttle + reactive 429 exponential backoff + optional concurrency cap |
| Discord / Telegram Bot | Remote autonomous coding with tree-based threading, session persistence, and live progress |
| Subagent Control | Task tool interception forces `run_in_background=False`. No runaway subagents |
| Extensible | Clean `BaseProvider` and `MessagingPlatform` ABCs. Add new providers or platforms easily |
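The "Smart Rate Limiting" row combines a proactive rolling window with reactive backoff. A rolling-window throttle can be sketched in a few lines: keep timestamps of recent requests, evict those older than the window, and sleep until the oldest in-window request ages out. This is a minimal sketch of the idea, assuming a limit like NIM's free 40 req/min; it is not the repo's implementation, which also layers 429 exponential backoff and a concurrency cap on top.

```python
import time
from collections import deque


class RollingWindowLimiter:
    """Allow at most `limit` requests per `window` seconds (sketch)."""

    def __init__(self, limit: int = 40, window: float = 60.0):
        self.limit = limit
        self.window = window
        self.stamps = deque()  # monotonic timestamps of recent requests

    def wait_time(self, now=None) -> float:
        """Seconds to sleep before the next request is allowed (0 if free)."""
        now = time.monotonic() if now is None else now
        # Drop timestamps that have aged out of the rolling window.
        while self.stamps and now - self.stamps[0] >= self.window:
            self.stamps.popleft()
        if len(self.stamps) < self.limit:
            return 0.0
        # The oldest in-window request must expire before a slot frees up.
        return self.window - (now - self.stamps[0])

    def acquire(self) -> None:
        """Block until a slot is available, then record the request."""
        delay = self.wait_time()
        if delay > 0:
            time.sleep(delay)
        self.stamps.append(time.monotonic())
```

Being proactive, this never sends a request the provider would reject, so the reactive 429 backoff only fires when the provider's accounting disagrees with the client's.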

Quick Start

Prerequisites

  1. Get an API key (or use LM Studio / llama.cpp locally):
  2. Install [Claude Code](https: