Claude Code Logging with LiteLLM
LiteLLM is a proxy that routes Claude Code requests to Anthropic while capturing detailed logs of all conversations.
Quick Start
- LiteLLM runs in Docker with JSON logging enabled
- Claude Code points to the proxy
- Docker logs capture all requests/responses
1. Create LiteLLM Config
Create a config.yaml file:
```yaml
model_list:
  - model_name: "anthropic/*"
    litellm_params:
      model: "anthropic/*"

litellm_settings:
  json_logs: true
  log_raw_request_response: true

general_settings:
  master_key: sk-1234
```

The wildcard pattern routes all Claude models automatically. The master key can be any value you choose; Claude Code will present it as its auth token.
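The anthropic/* route works like a shell-style glob: any model name under the anthropic/ prefix is forwarded to the same upstream. A tiny stdlib illustration of the matching idea (not LiteLLM's actual implementation; the model names are just examples):

```python
from fnmatch import fnmatch

# The single wildcard route from config.yaml (illustration only;
# LiteLLM's real matching logic lives inside the proxy).
route = "anthropic/*"

requested = [
    "anthropic/claude-sonnet-4-20250514",
    "anthropic/claude-3-5-haiku-20241022",
]
for model in requested:
    assert fnmatch(model, route)        # every Claude model matches the one route
assert not fnmatch("openai/gpt-4o", route)  # other providers do not
```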
2. Run LiteLLM in Docker
```shell
docker run -d \
  --name litellm \
  -p 4000:4000 \
  -v $(pwd)/config.yaml:/app/config.yaml \
  -e ANTHROPIC_API_KEY=$ANTHROPIC_API_KEY \
  -e JSON_LOGS=True \
  --log-driver json-file \
  ghcr.io/berriai/litellm:main-latest \
  --config /app/config.yaml
```

3. Configure Claude Code
Set these environment variables before running Claude Code:
```shell
export ANTHROPIC_BASE_URL=http://localhost:4000
export ANTHROPIC_AUTH_TOKEN=sk-1234
```

4. Extract Logs
Export Docker logs to a JSONL file:
```shell
docker logs litellm > litellm-logs.jsonl
```

Viewing in Hyperparam
Open the JSONL file in Hyperparam to analyze conversations, inspect tool calls, and review model responses.
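To spot-check the export before opening it in Hyperparam, a short Python sketch can filter the file down to well-formed JSON entries. Docker may interleave non-JSON startup output with LiteLLM's JSON log lines, and the exact field names (message, level, etc.) vary by LiteLLM version, so treat them as assumptions and inspect a few raw lines first:

```python
import json

def load_entries(lines):
    """Keep only lines that parse as JSON, skipping any
    non-JSON banner or startup output from the container."""
    entries = []
    for line in lines:
        line = line.strip()
        if not line:
            continue
        try:
            entries.append(json.loads(line))
        except json.JSONDecodeError:
            pass  # not a JSON log line; skip it
    return entries

# Usage against the exported file:
#   with open("litellm-logs.jsonl") as f:
#       entries = load_entries(f)
sample = [
    'LiteLLM: Proxy initialized',  # hypothetical non-JSON banner line
    '{"message": "request routed to claude-sonnet-4", "level": "INFO"}',
]
entries = load_entries(sample)
print(len(entries))  # 1
```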