Claude Code Logging with LiteLLM

LiteLLM is a proxy server that sits between Claude Code and the Anthropic API, forwarding requests unchanged while capturing detailed logs of every conversation.

Quick Start

  1. LiteLLM runs in Docker with JSON logging enabled
  2. Claude Code points to the proxy
  3. Docker logs capture all requests/responses

1. Create LiteLLM Config

Create a config.yaml file:

model_list:
  - model_name: "anthropic/*"
    litellm_params:
      model: "anthropic/*"

litellm_settings:
  json_logs: true
  log_raw_request_response: true

general_settings:
  master_key: sk-1234

The wildcard pattern routes all Claude models automatically. The master key can be any value you choose (LiteLLM expects keys to start with sk-); Claude Code must present the same value later as its auth token.
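The wildcard matching can be illustrated with glob-style patterns. This is a sketch of the idea, not LiteLLM's internal router:

```python
from fnmatch import fnmatch

# Illustration only: LiteLLM's router is more involved, but the
# "anthropic/*" pattern behaves like a glob match on the model name.
PATTERNS = ["anthropic/*"]

def is_routed(model_name: str) -> bool:
    """Return True if the requested model matches a configured pattern."""
    return any(fnmatch(model_name, pattern) for pattern in PATTERNS)

print(is_routed("anthropic/claude-sonnet-4-20250514"))  # True
print(is_routed("openai/gpt-4o"))                       # False
```

Any model name Claude Code sends under the anthropic/ prefix is forwarded without needing a per-model config entry.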

2. Run LiteLLM in Docker

docker run -d \
  --name litellm \
  -p 4000:4000 \
  -v $(pwd)/config.yaml:/app/config.yaml \
  -e ANTHROPIC_API_KEY=$ANTHROPIC_API_KEY \
  -e JSON_LOGS=True \
  --log-driver json-file \
  ghcr.io/berriai/litellm:main-latest \
  --config /app/config.yaml
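If you prefer Docker Compose, the same container can be described declaratively. This is a sketch equivalent to the docker run command above; the service name is an arbitrary choice:

```yaml
services:
  litellm:
    image: ghcr.io/berriai/litellm:main-latest
    command: ["--config", "/app/config.yaml"]
    ports:
      - "4000:4000"
    volumes:
      - ./config.yaml:/app/config.yaml
    environment:
      - ANTHROPIC_API_KEY=${ANTHROPIC_API_KEY}
      - JSON_LOGS=True
    logging:
      driver: json-file
```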

3. Configure Claude Code

Set these environment variables before running Claude Code:

export ANTHROPIC_BASE_URL=http://localhost:4000
export ANTHROPIC_AUTH_TOKEN=sk-1234

4. Extract Logs

Export the Docker logs to a JSONL file (the 2>&1 also captures records the container writes to stderr):

docker logs litellm > litellm-logs.jsonl 2>&1
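The exported file can interleave plain-text startup messages with the JSON records, so it may help to keep only the lines that parse as JSON objects. A minimal sketch (the sample record shape is illustrative, not LiteLLM's exact schema):

```python
import json

def extract_json_records(raw: str):
    """Keep only lines that parse as JSON objects, dropping plain-text
    startup or diagnostic output mixed into the captured log."""
    records = []
    for line in raw.splitlines():
        line = line.strip()
        if not line.startswith("{"):
            continue
        try:
            obj = json.loads(line)
        except json.JSONDecodeError:
            continue
        if isinstance(obj, dict):
            records.append(obj)
    return records

# Sample of what a captured log might contain:
sample = (
    "LiteLLM: Proxy initialized\n"
    '{"message": "POST /v1/messages", "level": "INFO"}\n'
)
print(len(extract_json_records(sample)))  # 1
```

Running extract_json_records over the contents of litellm-logs.jsonl yields a clean list of records ready for analysis.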

Viewing in Hyperparam

Open the JSONL file in Hyperparam to analyze conversations, inspect tool calls, and review model responses.
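A quick summary can also be produced directly in Python. The "model" field below is an assumption about the record schema; inspect your own logs and adjust the key names accordingly:

```python
import json
from collections import Counter

def count_by_model(lines):
    """Tally log records by their "model" field (a hypothetical key --
    check your logs for the real field names)."""
    counts = Counter()
    for line in lines:
        try:
            record = json.loads(line)
        except json.JSONDecodeError:
            continue
        if isinstance(record, dict) and "model" in record:
            counts[record["model"]] += 1
    return counts

sample_lines = [
    '{"model": "anthropic/claude-sonnet-4", "message": "request"}',
    '{"model": "anthropic/claude-sonnet-4", "message": "response"}',
    "non-json startup line",
]
print(count_by_model(sample_lines))
```

In practice you would pass open("litellm-logs.jsonl") as the lines argument.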
