44 changes: 6 additions & 38 deletions .env.example
@@ -1,40 +1,8 @@
 # AI API Keys
-CLAUDE_CODE_KEY=dummy_claude_code_key
-OPEN_ROUTER_API_KEY=dummy_openrouter_key
+# Base URL for any OpenAI-compatible service (e.g. LiteLLM proxy, OpenRouter, Azure OpenAI, etc.)
+LLM_BASE_URL=http://127.0.0.1:4000
 
-# Database & Server
-DATABASE_URL=postgresql://manicode_user_local:secretpassword_local@localhost:5432/manicode_db_local
-PORT=4242
-GOOGLE_CLOUD_PROJECT_ID=dummy_project_id
+# API key for the service. For LiteLLM proxy you can use any string (e.g. sk-test).
+LLM_API_KEY=sk-1234
-
-# Authentication
-CODEBUFF_GITHUB_ID=dummy_github_id
-CODEBUFF_GITHUB_SECRET=dummy_github_secret
-NEXTAUTH_SECRET=dummy_nextauth_secret_at_least_32_chars_long
-API_KEY_ENCRYPTION_SECRET=dummy_encryption_secret_32_chars
-
-# Payment (Stripe)
-STRIPE_SECRET_KEY=sk_test_dummy_stripe_secret
-STRIPE_WEBHOOK_SECRET_KEY=whsec_dummy_webhook_secret
-STRIPE_USAGE_PRICE_ID=price_dummy_usage_id
-STRIPE_TEAM_FEE_PRICE_ID=price_dummy_team_fee_id
-
-# External Services
-RELACE_API_KEY=dummy_relace_key
-LINKUP_API_KEY=dummy_linkup_key
-LOOPS_API_KEY=dummy_loops_key
-
-# Discord Integration
-DISCORD_PUBLIC_KEY=dummy_discord_public_key
-DISCORD_BOT_TOKEN=dummy_discord_bot_token
-DISCORD_APPLICATION_ID=dummy_discord_app_id
-
-# Frontend/Public Variables
-NEXT_PUBLIC_CB_ENVIRONMENT=dev
-NEXT_PUBLIC_CODEBUFF_APP_URL=http://localhost:3000
-NEXT_PUBLIC_CODEBUFF_BACKEND_URL=localhost:4242
-[email protected]
-NEXT_PUBLIC_POSTHOG_API_KEY=phc_dummy_posthog_key
-NEXT_PUBLIC_POSTHOG_HOST_URL=https://us.i.posthog.com
-NEXT_PUBLIC_STRIPE_PUBLISHABLE_KEY=pk_test_dummy_publishable
-NEXT_PUBLIC_STRIPE_CUSTOMER_PORTAL=https://billing.stripe.com/p/login/test_dummy
+# Model ID exposed by your service (must match what /v1/models returns).
+LLM_MODEL=gpt-5-chat-latest
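
Since the new `LLM_MODEL` comment says the value must match what `/v1/models` returns, a quick sanity check against the configured endpoint can catch mismatches early. Below is a minimal TypeScript sketch (not part of this diff), assuming the proxy exposes the standard OpenAI `GET /v1/models` route and using the `node-fetch` dependency this PR adds to `backend/package.json`:

```ts
// Sanity-check sketch: verify the configured LLM_MODEL is actually served
// by the OpenAI-compatible endpoint at LLM_BASE_URL.
import fetch from 'node-fetch'

async function assertModelAvailable(): Promise<void> {
  const baseUrl = process.env.LLM_BASE_URL ?? 'http://127.0.0.1:4000'
  const res = await fetch(`${baseUrl}/v1/models`, {
    headers: { Authorization: `Bearer ${process.env.LLM_API_KEY ?? ''}` },
  })
  if (!res.ok) {
    throw new Error(`GET ${baseUrl}/v1/models failed with status ${res.status}`)
  }

  // Standard OpenAI-compatible response shape: { data: [{ id: "..." }, ...] }
  const body = (await res.json()) as { data: Array<{ id: string }> }
  const ids = body.data.map((m) => m.id)
  if (!ids.includes(process.env.LLM_MODEL ?? '')) {
    throw new Error(`LLM_MODEL is not listed by the endpoint. Available: ${ids.join(', ')}`)
  }
}

assertModelAvailable().catch((err) => {
  console.error(err)
  process.exit(1)
})
```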
12 changes: 12 additions & 0 deletions README.md
@@ -142,6 +142,18 @@ Learn more about the SDK [here](https://www.npmjs.com/package/@codebuff/sdk).

**Fully customizable SDK**: Build Codebuff's capabilities directly into your applications with a complete TypeScript SDK. Create custom tools, integrate with your CI/CD pipeline, build AI-powered development environments, or embed intelligent coding assistance into your products.

## OpenAI-Compatible Endpoints

Codebuff can talk to any OpenAI-compatible proxy, such as LiteLLM, OpenRouter, or Ollama via its OpenAI-compatible API. Configure the target endpoint with these environment variables (see [.env.example](./.env.example)):

```bash
LLM_BASE_URL=http://127.0.0.1:4000
LLM_API_KEY=sk-... # token provided by your proxy
LLM_MODEL=gpt-4o-mini # or the route/model exposed by the proxy
```

Point the variables at a new provider and restart the server/CLI to switch models—no code changes required.
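
As a rough illustration of how these variables could be consumed (a minimal sketch, assuming the `openai` npm client that `backend/package.json` already depends on, not the actual Codebuff wiring):

```ts
// Route a chat completion through whatever OpenAI-compatible endpoint
// the LLM_* environment variables point at.
import OpenAI from 'openai'

const client = new OpenAI({
  baseURL: process.env.LLM_BASE_URL, // e.g. http://127.0.0.1:4000 for a local LiteLLM proxy
  apiKey: process.env.LLM_API_KEY,   // whatever token the proxy expects
})

async function main() {
  const completion = await client.chat.completions.create({
    model: process.env.LLM_MODEL ?? 'gpt-4o-mini',
    messages: [{ role: 'user', content: 'ping' }],
  })
  console.log(completion.choices[0]?.message?.content)
}

main().catch(console.error)
```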

## Contributing to Codebuff

We ❤️ contributions from the community - whether you're fixing bugs, tweaking our agents, or improving documentation.
1 change: 1 addition & 0 deletions backend/package.json
@@ -40,6 +40,7 @@
"gpt-tokenizer": "2.8.1",
"ignore": "5.3.2",
"lodash": "*",
"node-fetch": "^3.3.2",
"openai": "^4.78.1",
"pino": "9.4.0",
"postgres": "3.4.4",
1 change: 1 addition & 0 deletions backend/src/index.ts
@@ -1,3 +1,4 @@
import 'dotenv/config'
import http from 'http'

import { setupBigQuery } from '@codebuff/bigquery'