
@richrz richrz commented Sep 24, 2025

Summary

Adds an OpenAI-compatible LLM adapter to Codebuff.

Key changes

  • Drop node-fetch; use native fetch (Node 18+).
  • Add timeout + retries with backoff.
  • Support Azure api-version + OpenRouter headers.
  • Pass through tools/tool_choice.
  • Harden SSE streaming parser.
  • Add smoke:llm script.
  • Add .env.example.
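
The timeout + retry behavior above can be sketched roughly as follows. This is an illustrative sketch, not the adapter's actual code: `fetchWithRetry`, `backoffDelayMs`, and the retry/backoff constants are assumed names and values.

```typescript
// Sketch of timeout + retries with exponential backoff over native fetch
// (Node 18+). Names and constants are illustrative, not Codebuff's exports.

/** Exponential backoff with a cap: 500ms, 1s, 2s, ... up to 8s. */
export function backoffDelayMs(attempt: number): number {
  return Math.min(500 * 2 ** attempt, 8000);
}

/** Native fetch with a per-request timeout and retries on 429/5xx. */
export async function fetchWithRetry(
  url: string,
  init: RequestInit = {},
  { retries = 3, timeoutMs = 30_000 } = {}
): Promise<Response> {
  let lastError: unknown;
  for (let attempt = 0; attempt <= retries; attempt++) {
    const controller = new AbortController();
    const timer = setTimeout(() => controller.abort(), timeoutMs);
    try {
      const res = await fetch(url, { ...init, signal: controller.signal });
      // Retry only on rate limits and server errors; return everything else.
      if (res.status === 429 || res.status >= 500) {
        lastError = new Error(`HTTP ${res.status}`);
      } else {
        return res;
      }
    } catch (err) {
      lastError = err; // network error or timeout (AbortError)
    } finally {
      clearTimeout(timer);
    }
    if (attempt < retries) {
      await new Promise((r) => setTimeout(r, backoffDelayMs(attempt)));
    }
  }
  throw lastError;
}
```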

How to test

  1. Copy .env.example to .env and fill in values.
  2. Build:
    pnpm build

    or

    npx sucrase ./backend/src/llm -d ./backend/tmpdist/llm --transforms typescript
  3. Run:
    pnpm smoke:llm
  4. Expected output: "Hi! 👋 How’s your day going?"

Notes

Works with LiteLLM, OpenRouter, Azure OpenAI, or any other OpenAI-compatible API endpoint.
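
The Azure api-version and OpenRouter header support could be wired up along these lines. This is a simplified sketch: the function name, the hardcoded api-version, and the Azure URL shape (which in practice includes a deployment path) are assumptions, not the adapter's actual API.

```typescript
// Hypothetical per-provider request builder. Azure uses an `api-key` header
// plus an api-version query param; OpenRouter recommends attribution headers.
type Provider = "openai" | "azure" | "openrouter";

export function buildRequest(provider: Provider, baseUrl: string, apiKey: string) {
  const headers: Record<string, string> = { "Content-Type": "application/json" };
  let url = `${baseUrl}/chat/completions`;
  if (provider === "azure") {
    headers["api-key"] = apiKey;
    url += "?api-version=2024-02-01"; // assumption: pinned GA version
  } else {
    headers["Authorization"] = `Bearer ${apiKey}`;
    if (provider === "openrouter") {
      headers["HTTP-Referer"] = "https://example.com"; // assumption: your app URL
      headers["X-Title"] = "Codebuff";
    }
  }
  return { url, headers };
}
```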

@richrz richrz marked this pull request as ready for review September 24, 2025 22:00
@richrz richrz changed the title Add OpenAI-compatible adapter and pnpm build support This PR adds OpenAI-compatible LLM support to Codebuff. Sep 24, 2025
@richrz richrz changed the title This PR adds OpenAI-compatible LLM support to Codebuff. feat: add OpenAI-compatible LLM support (LiteLLM, retries, smoke test) Sep 24, 2025