fix(llm): use non-streaming Gemini Pro requests#45

Merged
1bcMax merged 1 commit into BlockRunAI:main from 0xCheetah1:fix/gemini-thinking-budget-name
May 7, 2026

Conversation

@0xCheetah1
Contributor

Summary

  • Force Gemini Pro reasoning models through the non-streaming /v1/messages path so their thinking budget is preserved by the gateway.
  • Convert non-streaming Gemini responses back into Franklin's internal stream chunks so the rest of the agent loop remains unchanged.
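The conversion described above can be sketched roughly as follows. This is a minimal illustration, not Franklin's actual code: the `MessagesResponse` and `StreamChunk` shapes, the `toStreamChunks` name, and the chunk type strings are all assumptions modeled on typical `/v1/messages`-style APIs.

```typescript
// Hypothetical shapes -- Franklin's real internal types may differ.
interface ContentBlock {
  type: "text" | "thinking";
  text?: string;
  thinking?: string;
}

interface MessagesResponse {
  id: string;
  model: string;
  content: ContentBlock[];
  stop_reason: string | null;
}

type StreamChunk =
  | { type: "message_start"; message: { id: string; model: string } }
  | { type: "content_block_delta"; index: number; delta: { type: string; text: string } }
  | { type: "message_stop" };

// Convert one complete non-streaming response into the chunk sequence
// the agent loop would otherwise receive from a streaming request.
function toStreamChunks(res: MessagesResponse): StreamChunk[] {
  const chunks: StreamChunk[] = [
    { type: "message_start", message: { id: res.id, model: res.model } },
  ];
  res.content.forEach((block, index) => {
    const isThinking = block.type === "thinking";
    chunks.push({
      type: "content_block_delta",
      index,
      // Thinking blocks become thinking deltas so reasoning output is preserved.
      delta: {
        type: isThinking ? "thinking_delta" : "text_delta",
        text: (isThinking ? block.thinking : block.text) ?? "",
      },
    });
  });
  chunks.push({ type: "message_stop" });
  return chunks;
}
```

Emitting each content block as a single delta keeps the downstream consumer unchanged; it cannot tell whether the chunks came from a real stream or from one buffered response.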

Verification

  • npm run build
  • node dist/index.js --model google/gemini-2.5-pro --prompt "say ok"
  • node dist/index.js --model google/gemini-3.1-pro --prompt "say ok"
  • node dist/index.js --resume session-2026-05-07T10-51-11-210Z-fe4ab9e0 --prompt "hey"

@1bcMax 1bcMax merged commit 8b349fd into BlockRunAI:main May 7, 2026
2 checks passed