
Conversation

@daniel-lxs (Member) commented Jan 29, 2026

Summary

Migrate the Mistral provider from using the direct @mistralai/mistralai SDK to the AI SDK's dedicated @ai-sdk/mistral package.

Changes

1. src/api/providers/mistral.ts

  • Replaces the direct @mistralai/mistralai SDK with @ai-sdk/mistral
  • Uses createMistral to create the provider instance
  • Uses streamText and generateText from AI SDK v6 for completions (a rough sketch follows the Changes list below)
  • Leverages shared AI SDK utilities for consistent message/tool handling
  • Maintains support for Codestral models with custom base URL
  • Follows the same pattern as the Cerebras handler

2. src/package.json

  • Added @ai-sdk/mistral: ^3.0.0 to dependencies (v3.x required for AI SDK v6 compatibility)

3. src/api/providers/__tests__/mistral.spec.ts

  • Complete test suite with 23 tests passing
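
For readers unfamiliar with the AI SDK wiring, here is a rough, self-contained sketch of the pattern the handler now follows. It is illustrative only: the model id, the Codestral base URL, and the default temperature are assumptions for the example, and the real handler additionally routes messages, tools, and stream parts through the shared AI SDK utilities (such as processAiSdkStreamPart) mentioned above.

import { createMistral } from "@ai-sdk/mistral"
import { streamText } from "ai"

// Illustrative values; the real handler reads these from provider options.
const modelId = "codestral-latest"
const apiKey = process.env.MISTRAL_API_KEY ?? ""

// Codestral models are assumed to be served from their own endpoint;
// every other Mistral model falls back to the SDK's default base URL.
const baseURL = modelId.startsWith("codestral") ? "https://codestral.mistral.ai/v1" : undefined

const mistral = createMistral({ apiKey, baseURL })

async function main() {
    // Streaming completion via AI SDK v6's streamText.
    const result = streamText({
        model: mistral(modelId),
        messages: [{ role: "user", content: "Write a one-line haiku about code review." }],
        temperature: 1, // assumed Mistral default; see the review discussion below
    })

    // The handler iterates result.fullStream and converts each part into Roo
    // chunks; for a plain script the text stream is enough.
    for await (const textPart of result.textStream) {
        process.stdout.write(textPart)
    }

    // Usage metrics become available once the stream has finished.
    console.log("\nusage:", await result.usage)
}

main().catch(console.error)

In the actual handler, the same streamText result is consumed through result.fullStream so reasoning and usage events can be surfaced as well, as shown in the review excerpt further down.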

Testing

  • All unit tests pass (5325 passed)
  • TypeScript compilation successful

Linear

Resolves EXT-692


Important

Migrates Mistral provider to AI SDK, updating dependencies and tests for compatibility.

  • Behavior:
    • Migrates Mistral provider from @mistralai/mistralai to @ai-sdk/mistral in mistral.ts.
    • Uses createMistral, streamText, and generateText for handling completions.
    • Supports Codestral models with custom base URL.
  • Testing:
    • Updates mistral.spec.ts to mock new SDK functions and verify behavior.
    • Ensures all 23 tests pass.
  • Dependencies:
    • Adds @ai-sdk/mistral: ^3.0.0 to package.json for AI SDK v6 compatibility.

This description was created by Ellipsis for 3efb7ec.

- Replace @mistralai/mistralai SDK with @ai-sdk/mistral v3.0.0
- Use streamText and generateText from AI SDK v6
- Add comprehensive test suite with 23 tests
- Maintain support for Codestral models with custom base URL

EXT-692
@daniel-lxs requested review from cte, jr and mrubens as code owners · January 29, 2026 22:59
@dosubot added the size:XXL (This PR changes 1000+ lines, ignoring generated files) and Enhancement (New feature or request) labels · Jan 29, 2026
roomote bot (Contributor) commented Jan 29, 2026

Rooviewer · See task on Roo Cloud

Review completed. One new issue identified.

  • Default temperature changed from 1 to 0 due to unreachable MISTRAL_DEFAULT_TEMPERATURE fallback (comment)
  • Missing error handling for AI SDK errors in createMessage - should wrap stream processing in try/catch with handleAiSdkError (comment)
Previous reviews

Mention @roomote in a comment to request specific changes to this pull request or fix all unresolved issues.

Address review comment: ensure temperature defaults to 1 (MISTRAL_DEFAULT_TEMPERATURE) instead of 0 from getModelParams to match original behavior.
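
As a minimal illustration of the fix that commit describes (the option name modelTemperature and the constant shown here are assumptions for the example, not taken from the actual diff): the Mistral-specific default has to be applied to the possibly-undefined user setting itself, rather than to a value that a shared helper has already defaulted to 0, which is what made the original fallback unreachable.

// Assumed constant and option shape, purely for illustration.
const MISTRAL_DEFAULT_TEMPERATURE = 1

interface ProviderOptions {
    modelTemperature?: number // user setting; undefined when not configured
}

// Apply the provider-specific default before any shared helper (e.g. getModelParams)
// substitutes its own generic default of 0.
function resolveTemperature(options: ProviderOptions): number {
    return options.modelTemperature ?? MISTRAL_DEFAULT_TEMPERATURE
}

console.log(resolveTemperature({})) // 1: the Mistral default, not 0
console.log(resolveTemperature({ modelTemperature: 0.3 })) // 0.3: an explicit user value wins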
@dosubot added the lgtm (This PR has been approved by a maintainer) label · Jan 30, 2026
@mrubens (Collaborator) commented Jan 30, 2026

@roomote please fix the compile and knip errors

roomote bot (Contributor) commented Jan 30, 2026

Coderoo · See task on Roo Cloud

Fixed the reported issues. All local checks passed.


@mrubens (Collaborator) commented Feb 1, 2026

@roomote can you resolve the conflicts with main one more time?

roomote bot (Contributor) commented Feb 1, 2026

Fixaroo · See task on Roo Cloud

Resolved merge conflicts with main. All local checks passed.


Comment on lines +161 to +172
// Process the full stream to get all events including reasoning
for await (const part of result.fullStream) {
    for (const chunk of processAiSdkStreamPart(part)) {
        yield chunk
    }
}

return { id, info, maxTokens, temperature }
// Yield usage metrics at the end
const usage = await result.usage
if (usage) {
    yield this.processUsageMetrics(usage)
}

Unlike the other AI SDK providers (Fireworks, Cerebras, DeepSeek, Groq), the Mistral handler is missing error handling for AI SDK errors. The stream processing should be wrapped in a try/catch that uses handleAiSdkError to properly transform errors like AI_RetryError and AI_APICallError into user-friendly messages with preserved status codes. Also add handleAiSdkError to the imports on line 15.

Suggested change

-    // Process the full stream to get all events including reasoning
-    for await (const part of result.fullStream) {
-        for (const chunk of processAiSdkStreamPart(part)) {
-            yield chunk
-        }
-    }
-    return { id, info, maxTokens, temperature }
-    // Yield usage metrics at the end
-    const usage = await result.usage
-    if (usage) {
-        yield this.processUsageMetrics(usage)
-    }
+    try {
+        // Process the full stream to get all events including reasoning
+        for await (const part of result.fullStream) {
+            for (const chunk of processAiSdkStreamPart(part)) {
+                yield chunk
+            }
+        }
+        // Yield usage metrics at the end
+        const usage = await result.usage
+        if (usage) {
+            yield this.processUsageMetrics(usage)
+        }
+    } catch (error) {
+        // Handle AI SDK errors (AI_RetryError, AI_APICallError, etc.)
+        throw handleAiSdkError(error, "Mistral")
+    }

Fix it with Roo Code or mention @roomote and request a fix.
