
feat: Remove the parameters attribute from EvaluatorLLMArgs and allow extension (#381)

Open
shijinpjlab wants to merge 2 commits into MigoXLab:dev from shijinpjlab:dev_0401_config

Conversation

@shijinpjlab
Collaborator

No description provided.


@gemini-code-assist bot left a comment


Code Review

This pull request refactors the EvaluatorLLMArgs configuration by removing the explicit parameters dictionary and enabling Pydantic's extra="allow" mode, allowing extra fields to be accessed via model_extra. This change is propagated across various agents, evaluators, and examples to simplify the configuration structure. A critical issue was identified in base_openai.py where passing **extra_params directly to the OpenAI client could cause a TypeError if non-API parameters (such as agent_config or threshold) are present. Additionally, this change removes previous default values for LLM parameters like temperature and max_tokens, which could lead to inconsistent model behavior.
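The `extra="allow"` mechanism the review refers to can be sketched as follows (a minimal illustration assuming Pydantic v2; the field and class names are simplified, not the real `EvaluatorLLMArgs` schema):

```python
from pydantic import BaseModel, ConfigDict


class LLMArgs(BaseModel):
    """Illustrative stand-in for EvaluatorLLMArgs after the refactor."""
    model_config = ConfigDict(extra="allow")  # accept undeclared fields
    model: str


# Undeclared fields no longer raise a validation error; they are
# collected into model_extra instead of a dedicated parameters dict.
args = LLMArgs(model="gpt-4o", temperature=0.3, threshold=0.8)
extra = args.model_extra  # {'temperature': 0.3, 'threshold': 0.8}
```

This is exactly why the review flags the risk below: `model_extra` mixes LLM sampling parameters (`temperature`) with evaluator-only settings (`threshold`) in one dict.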

Comment on lines +85 to 92
```python
extra_params = cls.dynamic_config.model_extra
cls.validate_config(extra_params)

completions = cls.client.chat.completions.create(
    model=model_name,
    messages=messages,
    temperature=params.get("temperature", 0.3) if params else 0.3,
    top_p=params.get("top_p", 1) if params else 1,
    max_tokens=params.get("max_tokens", 4000) if params else 4000,
    presence_penalty=params.get("presence_penalty", 0) if params else 0,
    frequency_penalty=params.get("frequency_penalty", 0) if params else 0,
    **extra_params,
)
```
Contributor


Severity: high

Using **extra_params directly in chat.completions.create is problematic for two reasons:

  1. TypeError Risk: model_extra contains all additional fields from the configuration, including evaluator-specific settings like threshold, agent_config, min_difficulty, etc. Passing these unknown keyword arguments to the OpenAI client will cause a TypeError and crash the evaluation. This is especially critical for evaluators like LLMInstructionClarity or LLMTaskDifficulty that store their own parameters in the same config object.
  2. Loss of Defaults: This change removes the previous default values for temperature (0.3), top_p (1), and max_tokens (4000). If these are not explicitly provided in the configuration, the OpenAI client will use its own defaults (e.g., temperature=1.0), which may lead to inconsistent or lower-quality results compared to previous versions.

It is safer to explicitly extract the supported LLM parameters with their defaults, ensuring compatibility and maintaining consistent behavior, similar to the implementation in AgentWrapper.get_openai_llm_from_dingo_config.

```python
extra_params = cls.dynamic_config.model_extra or {}
cls.validate_config(extra_params)

completions = cls.client.chat.completions.create(
    model=model_name,
    messages=messages,
    temperature=extra_params.get("temperature", 0.3),
    top_p=extra_params.get("top_p", 1),
    max_tokens=extra_params.get("max_tokens", 4000),
    presence_penalty=extra_params.get("presence_penalty", 0),
    frequency_penalty=extra_params.get("frequency_penalty", 0),
)
```
