feat: Remove the `parameters` attribute from `EvaluatorLLMArgs`, allowing extension #381
shijinpjlab wants to merge 2 commits into MigoXLab:dev from
Conversation
Code Review
This pull request refactors the EvaluatorLLMArgs configuration by removing the explicit parameters dictionary and enabling Pydantic's extra="allow" mode, allowing extra fields to be accessed via model_extra. This change is propagated across various agents, evaluators, and examples to simplify the configuration structure. A critical issue was identified in base_openai.py where passing **extra_params directly to the OpenAI client could cause a TypeError if non-API parameters (such as agent_config or threshold) are present. Additionally, this change removes previous default values for LLM parameters like temperature and max_tokens, which could lead to inconsistent model behavior.
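As a sketch of the mechanism being adopted (assuming Pydantic v2; the real `EvaluatorLLMArgs` may declare different fields), `extra="allow"` keeps any undeclared fields passed to the model and exposes them through `model_extra`:

```python
from pydantic import BaseModel, ConfigDict


class EvaluatorLLMArgs(BaseModel):
    # extra="allow" keeps undeclared fields instead of rejecting them
    model_config = ConfigDict(extra="allow")
    key: str = ""  # illustrative declared field; the real class may differ


# temperature and threshold are not declared, so they land in model_extra
args = EvaluatorLLMArgs(key="sk-...", temperature=0.3, threshold=0.75)
print(args.model_extra)  # {'temperature': 0.3, 'threshold': 0.75}
```

Note that `model_extra` makes no distinction between LLM API parameters (`temperature`) and evaluator settings (`threshold`), which is the root of the issue discussed below.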
```diff
+        extra_params = cls.dynamic_config.model_extra
+        cls.validate_config(extra_params)

         completions = cls.client.chat.completions.create(
             model=model_name,
             messages=messages,
-            temperature=params.get("temperature", 0.3) if params else 0.3,
-            top_p=params.get("top_p", 1) if params else 1,
-            max_tokens=params.get("max_tokens", 4000) if params else 4000,
-            presence_penalty=params.get("presence_penalty", 0) if params else 0,
-            frequency_penalty=params.get("frequency_penalty", 0) if params else 0,
+            **extra_params,
         )
```
Using `**extra_params` directly in `chat.completions.create` is problematic for two reasons:

- TypeError risk: `model_extra` contains all additional fields from the configuration, including evaluator-specific settings like `threshold`, `agent_config`, `min_difficulty`, etc. Passing these unknown keyword arguments to the OpenAI client will raise a `TypeError` and crash the evaluation. This is especially critical for evaluators like `LLMInstructionClarity` or `LLMTaskDifficulty`, which store their own parameters in the same config object.
- Loss of defaults: this change removes the previous default values for `temperature` (0.3), `top_p` (1), and `max_tokens` (4000). If these are not explicitly provided in the configuration, the OpenAI client will use its own defaults (e.g., `temperature=1.0`), which may lead to inconsistent or lower-quality results compared to previous versions.
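The TypeError risk can be reproduced in miniature without the OpenAI SDK; `fake_create` and the sample values below are illustrative stand-ins, not the real client API:

```python
# fake_create mimics a client method with a fixed keyword-only signature,
# like chat.completions.create: unknown kwargs are rejected immediately.
def fake_create(*, model, messages, temperature=1.0):
    return {"model": model, "temperature": temperature}


# model_extra holds API params *and* evaluator settings side by side
extra_params = {"temperature": 0.3, "threshold": 0.75}

try:
    fake_create(model="gpt-4o", messages=[], **extra_params)
except TypeError as exc:
    print(exc)  # fake_create() got an unexpected keyword argument 'threshold'
```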
It is safer to explicitly extract the supported LLM parameters with their defaults, ensuring compatibility and maintaining consistent behavior, similar to the implementation in `AgentWrapper.get_openai_llm_from_dingo_config`:
```python
extra_params = cls.dynamic_config.model_extra or {}
cls.validate_config(extra_params)

completions = cls.client.chat.completions.create(
    model=model_name,
    messages=messages,
    temperature=extra_params.get("temperature", 0.3),
    top_p=extra_params.get("top_p", 1),
    max_tokens=extra_params.get("max_tokens", 4000),
    presence_penalty=extra_params.get("presence_penalty", 0),
    frequency_penalty=extra_params.get("frequency_penalty", 0),
)
```
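Going one step further, the extraction could be factored into a small helper so the whitelist of forwarded parameters lives in one place. This is only a sketch: `split_llm_params` is a hypothetical name, and the defaults are taken from the snippet above.

```python
# Whitelisted OpenAI sampling parameters and their previous defaults
OPENAI_PARAM_DEFAULTS = {
    "temperature": 0.3,
    "top_p": 1,
    "max_tokens": 4000,
    "presence_penalty": 0,
    "frequency_penalty": 0,
}


def split_llm_params(model_extra):
    """Split model_extra into (api_kwargs, leftover).

    api_kwargs contains only whitelisted OpenAI parameters, filled with
    defaults when absent; leftover keeps evaluator-specific settings.
    """
    extra = dict(model_extra or {})
    api_kwargs = {k: extra.pop(k, d) for k, d in OPENAI_PARAM_DEFAULTS.items()}
    return api_kwargs, extra


api_kwargs, leftover = split_llm_params({"temperature": 0.7, "threshold": 0.5})
print(api_kwargs["temperature"])  # 0.7
print(leftover)  # {'threshold': 0.5}
```

The call site would then pass `**api_kwargs` to `chat.completions.create` and keep `leftover` for evaluator logic, so adding a new API parameter only touches the whitelist.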