Describe the bug
Calling `optimize_hparams()` on `MambularClassifier` (and similarly on `MambularRegressor`) raises:

```
TypeError: mambular.models.utils.sklearn_parent.SklearnBase.fit()
got multiple values for keyword argument 'regression'
```
The issue occurs because:
- `SklearnBase.optimize_hparams()` calls `self.fit(..., regression=...)`.
- `SklearnBaseClassifier.fit()` and `SklearnBaseRegressor.fit()` do not declare a `regression` parameter in their signatures.
- They hardcode `regression=False`/`True` when calling `super().fit(...)` and also forward `**trainer_kwargs`.

This results in `regression` being passed twice to `SklearnBase.fit()`, causing the `TypeError`. This makes `optimize_hparams()` unusable for both the classifier and the regressor models.
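For context, here is a minimal, self-contained sketch of the failure pattern. The class and method names below are simplified stand-ins, not the actual mambular source:

```python
# Simplified stand-ins for the real classes; names and signatures are
# illustrative, not the actual mambular code.

class Base:
    def fit(self, X, y, regression=True, **trainer_kwargs):
        print(f"fit called with regression={regression}")

    def optimize_hparams(self, X, y):
        # The base class forwards `regression` explicitly to self.fit(...).
        self.fit(X, y, regression=False)


class Classifier(Base):
    def fit(self, X, y, **trainer_kwargs):
        # `regression` is not declared here, so the value sent by
        # optimize_hparams() lands in **trainer_kwargs ...
        # ... and is then passed a second time as an explicit keyword:
        super().fit(X, y, regression=False, **trainer_kwargs)


Classifier().optimize_hparams(X=None, y=None)
# TypeError: Base.fit() got multiple values for keyword argument 'regression'
```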
To Reproduce
```python
from mambular.models import MambularClassifier

model = MambularClassifier(
    d_model=64,
    n_layers=4,
)

model.optimize_hparams(
    x_train_preprocessed,
    y_train,
)
```

Error:

```
TypeError: mambular.models.utils.sklearn_parent.SklearnBase.fit()
got multiple values for keyword argument 'regression'
```

The same happens with `MambularRegressor`.
Expected behavior
- `optimize_hparams()` should run without raising a `TypeError`.
- `regression` should only be passed once to `SklearnBase.fit()`.
Desktop (please complete the following information):
- OS: Ubuntu (local machine)
- Python version: 3.11
- Torch version: 2.5.1+cu121
- Mambular version: 1.5.0
Additional context
The problem appears to be a mismatch between `SklearnBase.optimize_hparams()` and the `fit()` signatures in `SklearnBaseClassifier` and `SklearnBaseRegressor`, leading to duplicated keyword arguments.
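One possible direction for a fix, sketched against the simplified classes above rather than the real mambular code, is for the subclass `fit()` to drop any `regression` value arriving through `**trainer_kwargs` before re-adding its hardcoded one:

```python
class PatchedClassifier(Base):
    def fit(self, X, y, **trainer_kwargs):
        # Discard a `regression` flag forwarded by optimize_hparams() so the
        # hardcoded value below is the only one reaching Base.fit().
        trainer_kwargs.pop("regression", None)
        super().fit(X, y, regression=False, **trainer_kwargs)


PatchedClassifier().optimize_hparams(X=None, y=None)
# prints: fit called with regression=False
```

An alternative would be for `optimize_hparams()` to stop passing `regression` explicitly, since the subclasses already fix its value; either way the keyword should reach `SklearnBase.fit()` only once.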