Parameter max_epochs isn't respected from history #674
Comments
Thank you for raising this issue! I agree the current behavior of `max_epochs` can be surprising when training is continued. For comparison, here is how scikit-learn handles continued training with `warm_start` on a `RandomForestRegressor`:

```python
from sklearn.ensemble import RandomForestRegressor
from sklearn.datasets import make_regression

X, y = make_regression(n_features=4, n_informative=2,
                       random_state=0, shuffle=False)
regr = RandomForestRegressor(n_estimators=100,
                             max_depth=2, random_state=0,
                             warm_start=True)
regr.fit(X, y)
len(regr.estimators_)
# 100

regr.fit(X, y)
# UserWarning: Warm-start fitting without increasing n_estimators does not fit new trees.
```

In other words, scikit-learn treats `n_estimators` as a total budget: fitting again without increasing it does not add new trees, and it warns about that.
Would changing this behavior break something somewhere else? (See `skorch/net.py`, lines 714 to 724 at commit 7a84568.)
I'm thinking along the lines of making the iteration check in `fit_loop` compare against the number of epochs already recorded in the history, rather than always running `max_epochs` fresh epochs. What are your thoughts on this?
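To make that idea concrete, here is a minimal toy sketch of the proposed semantics, with a plain list standing in for skorch's `History` (the function name and structure are illustrative, not skorch's actual code):

```python
def fit_loop_capped(history, max_epochs):
    """Run epochs only until the history holds max_epochs entries in total."""
    while len(history) < max_epochs:
        # ... one epoch of training would happen here ...
        history.append({'epoch': len(history) + 1})
    return history

history = fit_loop_capped([], max_epochs=5)       # trains 5 epochs
history = fit_loop_capped(history, max_epochs=5)  # no-op: cap already reached
assert history[-1]['epoch'] == 5
```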
This change is technically possible. Since this is a backward-incompatible change, we have to decide if it is worth a deprecation cycle.
Yes, we have a few divergences from sklearn here. On the one hand, as you said, the epoch counter in the fit loop always starts from 0, even if you've already cycled through a few epochs. Relatedly, calling `partial_fit` (or `fit` with `warm_start=True`) always trains for another `max_epochs`, regardless of what is already recorded in the history.

I think in this case, we should think pragmatically. A few design choices in skorch diverge from sklearn because training a neural net is often a different beast than training a "regular" estimator. Personally, the most common use case I can think of where this behavior matters is when I'm still in an exploratory phase: training the net for a few epochs, then cancelling with ctrl+c, then continuing training. With the proposed cap, that follow-up training would stop immediately once `max_epochs` epochs are already in the history.

So before making a change and going through an expensive deprecation cycle, as Thomas mentioned, we should collect the most relevant use cases and determine how a change would affect them. In this specific case, I believe ergonomics should trump sklearn compatibility.
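For illustration, here is a sketch of that exploratory workflow (the toy data and module are assumptions; skorch catches `KeyboardInterrupt` during `fit`, so the net stays usable after cancelling):

```python
import numpy as np
import torch
from skorch import NeuralNetRegressor

# Toy regression data and a trivially small module, just for illustration.
X = np.random.randn(200, 4).astype(np.float32)
y = np.random.randn(200, 1).astype(np.float32)

net = NeuralNetRegressor(torch.nn.Linear(4, 1), max_epochs=20, lr=0.01)
net.fit(X, y)          # press ctrl+c after a few epochs; training stops
                       # gracefully and the net remains usable
net.partial_fit(X, y)  # current behavior: trains another 20 epochs;
                       # with the proposed cap, this could become a no-op
```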
As odd as it might sound to you guys, I'm actually not after sklearn compatibility here. The idea of having a parameter named `max_epochs` led me to expect a hard upper limit on the total number of epochs trained. But I really appreciate how fast you managed to get to this issue! I hope you discuss this through and are able to determine what's best for the library in this respect.
@karmus89 Can you elaborate on which problem you expected `max_epochs` to solve?
@githubnemo As indicated by the earlier discussion, I expected `max_epochs` to define the maximum number of epochs a model can be trained for in total, so that continued training would never push the recorded epoch count past it.
I agree that the name can be confusing in this context. I'm open to discussing a rename or a change to the implementation, but backwards compatibility is a big factor here. The benefit needs to be big enough compared to the potential cost.
Looking at the code of the `NeuralNet.fit_loop` function (`skorch/net.py`, line 714 at commit 7a84568): from what I understand, `max_epochs` is actually applied once per call to `fit`. With early stopping and continued training, however, what I'd expect is that the model is trained for at most `max_epochs` in total. In other words, and in my opinion, the training should not proceed further than what has been defined by `max_epochs` (i.e. `net.history[-1].epoch` should never exceed `net.max_epochs`).
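A minimal sketch reproducing the behavior described above (the toy data and module are assumptions, not taken from the issue):

```python
import numpy as np
import torch
from skorch import NeuralNetRegressor

X = np.random.randn(100, 4).astype(np.float32)
y = np.random.randn(100, 1).astype(np.float32)

net = NeuralNetRegressor(torch.nn.Linear(4, 1), max_epochs=5, lr=0.01)
net.fit(X, y)
print(net.history[-1, 'epoch'])  # 5

# Continuing training runs another full max_epochs, so the epoch
# recorded in the history ends up exceeding max_epochs itself:
net.partial_fit(X, y)
print(net.history[-1, 'epoch'])  # 10, even though max_epochs == 5
```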