Benchmarking algorithms of various families #68
This can be inspired by the trivial example https://github.com/openopt/chop/blob/master/examples/optim_dynamics.py
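A benchmark harness along these lines could look like the following sketch. Everything here (the `run_solver` helper, the toy quadratic, the step functions) is an illustrative assumption of mine, not the chop or copt API:

```python
# Minimal benchmarking harness sketch: run each solver on the same problem
# and record the objective value per iteration. All names are hypothetical.
import time
import numpy as np

def run_solver(step_fn, x0, n_iters=200):
    """Run an iterative solver, recording the objective value at each step."""
    x = x0.copy()
    trace = []
    start = time.perf_counter()
    for _ in range(n_iters):
        x, fval = step_fn(x)
        trace.append(fval)
    elapsed = time.perf_counter() - start
    return np.array(trace), elapsed

# Toy problem: f(x) = 0.5 * ||Ax - b||^2, a random least-squares instance.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 10))
b = rng.standard_normal(50)
L = np.linalg.norm(A, 2) ** 2  # Lipschitz constant of the gradient

def gd_step(x, lr):
    """One gradient-descent step; returns the new iterate and its objective."""
    grad = A.T @ (A @ x - b)
    x = x - lr * grad
    return x, 0.5 * np.linalg.norm(A @ x - b) ** 2

# Compare two step sizes on identical traces.
trace_fast, t_fast = run_solver(lambda x: gd_step(x, 1.0 / L), np.zeros(10))
trace_slow, t_slow = run_solver(lambda x: gd_step(x, 0.1 / L), np.zeros(10))
```

Plotting `trace_fast` and `trace_slow` against iteration count (or against elapsed time) gives the usual convergence-curve comparison.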
It would also be really nice to benchmark algorithms on nonconvex problems. Do the usual practical acceleration methods (backtracking line search) work here?
Backtracking does work (in theory and in practice) on non-convex problems.
I meant: does it work for accelerating convergence in practice on non-convex problems?
Yes
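For concreteness, here is a textbook sketch of gradient descent with Armijo backtracking applied to a nonconvex objective (the Rosenbrock function). This is my own illustration, not the library's implementation:

```python
# Backtracking (Armijo) line search requires no convexity assumption:
# it only needs a descent direction and a sufficient-decrease test.
import numpy as np

def backtracking_gd(f, grad, x0, t0=1.0, beta=0.5, c=1e-4, n_iters=100):
    """Gradient descent with Armijo backtracking line search."""
    x = x0.copy()
    for _ in range(n_iters):
        g = grad(x)
        t = t0
        # Shrink the step until the Armijo sufficient-decrease condition holds.
        while f(x - t * g) > f(x) - c * t * (g @ g):
            t *= beta
        x = x - t * g
    return x

# Nonconvex test function: Rosenbrock, with global minimum at (1, 1).
f = lambda x: (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2
grad = lambda x: np.array([
    -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0] ** 2),
    200 * (x[1] - x[0] ** 2),
])

x = backtracking_gd(f, grad, np.array([-1.0, 1.0]), n_iters=5000)
```

Each accepted step satisfies the Armijo condition, so the objective decreases monotonically even though the function is nonconvex; whether this *accelerates* convergence relative to a fixed step is exactly what a benchmark would measure.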
(as long as we're talking about deterministic algorithms; the top comment makes this confusing)
It would be nice to have an example comparing the convergence speed of SAGA/SVRG/SFW on problems attaining the same optimum.
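Such a comparison could be sketched as follows: textbook SAGA and SVRG implementations (my own, not copt's solvers) run on one shared least-squares instance, so both methods target the same optimum:

```python
# Compare SAGA and SVRG on a single finite-sum least-squares problem
# f(x) = (1/n) * sum_i 0.5 * (a_i @ x - b_i)^2. Illustrative sketch only.
import numpy as np

rng = np.random.default_rng(0)
n, d = 100, 10
A = rng.standard_normal((n, d))
b = rng.standard_normal(n)
f = lambda x: 0.5 * np.mean((A @ x - b) ** 2)
# Conservative step size from the largest per-sample smoothness constant.
lr = 1.0 / (3 * np.max(np.sum(A ** 2, axis=1)))

def saga(n_steps=5000):
    x = np.zeros(d)
    table = (A @ x - b)[:, None] * A  # stored per-sample gradients
    avg = table.mean(axis=0)
    for _ in range(n_steps):
        j = rng.integers(n)
        g_new = (A[j] @ x - b[j]) * A[j]
        x = x - lr * (g_new - table[j] + avg)   # variance-reduced step
        avg += (g_new - table[j]) / n           # keep the running mean exact
        table[j] = g_new
    return x

def svrg(n_epochs=50, m=100):
    x = np.zeros(d)
    for _ in range(n_epochs):
        snap = x.copy()
        full = A.T @ (A @ snap - b) / n         # full gradient at the snapshot
        for _ in range(m):
            j = rng.integers(n)
            g = (A[j] @ x - b[j]) * A[j]
            g_snap = (A[j] @ snap - b[j]) * A[j]
            x = x - lr * (g - g_snap + full)    # variance-reduced step
    return x

x_star, *_ = np.linalg.lstsq(A, b, rcond=None)  # shared optimum for reference
x_saga, x_svrg = saga(), svrg()
```

Recording `f` along the iterates of each method (per epoch, or per gradient evaluation) would give the convergence curves to compare; SFW would need a constrained variant of the problem, so it is omitted here.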