Learning Rate schedules (ReduceLROnPlateau, cosine) support #2224
kirillbobyrev started this conversation in General
Most Deep Learning frameworks support learning rate schedulers out of the box (cosine annealing, ReduceLROnPlateau, fixed schedules, etc.). While these are not very hard to implement manually, built-in support would definitely be very nice. Single schedules are not too complicated, but joint schedules and some extra support are definitely useful; see, for example, PyTorch's ReduceLROnPlateau or Jax's Optim schedules.

This is the first time I'm reading Candle's code, so please let me know if there is a better channel/place to ask questions like this.
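For illustration, here is a rough, framework-agnostic sketch (in Rust, since Candle is a Rust crate) of the two schedules named in the title: cosine annealing and a ReduceLROnPlateau-style rule. The struct, function, and parameter names below are made up for this example and are not part of Candle's API; wiring the returned learning rate into an optimizer is left to whatever mechanism the optimizer exposes for updating it.

```rust
use std::f64::consts::PI;

/// Cosine annealing: decay smoothly from `lr_max` to `lr_min` over `total_steps`.
fn cosine_lr(step: usize, total_steps: usize, lr_min: f64, lr_max: f64) -> f64 {
    let t = (step as f64 / total_steps as f64).min(1.0);
    lr_min + 0.5 * (lr_max - lr_min) * (1.0 + (PI * t).cos())
}

/// ReduceLROnPlateau-style rule: multiply the learning rate by `factor`
/// after `patience` consecutive epochs without improvement of the metric.
/// (Illustrative only; not a Candle type.)
struct ReduceLrOnPlateau {
    lr: f64,
    factor: f64,
    patience: usize,
    min_lr: f64,
    best: f64,
    bad_epochs: usize,
}

impl ReduceLrOnPlateau {
    fn new(lr: f64, factor: f64, patience: usize, min_lr: f64) -> Self {
        Self { lr, factor, patience, min_lr, best: f64::INFINITY, bad_epochs: 0 }
    }

    /// Call once per epoch with the validation loss; returns the learning
    /// rate to use for the next epoch.
    fn step(&mut self, metric: f64) -> f64 {
        if metric < self.best {
            self.best = metric;
            self.bad_epochs = 0;
        } else {
            self.bad_epochs += 1;
            if self.bad_epochs > self.patience {
                self.lr = (self.lr * self.factor).max(self.min_lr);
                self.bad_epochs = 0;
            }
        }
        self.lr
    }
}

fn main() {
    // Cosine schedule over 10 steps.
    for step in 0..=10 {
        println!("cosine step {step}: lr = {:.2e}", cosine_lr(step, 10, 1e-6, 1e-3));
    }

    // Plateau schedule on fake validation losses that stop improving.
    let mut sched = ReduceLrOnPlateau::new(1e-3, 0.1, 2, 1e-6);
    for loss in [0.9, 0.7, 0.6, 0.61, 0.62, 0.63, 0.64] {
        println!("val loss {loss:.2}: next lr = {:.2e}", sched.step(loss));
    }
}
```

Returning the new learning rate from `step` (rather than mutating an optimizer directly) keeps the sketch decoupled from any particular optimizer API, which seems relevant given the refactoring discussed in the reply below.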
Replies: 1 comment 1 reply

@kirillbobyrev, I opened #2225 which refactors the structure to allow this to be implemented.