deprecation: Is `frequency` key necessary in `lr_scheduler_config`? #20714
Outline & Motivation
The `lr_scheduler_config` contains a `frequency` key, as shown below.
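A minimal sketch of a `configure_optimizers` hook that sets this key (the model, optimizer, and hyperparameter values here are illustrative):

```python
import torch
from lightning.pytorch import LightningModule
from torch.optim.lr_scheduler import StepLR


class MyModel(LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(4, 1)

    def configure_optimizers(self):
        optimizer = torch.optim.SGD(self.parameters(), lr=0.1)
        return {
            "optimizer": optimizer,
            "lr_scheduler": {
                "scheduler": StepLR(optimizer, step_size=2, gamma=0.1),
                "interval": "epoch",  # unit in which `frequency` is counted
                "frequency": 2,       # call scheduler.step() only every 2nd epoch
            },
        }
```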
The frequency of the learning-rate decay is already handled by the `torch.optim.lr_scheduler` schedulers themselves. For example, in `StepLR` it is handled by the `step_size` parameter, as follows:
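A minimal sketch, with a placeholder optimizer:

```python
import torch
from torch.optim.lr_scheduler import StepLR

optimizer = torch.optim.SGD([torch.nn.Parameter(torch.zeros(1))], lr=0.1)
# Multiplies the learning rate by `gamma` once every `step_size`
# calls to scheduler.step().
scheduler = StepLR(optimizer, step_size=2, gamma=0.1)
```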
This means that `{..., 'frequency': 2, ...}` with `step_size=2` is equivalent to `{..., 'frequency': 1, ...}` with `step_size=4`, as the sketch below demonstrates.
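A standalone sketch of this equivalence (the dummy parameters and the 8-epoch loop are illustrative; the `if epoch % 2` branch mimics what `frequency` does inside the Trainer):

```python
import torch
from torch.optim.lr_scheduler import StepLR

opt_a = torch.optim.SGD([torch.nn.Parameter(torch.zeros(1))], lr=1.0)
opt_b = torch.optim.SGD([torch.nn.Parameter(torch.zeros(1))], lr=1.0)

sched_a = StepLR(opt_a, step_size=2, gamma=0.1)  # paired with frequency=2
sched_b = StepLR(opt_b, step_size=4, gamma=0.1)  # paired with frequency=1

for epoch in range(1, 9):
    opt_a.step()
    opt_b.step()
    if epoch % 2 == 0:  # what `frequency: 2` does: step only every 2nd epoch
        sched_a.step()
    sched_b.step()      # what `frequency: 1` does: step every epoch
    print(epoch, sched_a.get_last_lr(), sched_b.get_last_lr())

# Both learning rates decay to 0.1 at epoch 4 and to 0.01 at epoch 8.
```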
Is there any case where the `frequency` key would be beneficial?

Pitch
No response
Additional context
No response
cc @lantiga @justusschock