
deprecation: Is frequency key necessary in lr_scheduler_config? #20714

Open
adosar opened this issue Apr 13, 2025 · 0 comments
adosar commented Apr 13, 2025

Outline & Motivation

The lr_scheduler_config contains a frequency key as shown below:

lr_scheduler_config = {
    # REQUIRED: The scheduler instance
    "scheduler": lr_scheduler,
    # The unit of the scheduler's step size; could also be 'step'.
    # 'epoch' updates the scheduler at the end of each epoch, whereas 'step'
    # updates it after an optimizer update.
    "interval": "epoch",
    # How many epochs/steps should pass between calls to
    # `scheduler.step()`. 1 corresponds to updating the learning
    # rate after every epoch/step.
    "frequency": 1,
    # Metric to monitor for schedulers like `ReduceLROnPlateau`
    "monitor": "val_loss",
    # If set to `True`, will enforce that the value specified in 'monitor'
    # is available when the scheduler is updated, thus stopping
    # training if not found. If set to `False`, it will only produce a warning.
    "strict": True,
    # If using the `LearningRateMonitor` callback to monitor the
    # learning rate progress, this keyword can be used to specify
    # a custom logged name
    "name": None,
}
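
For reference, this config is typically returned from configure_optimizers together with the optimizer. A minimal sketch (the module and hyperparameters here are placeholders, not part of the original report):

import lightning as L
from torch import optim


class LitModel(L.LightningModule):
    # Hypothetical module: only configure_optimizers is shown.
    def configure_optimizers(self):
        optimizer = optim.SGD(self.parameters(), lr=0.05)
        lr_scheduler_config = {
            "scheduler": optim.lr_scheduler.StepLR(optimizer, step_size=30, gamma=0.1),
            "interval": "epoch",
            "frequency": 1,
        }
        return {"optimizer": optimizer, "lr_scheduler": lr_scheduler_config}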

The frequency of the learning rate decay is already handled by the torch.optim.lr_scheduler schedulers themselves. For example, in StepLR it is controlled by the step_size parameter, as shown below:

# Assuming optimizer uses lr = 0.05 for all groups
# lr = 0.05     if epoch < 30
# lr = 0.005    if 30 <= epoch < 60
# lr = 0.0005   if 60 <= epoch < 90
# ...
scheduler = StepLR(optimizer, step_size=30, gamma=0.1)
for epoch in range(100):
    train(...)
    validate(...)
    scheduler.step()

meaning that {..., 'frequency': 2, ...} with step_size=2 is equivalent to {..., 'frequency': 1, ...} with step_size=4: in both cases the learning rate is decayed every 4 epochs.
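
To make that concrete, here is a small standalone sketch (not Lightning code) that simulates calling scheduler.step() once every `frequency` epochs, as the config comment above describes, and checks that the two settings produce the same learning-rate trajectory:

import torch
from torch import nn, optim
from torch.optim.lr_scheduler import StepLR


def lr_trajectory(frequency, step_size, epochs=12):
    # A single dummy parameter so the optimizer has something to manage.
    param = nn.Parameter(torch.zeros(1))
    optimizer = optim.SGD([param], lr=0.05)
    scheduler = StepLR(optimizer, step_size=step_size, gamma=0.1)
    lrs = []
    for epoch in range(1, epochs + 1):
        if epoch % frequency == 0:  # step the scheduler every `frequency` epochs
            scheduler.step()
        lrs.append(optimizer.param_groups[0]["lr"])
    return lrs


# Both settings decay the learning rate every 4 epochs.
assert lr_trajectory(frequency=2, step_size=2) == lr_trajectory(frequency=1, step_size=4)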

Is there any case where the frequency key would be beneficial?

Pitch

No response

Additional context

No response

cc @lantiga @justusschock

@adosar added the needs triage (Waiting to be triaged by maintainers) and refactor labels on Apr 13, 2025