Comments (10)

marcopeix commented on August 25, 2024

Hello! Can you send a code snippet of OneCycleLR that works and one for ReduceLROnPlateau that reproduces the error, so I can run them on my end? Thanks!

BrunoBelucci commented on August 25, 2024

Hello,

To the best of my knowledge, it is not currently possible to use ReduceLROnPlateau out-of-the-box with the existing implementation of neuralforecast. As mentioned in this comment, this would require the user to provide a "lr_scheduler_config," which is not directly supported at this time. This "lr_scheduler_config" should be returned by the configure_optimizers method in Lightning (refer to this documentation for more details).

In the "lr_scheduler_config," you need to specify the "monitor" parameter with the metric you're tracking to detect the plateau. Typically, the "interval" is also set to "epoch" because the plateau is usually identified per epoch rather than per step, though this may vary depending on your model.

Currently, I am not using neuralforecast with ReduceLROnPlateau, so I cannot offer an immediate solution. However, based on our previous discussion in #998, I believe the simplest workaround for now is to monkey-patch the configure_optimizers function of your model to return a dictionary as expected by Lightning. This dictionary should include an "optimizer" key and, optionally, a "lr_scheduler" key, which can be a single LR scheduler or a lr_scheduler_config (as specified in the documentation).
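
For illustration, here is a minimal sketch of such a monkey-patch. It assumes an already instantiated model (here called model) and that the model logs a metric named "train_loss"; both are assumptions for illustration, not verified against the library.

import types
import torch
from torch.optim.lr_scheduler import ReduceLROnPlateau

def patched_configure_optimizers(self):
    # Build the optimizer and the plateau scheduler.
    optimizer = torch.optim.Adam(self.parameters(), lr=1e-3)
    scheduler = ReduceLROnPlateau(optimizer, mode='min', factor=0.5, patience=2)
    # Return the lr_scheduler_config dictionary that Lightning expects.
    return {
        'optimizer': optimizer,
        'lr_scheduler': {
            'scheduler': scheduler,
            'monitor': 'train_loss',  # assumed metric name; use whatever your model actually logs
            'interval': 'epoch',      # check for a plateau once per epoch
        },
    }

# Bind the patched method to the existing model instance.
model.configure_optimizers = types.MethodType(patched_configure_optimizers, model)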

If you do not manage to solve this, you can provide a minimal (non-)working example and I will see if I can fix it for you in my free time.

Regards,
Bruno

JQGoh commented on August 25, 2024

@MLfreakPy
The current implementation does not work for ReduceLROnPlateau, which requires a monitor to be specified.

Nevertheless, you could check out this branch (#1015) and install the dev version of neuralforecast. This allows you to call the set_configure_optimizers function to gain full control over the configure_optimizers-related settings.
Note that I provided an example of ReduceLROnPlateau in
https://app.reviewnb.com/Nixtla/neuralforecast/pull/1015/
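
For reference, a usage sketch on that branch might look like the following; the exact signature of set_configure_optimizers is defined in PR #1015, so treat the call below as an assumption rather than the final API.

import torch
from torch.optim.lr_scheduler import ReduceLROnPlateau

# Assumes `model` is an instantiated neuralforecast model from the dev branch.
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
scheduler = ReduceLROnPlateau(optimizer, mode='min', factor=0.5, patience=2)

# `monitor` tells Lightning which logged metric to watch for the plateau.
model.set_configure_optimizers(optimizer=optimizer, scheduler=scheduler, monitor='train_loss')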

MLfreakPy commented on August 25, 2024

Thank you very much @JQGoh!! I will use this dev version and your example for the implementation 👍 This flexibility in configuring the optimizers is amazing :)

MLfreakPy commented on August 25, 2024

Maybe @JQGoh, @jmoralez, or @BrunoBelucci have some idea of how to solve it?

MLfreakPy commented on August 25, 2024

Thank you so much for your explanations and for looking into how to resolve the issue!!

Below is a code snippet that (a) works with OneCycleLR and (b) reproduces the error with ReduceLROnPlateau.

# GENERAL + DATA-PROCESSING
!pip install neuralforecast
from neuralforecast import NeuralForecast
from neuralforecast.models import NHITS
from neuralforecast.utils import AirPassengersDF

Y_df = AirPassengersDF # Defined in neuralforecast.utils
Y_df.head()

a. - OneCycleLR

from torch.optim.lr_scheduler import OneCycleLR

lr_scheduler_cls = OneCycleLR
max_lr = 1e-2
lr_scheduler_kwargs = {
    'max_lr': max_lr,
    'total_steps': 50}

horizon = 12

models = [NHITS(h=horizon,
                input_size=2 * horizon,
                max_steps=50,
                lr_scheduler=lr_scheduler_cls,
                lr_scheduler_kwargs=lr_scheduler_kwargs)]
nf = NeuralForecast(models=models, freq='M')
nf.fit(df=Y_df)

b. - ReduceLROnPlateau

from torch.optim.lr_scheduler import ReduceLROnPlateau

lr_scheduler_cls = ReduceLROnPlateau
lr_scheduler_kwargs = {
    'mode': 'min',
    'factor': 0.5,
    'patience': 2}

horizon = 12

models = [NHITS(h=horizon,
                input_size=2 * horizon,
                max_steps=50,
                lr_scheduler=lr_scheduler_cls,
                lr_scheduler_kwargs=lr_scheduler_kwargs)]
nf = NeuralForecast(models=models, freq='M')
nf.fit(df=Y_df)

ERROR MESSAGE

/usr/local/lib/python3.10/dist-packages/pytorch_lightning/core/optimizer.py in _configure_schedulers_automatic_opt(schedulers, monitor)
    275 )
    276 if scheduler["reduce_on_plateau"] and scheduler.get("monitor", None) is None:
--> 277     raise MisconfigurationException(
    278         "The lr scheduler dict must include a monitor when a ReduceLROnPlateau scheduler is used."
    279         ' For example: {"optimizer": optimizer, "lr_scheduler":'

MisconfigurationException: The lr scheduler dict must include a monitor when a ReduceLROnPlateau scheduler is used. For example: {"optimizer": optimizer, "lr_scheduler": {"scheduler": scheduler, "monitor": "your_loss"}}

MLfreakPy commented on August 25, 2024

@JQGoh I tried to use set_configure_optimizers as you described. For that, I (tried to) install the developer version of neuralforecast as per below. Maybe I am doing this wrong: I cannot see the set_configure_optimizers method on the imported neuralforecast models, nor use it to implement ReduceLROnPlateau. Do you have an idea where the problem resides?

# install the dev version of neuralforecast (not yet successful?)
!git clone https://github.com/Nixtla/neuralforecast.git
%cd neuralforecast
!pip install -e .

# attempt to configure ReduceLROnPlateau via set_configure_optimizers
import torch
from torch.optim.lr_scheduler import ReduceLROnPlateau

# model and learning_rate are defined elsewhere in the notebook
optimizer = torch.optim.Adam(model.parameters(), lr=learning_rate)
scheduler = ReduceLROnPlateau(optimizer=optimizer, mode='min', factor=0.5, patience=2)

model.set_configure_optimizers(optimizer=optimizer, scheduler=scheduler, monitor="train_loss")

JQGoh commented on August 25, 2024


@MLfreakPy
I just realized that I missed a few details.
That PR was my attempt to support more fully configurable optimizers; you will need to clone my forked repository https://github.com/JQGoh/neuralforecast and check out the branch feat/modify-config-optimizers.

But do take note that my forked repository does not always stay in sync with the original repository's master branch, so if you want the latest updates to the neuralforecast library, it is better to copy the key changes detailed in #1015 into your local copy of the source code.

@jmoralez
FYI, the last time we spoke, it was about revisiting the PR once we start working towards v2.0. I am reconsidering whether that work is still valuable for the community before v2.0. I admit that the interface/user experience is questionable, since we have four parameters (e.g. optimizer) specified on the model and, on top of that, we can override that behavior via the set_configure_optimizers function. But I shall leave it to the Nixtla team to decide when and how we should support this in the future.

jmoralez commented on August 25, 2024

Hey. I think we can introduce that method and deprecate the arguments.

MLfreakPy commented on August 25, 2024

@jmoralez That sounds great 😊
