
Extensible and Fault-Tolerant Hyperparameter Management

HParams is a thoughtful approach to configuration management for machine learning projects. It enables you to externalize your hyperparameters into a configuration file. In doing so, you can reproduce experiments, iterate quickly, and reduce errors.

Features:

  • Approachable and easy-to-use API
  • Battle-tested over three years
  • Fast with little to no runtime overhead (< 3e-05 seconds) per configured function
  • Robust to most use cases with 100% test coverage and 75 tests
  • Lightweight with only one dependency


Logo by Chloe Yeo, Corporate Sponsorship by WellSaid Labs

Installation

Make sure you have Python 3. You can then install hparams using pip:

pip install hparams

Install the latest code via:

pip install git+https://github.com/PetrochukM/HParams.git

Oops 🐛

With HParams, you will avoid common but avoidable hyperparameter mistakes. It throws a warning or error if:

  • A hyperparameter is overwritten.
  • A hyperparameter is declared but not set.
  • A hyperparameter is set but not declared.
  • A hyperparameter type is incorrect.

Finally, HParams is built with developer experience in mind. HParams includes 13 errors and 6 warnings to help catch and resolve issues quickly.
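
For example, here is a minimal sketch of the type check catching a misconfigured hyperparameter (the exact warning or error raised comes from the library):

# main.py
from hparams import configurable, add_config, HParams, HParam

@configurable
def train(batch_size=HParam(int)):
    pass

# `batch_size` is declared as an `int`, so configuring it with a `str`
# should be flagged by HParams.
add_config({'main': {'train': HParams(batch_size='32')}})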

Examples

Add HParams to your project by following one of these common use cases:

Configure Training 🤗

Configure your training run, like so:

# main.py
from hparams import configurable, add_config, HParams, HParam
from typing import Union

@configurable
def train(batch_size: Union[int, HParam] = HParam(int)):
    pass

class Model:

    @configurable
    def __init__(self, hidden_size=HParam(int), dropout=HParam(float)):
        pass

add_config({'main': {
    'train': HParams(batch_size=32),
    'Model.__init__': HParams(hidden_size=1024, dropout=0.25),
}})
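
Once configured, calling these functions without arguments uses the configured values (a brief sketch continuing the example above):

# main.py (continued)
train()          # Runs with the configured `batch_size=32`.
model = Model()  # Built with `hidden_size=1024` and `dropout=0.25`.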

HParams supports optional configuration typechecking to help you find bugs! 🐛

Set Defaults

Configure PyTorch defaults to match TensorFlow's, like so:

from torch.nn import BatchNorm1d
from hparams import configurable, add_config, HParams

# NOTE: `momentum=0.01` to match TensorFlow defaults
BatchNorm1d.__init__ = configurable(BatchNorm1d.__init__)
add_config({ 'torch.nn.BatchNorm1d.__init__': HParams(momentum=0.01) })
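
After this, new layers should pick up the configured default (a brief sketch continuing the example above; `momentum` is not passed explicitly):

layer = BatchNorm1d(128)  # `momentum` now defaults to the configured 0.01.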

Configure your random seed globally, like so:

# config.py
import random
from hparams import configurable, add_config, HParams

random.seed = configurable(random.seed)
add_config({'random.seed': HParams(a=123)})
# main.py
import config
import random

random.seed()

CLI

Experiment with hyperparameters through your command line, for example:

foo@bar:~$ file.py --torch.optim.adam.Adam.__init__ 'HParams(lr=0.1,betas=(0.999,0.99))'

import sys
from torch.optim import Adam
from hparams import configurable, add_config, parse_hparam_args

Adam.__init__ = configurable(Adam.__init__)
parsed = parse_hparam_args(sys.argv[1:])  # Parse command line arguments
add_config(parsed)
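
For the invocation above, the parsed arguments are roughly equivalent to the following direct call (a sketch of the expected shape, not verbatim library output):

add_config({'torch.optim.adam.Adam.__init__': HParams(lr=0.1, betas=(0.999, 0.99))})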

Hyperparameter optimization

Hyperparameter optimization is easy to do. Check this out:

import itertools
from torch.optim import Adam
from hparams import configurable, add_config, HParams

Adam.__init__ = configurable(Adam.__init__)

def train():  # Train the model and return the loss.
    pass

for betas in itertools.product([0.999, 0.99, 0.9], [0.999, 0.99, 0.9]):
    add_config({Adam.__init__: HParams(betas=betas)})  # Grid search over the `betas`
    train()

Track Hyperparameters

Easily track your hyperparameters using tools like Comet.

from comet_ml import Experiment
from hparams import get_config

experiment = Experiment()
experiment.log_parameters(get_config())

Multiprocessing: Partial Support

Export a Python functools.partial to use in another process, like so:

from hparams import configurable, HParam

@configurable
def func(hparam=HParam()):
    pass

partial = func.get_configured_partial()

With this approach, you don't have to transfer the global state to the new process. To transfer the global state, you'll want to use get_config and add_config.
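
Here is a minimal sketch of that second approach, assuming the configuration returned by get_config can be sent to the child process:

import multiprocessing
from hparams import get_config, add_config

def worker(config):
    add_config(config)  # Re-apply the configuration in the new process.
    # ... call configured functions here ...

if __name__ == '__main__':
    process = multiprocessing.Process(target=worker, args=(get_config(),))
    process.start()
    process.join()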

Docs 📖

The complete documentation for HParams is available here.

Contributing

We've released HParams because of a lack of hyperparameter management solutions. We hope that others can benefit from the project. We are thankful for any contributions from the community.

Contributing Guide

Read our contributing guide to learn about our development process, how to propose bugfixes and improvements, and how to build and test your changes to HParams.

Authors

  • Michael Petrochuk

Citing

If you find HParams useful for an academic publication, then please use the following BibTeX to cite it:

@misc{hparams,
  author = {Petrochuk, Michael},
  title = {HParams: Hyperparameter management solution},
  year = {2019},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/PetrochukM/HParams}},
}

