
Optimizers

Copyright (c) Meta Platforms, Inc. and affiliates. All rights reserved.

Description

Optimizers is a GitHub repository of PyTorch optimization algorithms. It is designed for external collaboration and development.

It currently includes the following optimizers:

  • Distributed Shampoo

See the CONTRIBUTING file for how to help out.

License

Optimizers is BSD licensed, as found in the LICENSE file.

Installation and Dependencies

This code requires Python >= 3.10 and (as of 18 July 2024) the PyTorch nightly build. Please follow the instructions here to install PyTorch; installing PyTorch with CUDA and NCCL support is required.

Install distributed_shampoo:

git clone git@github.com:facebookresearch/optimizers.git
cd optimizers
pip install -e .

Usage

After installation, basic usage looks like:

import torch
from distributed_shampoo.distributed_shampoo import DistributedShampoo
from distributed_shampoo.shampoo_types import AdamGraftingConfig

model = ...  # Instantiate model

optim = DistributedShampoo(
    model.parameters(),
    lr=1e-3,
    betas=(0.9, 0.999),
    epsilon=1e-8,
    grafting_config=AdamGraftingConfig(
        beta2=0.999,
        epsilon=1e-8,
    ),
)
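DistributedShampoo is constructed like any other torch.optim.Optimizer, so once instantiated it is driven by the usual zero_grad/backward/step loop. A minimal sketch of that loop follows; the toy model, synthetic data, and the torch.optim.SGD stand-in (used so the snippet runs without this package installed) are illustrative assumptions, not part of this repository:

```python
import torch

# Standard training loop; DistributedShampoo implements the usual
# torch.optim.Optimizer interface (zero_grad/step), so it drops into the
# same loop. torch.optim.SGD is used here only as a runnable stand-in,
# and the model/data are hypothetical.
model = torch.nn.Linear(4, 1)
optim = torch.optim.SGD(model.parameters(), lr=1e-2)

for _ in range(10):
    x = torch.randn(8, 4)  # synthetic batch of inputs
    y = torch.zeros(8, 1)  # synthetic targets
    loss = torch.nn.functional.mse_loss(model(x), y)
    optim.zero_grad()      # clear gradients from the previous step
    loss.backward()        # populate parameter gradients
    optim.step()           # apply the optimizer update
```

Swapping `optim` for the `DistributedShampoo` instance above leaves the loop unchanged.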

For more details, please see the additional documentation here, especially the How to Use section.
