
Optimizers

For optimization algorithm research and development.

Install / Use

/learn @facebookresearch/Optimizers
README

Optimizers


Copyright (c) Meta Platforms, Inc. and affiliates. All rights reserved.

Description

Optimizers is a GitHub repository of PyTorch optimization algorithms. It is designed for external collaboration and development.

It currently includes the following optimizers:

  • Distributed Shampoo

See the CONTRIBUTING file for how to help out.

License

Optimizers is released under the BSD license.

Installation and Dependencies

Install distributed_shampoo with all dependencies:

git clone git@github.com:facebookresearch/optimizers.git
cd optimizers
pip install .

If you also want to try the examples, replace the last line with pip install ".[examples]".

Usage

After installation, basic usage looks like:

import torch
from distributed_shampoo import AdamPreconditionerConfig, DistributedShampoo

model = ...  # Instantiate model

optim = DistributedShampoo(
    model.parameters(),
    lr=1e-3,
    betas=(0.9, 0.999),
    epsilon=1e-8,
    grafting_config=AdamPreconditionerConfig(
        beta2=0.999,
        epsilon=1e-8,
    ),
)

For more detail, please see the repository's additional documentation, especially the How to Use section.
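Once constructed, the optimizer plugs into a standard PyTorch training loop via the usual `zero_grad()`/`backward()`/`step()` cycle. The sketch below uses `torch.optim.SGD` as a stand-in so it runs even without distributed_shampoo installed; `DistributedShampoo` (constructed as shown above) is a drop-in replacement here, since it exposes the same optimizer interface. The model, data, and hyperparameters are purely illustrative.

```python
import torch

# Toy model and synthetic regression data (illustrative only).
model = torch.nn.Linear(4, 1)
loss_fn = torch.nn.MSELoss()
x = torch.randn(8, 4)
y = torch.randn(8, 1)

# Stand-in optimizer; swap in the DistributedShampoo instance from the
# README snippet above -- the loop below is unchanged.
optim = torch.optim.SGD(model.parameters(), lr=1e-3)

for _ in range(3):
    optim.zero_grad()            # clear gradients from the previous step
    loss = loss_fn(model(x), y)  # forward pass
    loss.backward()              # compute gradients
    optim.step()                 # apply the parameter update

final_loss = loss.item()
```

Because the loop only touches the `torch.optim.Optimizer` interface, switching between optimizers is a one-line change at construction time.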
