Optimizers
For optimization algorithm research and development.
Copyright (c) Meta Platforms, Inc. and affiliates. All rights reserved.
Description
Optimizers is a GitHub repository of PyTorch optimization algorithms, designed for external collaboration and development.
It currently includes the following optimizers:
- Distributed Shampoo
See the CONTRIBUTING file for how to help out.
License
Optimizers is released under the BSD license.
Installation and Dependencies
Install distributed_shampoo with all dependencies:
git clone git@github.com:facebookresearch/optimizers.git
cd optimizers
pip install .
If you also want to try the examples, replace the last line with pip install ".[examples]".
Usage
After installation, basic usage looks like:
import torch
from distributed_shampoo import AdamPreconditionerConfig, DistributedShampoo

model = ...  # Instantiate model

optim = DistributedShampoo(
    model.parameters(),
    lr=1e-3,
    betas=(0.9, 0.999),
    epsilon=1e-8,
    grafting_config=AdamPreconditionerConfig(
        beta2=0.999,
        epsilon=1e-8,
    ),
)
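Since DistributedShampoo is constructed like a standard torch.optim.Optimizer (as the snippet above suggests), it can be assumed to drop into the usual PyTorch training loop. The sketch below illustrates that loop with a toy model and synthetic data; it substitutes torch.optim.SGD only so it runs without the distributed_shampoo package installed, and the DistributedShampoo instance constructed above would be used in its place.

```python
import torch

# Toy model and synthetic data, for illustration only.
model = torch.nn.Linear(4, 1)
loss_fn = torch.nn.MSELoss()
x = torch.randn(8, 4)
y = torch.randn(8, 1)

# Any torch.optim.Optimizer fits here; substitute the DistributedShampoo
# instance constructed above. SGD is a stand-in so this sketch runs
# without the distributed_shampoo package installed.
optim = torch.optim.SGD(model.parameters(), lr=0.01)

initial_loss = loss_fn(model(x), y).item()

for _ in range(50):
    optim.zero_grad()            # Clear gradients from the previous step.
    loss = loss_fn(model(x), y)  # Forward pass.
    loss.backward()              # Compute gradients.
    optim.step()                 # Apply the optimizer update.

final_loss = loss.item()
```

With a small learning rate on this convex least-squares problem, the loss decreases over the 50 steps; only the optimizer construction line changes when swapping in DistributedShampoo.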
For more details, please see the additional documentation in the repository, especially the How to Use section.