
<!--- Copyright 2020 The AdapterHub Team. All rights reserved.

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

    http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License. -->

<p align="center">
<img style="vertical-align:middle" src="https://raw.githubusercontent.com/Adapter-Hub/adapters/main/docs/img/adapter-bert.png" width="80" />
</p>

<h1 align="center">
<span><i>Adapters</i></span>
</h1>

<h3 align="center">
A Unified Library for Parameter-Efficient and Modular Transfer Learning
</h3>

<h3 align="center">
<a href="https://adapterhub.ml">Website</a>
&nbsp; • &nbsp;
<a href="https://docs.adapterhub.ml">Documentation</a>
&nbsp; • &nbsp;
<a href="https://arxiv.org/abs/2311.11077">Paper</a>
</h3>


Adapters is an add-on library to HuggingFace's Transformers, integrating 10+ adapter methods into 20+ state-of-the-art Transformer models with minimal coding overhead for training and inference.

Adapters provides a unified interface for efficient fine-tuning and modular transfer learning. It supports a wide range of features, including full-precision or quantized training (e.g. Q-LoRA, Q-Bottleneck Adapters, or Q-PrefixTuning), adapter merging via task arithmetic, and the combination of multiple adapters through composition blocks, enabling advanced research in parameter-efficient transfer learning for NLP tasks.
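For instance, adapter merging builds a new adapter as a weighted average of existing ones. Below is a minimal sketch using the library's `average_adapter()` method; the adapter names, configs, and weights are purely illustrative (in practice you would merge trained adapters, not freshly initialized ones):

```python
from adapters import AutoAdapterModel

model = AutoAdapterModel.from_pretrained("roberta-base")

# Hypothetical adapters standing in for two trained task adapters.
model.add_adapter("adapter_a", config="seq_bn")
model.add_adapter("adapter_b", config="seq_bn")

# Create a new adapter as a weighted average of the two (task-arithmetic-style merging).
model.average_adapter(
    "merged_adapter",
    ["adapter_a", "adapter_b"],
    weights=[0.6, 0.4],
)
model.set_active_adapters("merged_adapter")
```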

Note: The Adapters library has replaced the adapter-transformers package. All previously trained adapters are compatible with the new library. For transitioning, please read: https://docs.adapterhub.ml/transitioning.html.

## Installation

adapters currently supports Python 3.9+ and PyTorch 2.0+. After installing PyTorch, you can install adapters from PyPI ...

```bash
pip install -U adapters
```

... or from source by cloning the repository:

```bash
git clone https://github.com/adapter-hub/adapters.git
cd adapters
pip install .
```

## Quick Tour

Load pre-trained adapters:

```python
from adapters import AutoAdapterModel
from transformers import AutoTokenizer

# Load a RoBERTa model with adapter support and its tokenizer.
model = AutoAdapterModel.from_pretrained("roberta-base")
tokenizer = AutoTokenizer.from_pretrained("roberta-base")

# Download a sentiment adapter trained on IMDb from the Hugging Face Hub
# and activate it for inference.
model.load_adapter("AdapterHub/roberta-base-pf-imdb", source="hf", set_active=True)

print(model(**tokenizer("This works great!", return_tensors="pt")).logits)
```
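As a follow-up, the logits can be turned into a label prediction. This sketch assumes the loaded head populates the model config's `id2label` mapping; if it does not, the raw class id is printed instead:

```python
import torch

logits = model(**tokenizer("This works great!", return_tensors="pt")).logits
pred_id = torch.argmax(logits, dim=-1).item()
# id2label is assumed to be set by the loaded head; fall back to the raw id otherwise.
print(model.config.id2label.get(pred_id, pred_id))
```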

Learn More

Adapt existing model setups:

```python
import adapters
from transformers import AutoModelForSequenceClassification

model = AutoModelForSequenceClassification.from_pretrained("t5-base")

# Retrofit the vanilla Transformers model with adapter support.
adapters.init(model)

# Add a LoRA adapter and freeze all other weights for training.
model.add_adapter("my_lora_adapter", config="lora")
model.train_adapter("my_lora_adapter")

# Your regular training loop...
```
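A minimal sketch of the training loop hinted at above, on a toy two-example batch; the data, labels, and learning rate are illustrative, and a real setup would iterate over a DataLoader:

```python
import torch
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("t5-base")
batch = tokenizer(["great movie", "terrible movie"], padding=True, return_tensors="pt")
labels = torch.tensor([1, 0])

# train_adapter() above froze everything except the adapter weights,
# so only the LoRA parameters still require gradients.
optimizer = torch.optim.AdamW(
    filter(lambda p: p.requires_grad, model.parameters()), lr=1e-4
)

outputs = model(**batch, labels=labels)
outputs.loss.backward()
optimizer.step()
optimizer.zero_grad()
```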

Learn More

Flexibly configure adapters:

```python
from adapters import ConfigUnion, PrefixTuningConfig, ParBnConfig, AutoAdapterModel

model = AutoAdapterModel.from_pretrained("microsoft/deberta-v3-base")

# Combine prefix tuning with a parallel bottleneck adapter in a single module.
adapter_config = ConfigUnion(
    PrefixTuningConfig(prefix_length=20),
    ParBnConfig(reduction_factor=4),
)
model.add_adapter("my_adapter", config=adapter_config, set_active=True)
```
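To verify what was added, `adapter_summary()` returns a table of the adapters in the model along with their parameter counts:

```python
# Prints one row per adapter, including its architecture and #params.
print(model.adapter_summary())
```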

Learn More

Easily compose adapters in a single model:

```python
from adapters import AdapterSetup, AutoAdapterModel
import adapters.composition as ac
from transformers import AutoTokenizer

model = AutoAdapterModel.from_pretrained("roberta-base")
tokenizer = AutoTokenizer.from_pretrained("roberta-base")

# Load a question-classification adapter and a sentiment adapter into the same model.
qc = model.load_adapter("AdapterHub/roberta-base-pf-trec")
sent = model.load_adapter("AdapterHub/roberta-base-pf-imdb")

# Run both adapters side by side on the same input in a single forward pass.
with AdapterSetup(ac.Parallel(qc, sent)):
    print(model(**tokenizer("What is AdapterHub?", return_tensors="pt")))
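```

Other composition blocks work the same way. For example, `Stack` chains adapters sequentially instead of running them in parallel; the pairing below is purely illustrative:

```python
# Stack feeds the output of the first adapter into the second
# (the pattern used in MAD-X-style language -> task transfer).
model.active_adapters = ac.Stack(qc, sent)
```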

Learn More

## Useful Resources

HuggingFace's documentation on getting started with Transformers can be found at https://huggingface.co/docs/transformers. adapters is fully compatible with Transformers.

To get started with adapters, refer to these locations:

- Colab notebook tutorials, a series of notebooks providing an introduction to all the main concepts of (adapter-)transformers and AdapterHub
- https://docs.adapterhub.ml, our documentation on training and using adapters with adapters
- https://adapterhub.ml to explore available pre-trained adapter modules and share your own adapters
- The examples folder of this repository, containing HuggingFace's example training scripts, many adapted for training adapters

## Implemented Methods

Currently, adapters integrates all architectures and methods listed below:

| Method | Paper(s) | Quick Links |
| --- | --- | --- |
| Bottleneck adapters | Houlsby et al. (2019)<br> Bapna and Firat (2019)<br> Steitz and Roth (2024) | Quickstart, Notebook |
| AdapterFusion | Pfeiffer et al. (2021) | Docs: Training, Notebook |
| MAD-X,<br> Invertible adapters | Pfeiffer et al. (2020) | Notebook |
| AdapterDrop | Rücklé et al. (2021) | Notebook |
| MAD-X 2.0,<br> Embedding training | Pfeiffer et al. (2021) | Docs: Embeddings, Notebook |
| Prefix Tuning | Li and Liang (2021) | Docs |
| Parallel adapters,<br> Mix-and-Match adapters | He et al. (2021) | Docs |
| Compacter | Mahabadi et al. (2021) | Docs |
| LoRA | Hu et al. (2021) | Docs |
| MTL-LoRA | Yang et al. (2024) | Docs |
| (IA)^3 | Liu et al. (2022) | Docs |
| VeRA | Kopiczko et al. (2024) | Docs |
| DoRA | Liu et al. (2024) | Docs |
| UniPELT | Mao et al. (2022) | Docs |
| Prompt Tuning | Lester et al. (2021) | Docs |
| QLoRA | Dettmers et al. (2023) | Notebook |
| ReFT | Wu et al. (2024) | Docs |
| Adapter Task Arithmetics | Chronopoulou et al. (2023)<br> Zhang et al. (2023) | Docs, Notebook |
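Most methods in the table are exposed through a config class (or config string) passed to `add_adapter()`. A minimal sketch using LoRA; the rank and scaling factor here are illustrative rather than recommended defaults (see the docs for tuned values):

```python
from adapters import AutoAdapterModel, LoRAConfig

model = AutoAdapterModel.from_pretrained("roberta-base")

# Add a LoRA adapter with rank 8 and activate it for training.
model.add_adapter("my_lora", config=LoRAConfig(r=8, alpha=16))
model.train_adapter("my_lora")
```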

## Supported Models

We currently support the PyTorch versions of all models listed on the Model Overview page in our documentation.

## Developing & Contributing

To get started with developing on Adapters yourself and learn more about ways to contribute, please see https://docs.adapterhub.ml/contributing.html.

## Citation

If you use _Adapters_ in your work, please consider citing our library paper: [Adapters: A Unified Library for Parameter-Efficient and Modular Transfer Learning](https://arxiv.org/abs/2311.11077).
