Meft

Source code for the paper "Memory-Efficient Fine-Tuning via Low-Rank Activation Compression"


Memory-Efficient Fine-Tuning via Low-Rank Activation Compression

Usage

Simply replace the original Trainer with MeftTrainer and pass a MeftConfig:

from meft import MeftConfig, MeftTrainer

trainer = MeftTrainer(
    ...,
    meft_config=MeftConfig(...),
)

For trainer variants (e.g. SFTTrainer), use the subscript syntax:

from datasets import load_dataset
from meft import MeftConfig, MeftTrainer
from trl import SFTTrainer

dataset = load_dataset("Salesforce/wikitext", "wikitext-2-v1", split="train[:1%]")

trainer = MeftTrainer[SFTTrainer](
    model="Qwen/Qwen3-0.6B-Base",
    train_dataset=dataset,
    meft_config=MeftConfig(
        patch_locations="layer",
        compress_kwargs={"rank": 128},
    ),
)
trainer.train()

Please refer to config.py for the full list of configuration options.
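To build intuition for what the rank parameter in compress_kwargs controls, here is a minimal NumPy sketch of the general idea behind low-rank activation compression: a saved activation matrix is replaced by two thin rank-r factors, cutting the floats stored between the forward and backward pass. This is an illustrative sketch only (the function names and the use of truncated SVD here are assumptions for exposition, not Meft's actual implementation):

```python
import numpy as np

def compress_activation(a, rank):
    # Truncated SVD: keep the top-`rank` singular directions and store
    # two thin factors (n x r) and (r x d) instead of the full (n x d) matrix.
    u, s, vt = np.linalg.svd(a, full_matrices=False)
    return u[:, :rank] * s[:rank], vt[:rank]

def decompress_activation(us, vt):
    # Reconstruct the (approximate) activation for the backward pass.
    return us @ vt

rng = np.random.default_rng(0)
# A synthetic activation with low intrinsic rank (product of two thin matrices).
a = rng.standard_normal((256, 16)) @ rng.standard_normal((16, 512))

us, vt = compress_activation(a, rank=16)
full_floats = a.size                   # 256 * 512 = 131072 floats stored naively
compressed_floats = us.size + vt.size  # 256*16 + 16*512 = 12288 floats stored
err = np.linalg.norm(a - decompress_activation(us, vt)) / np.linalg.norm(a)
```

With rank=16 the factors hold roughly a tenth of the floats of the full activation, and because this synthetic activation is exactly rank 16, the reconstruction error is negligible. In practice real activations are only approximately low-rank, so the rank setting trades memory savings against reconstruction fidelity.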
