# PyGDA
PyGDA is a Python library for Graph Domain Adaptation built upon PyTorch and PyG to easily train graph domain adaptation models in a sklearn style. PyGDA includes 20+ graph domain adaptation models. See examples with PyGDA below!
## Graph Domain Adaptation Using PyGDA with 5 Lines of Code
```python
from pygda.models import A2GNN
# choose a graph domain adaptation model
model = A2GNN(in_dim=num_features, hid_dim=args.nhid, num_classes=num_classes, device=args.device)
# train the model
model.fit(source_data, target_data)
# evaluate the performance
logits, labels = model.predict(target_data)
```
Key features of PyGDA:
- Consistent APIs and comprehensive documentation.
- Coverage of 20+ graph domain adaptation models.
- Scalable architecture that efficiently handles large graph datasets through mini-batching and sampling techniques.
- Seamlessly integrated data processing with PyG, ensuring full compatibility with PyG data structures.
## :loudspeaker: What's New?
[08/2025]. We have added support for very recent graph domain adaptation models.
- 2 recent models, `TDSS` and `DGSDA`, are supported.
[03/2025]. We now support the multi-source-free setting of graph domain adaptation.
- To perform a multi-source-free domain adaptation task, simply modify one parameter in the model as follows:
```python
from pygda.models import GraphATA

model = GraphATA(in_dim=num_features, hid_dim=args.nhid, num_classes=num_classes, num_src_domains=n, device=args.device)
model.fit([source_data, source_data2, ...], target_data)
```
[12/2024]. We now support the source-free setting of graph domain adaptation.
- 3 recent models, `GTrans`, `SOGA`, and `GraphCTA`, are supported.
[08/2024]. We now support the graph-level domain adaptation task.
- 7 models are supported: `A2GNN`, `AdaGCN`, `CWGCN`, `DANE`, `GRADE`, `SAGDA`, and `UDAGCN`.
- Various TUDatasets are supported, including `FRANKENSTEIN`, `Mutagenicity`, and `PROTEINS`.
- To perform a graph-level domain adaptation task, only one parameter is added to the model as follows:

```python
from pygda.models import A2GNN

model = A2GNN(in_dim=num_features, hid_dim=args.nhid, num_classes=num_classes, mode='graph', device=args.device)
```
## Installation
**Note:** PyGDA depends on PyTorch, PyG, PyTorch Sparse, and PyTorch Scatter. PyGDA does not install these libraries automatically; please install them separately before running PyGDA.
Required Dependencies:
- torch>=1.13.1
- torch_geometric>=2.4.0
- torch_sparse>=0.6.15
- torch_scatter>=2.1.0
- python3
- scipy
- scikit-learn
- numpy
- cvxpy
- tqdm
Installing with pip:

```bash
pip install pygda
```

or install from source for local development:

```bash
git clone https://github.com/pygda-team/pygda
cd pygda
pip install -e .
```
## Quick Start
### Step 1: Load Data

```python
from pygda.datasets import CitationDataset

source_dataset = CitationDataset(path, args.source)
target_dataset = CitationDataset(path, args.target)
```
### Step 2: Build Model

```python
from pygda.models import A2GNN

model = A2GNN(in_dim=num_features, hid_dim=args.nhid, num_classes=num_classes, device=args.device)
```
### Step 3: Fit Model

```python
model.fit(source_data, target_data)
```
### Step 4: Evaluation

```python
from pygda.metrics import eval_micro_f1, eval_macro_f1

logits, labels = model.predict(target_data)
preds = logits.argmax(dim=1)
mi_f1 = eval_micro_f1(labels, preds)
ma_f1 = eval_macro_f1(labels, preds)
```
## Create your own GDA model
In addition to the easy application of existing GDA models, PyGDA makes it simple to implement custom models.
- The custom model should inherit the `BaseGDA` class.
- Implement your own `fit()`, `forward_model()`, and `predict()` functions.
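The required interface can be sketched as follows. This is an illustration only: `BaseGDA` here is a minimal self-contained stand-in for `pygda.models.BaseGDA` (whose real constructor signature may differ), and `MyGDAModel` is a hypothetical custom model showing where the three required methods go, with trivial placeholder logic instead of real training code.

```python
class BaseGDA:
    """Stand-in for pygda.models.BaseGDA (illustration only).

    In real code you would instead write:
        from pygda.models import BaseGDA
    """

    def __init__(self, in_dim, hid_dim, num_classes, device="cpu"):
        self.in_dim = in_dim
        self.hid_dim = hid_dim
        self.num_classes = num_classes
        self.device = device


class MyGDAModel(BaseGDA):
    """Hypothetical custom GDA model implementing the three required methods."""

    def fit(self, source_data, target_data):
        # Train on the labeled source graph plus the unlabeled target graph.
        # A real implementation would run its optimization loop here.
        self.fitted = True
        return self

    def forward_model(self, data):
        # One forward pass; a real model would return per-node logits.
        # Here: one all-zero logit row per item, as a placeholder.
        return [[0.0] * self.num_classes for _ in data]

    def predict(self, target_data):
        # Return (logits, labels), mirroring the built-in models' predict().
        logits = self.forward_model(target_data)
        labels = [0 for _ in target_data]  # placeholder ground-truth labels
        return logits, labels
```

Once the three methods are implemented, the custom model can be used with the same `fit`/`predict` workflow as the built-in models.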
## Reference
| ID | Paper | Method | Venue |
|--------|---------|:----------:|:--------------:|
| 01 | Semi-Supervised Classification with Graph Convolutional Networks | Vanilla GCN | ICLR 2017 |
| 02 | DANE: Domain Adaptive Network Embedding | DANE | IJCAI 2019 |
| 03 | Adversarial Deep Network Embedding for Cross-network Node Classification | ACDNE | AAAI 2020 |
| 04 | Unsupervised Domain Adaptive Graph Convolutional Networks | UDAGCN | WWW 2020 |
| 05 | Adversarial Separation Network for Cross-Network Node Classification | ASN | CIKM 2021 |
| 06 | Graph Transfer Learning via Adversarial Domain Adaptation with Graph Convolution | AdaGCN | TKDE 2022 |
| 07 | Non-IID Transfer Learning on Graphs | GRADE | AAAI 2023 |
| 08 | Graph Domain Adaptation via Theory-Grounded Spectral Regularization | SpecReg | ICLR 2023 |
| 09 | Structural Re-weighting Improves Graph Domain Adaptation | StruRW | ICML 2023 |
| 10 | Improving Graph Domain Adaptation with Network Hierarchy | JHGDA | CIKM 2023 |
| 11 | Bridged-GNN: Knowledge Bridge Learning for Effective Knowledge Transfer | KBL | CIKM 2023 |
| 12 | Domain-adaptive Message Passing Graph Neural Network | DMGNN | NN 2023 |
| 13 | Correntropy-Induced Wasserstein GCN: Learning Graph Embedding via Domain Adaptation | CWGCN | TIP 2023 |
| 14 | SA-GDA: Spectral Augmentation for Graph Domain Adaptation | SAGDA | MM 2023 |
| 15 | Empowering Graph Representation Learning with Test-Time Graph Transformation | GTrans | ICLR 2023 |
| 16 | Graph Domain Adaptation: A Generative View | DGDA | TKDD 2024 |
| 17 | Rethinking Propagation for Unsupervised Graph Domain Adaptation | A2GNN | AAAI 2024 |
| 18 | Pairwise Alignment Improves Graph Domain Adaptation | PairAlign | ICML 2024 |
| 19 | Structure Enhanced Prototypical Alignment for Unsupervised Cross-Domain Node Classification | SEPA | NN 2024 |
| 20 | Source Free Unsupervised Graph Domain Adaptation | SOGA | WSDM 2024 |
| 21 | Collaborate to Adapt: Source-Free Graph Domain Adaptation via Bi-directional Adaptation | GraphCTA | WWW 2024 |
| 22 | Smoothness Really Matters: A Simple Yet Effective Approach for Unsupervised Graph Domain Adaptation | TDSS | AAAI 2025 |
| 23 | Aggregate to Adapt: Node-Centric Aggregation for Multi-Source-Free Graph Domain Adaptation | GraphATA | WWW 2025 |
| 24 | Disentangled Graph Spectral Domain Adaptation | DGSDA | ICML 2025 |
## Cite
If you compare with, build on, or use aspects of PyGDA, please consider citing "Revisiting, Benchmarking and Understanding Unsupervised Graph Domain Adaptation":
```bibtex
@inproceedings{liu2024revisiting,
  title={Revisiting, Benchmarking and Understanding Unsupervised Graph Domain Adaptation},
  author={Meihan Liu and Zhen Zhang and Jiachen Tang and Jiajun Bu and Bingsheng He and Sheng Zhou},
  booktitle={The Thirty-eight Conference on Neural Information Processing Systems Datasets and Benchmarks Track},
  year={2024},
  url={https://openreview.net/forum?id=ZsyFwzuDzD}
}
```