GCOPE
Official implementation of 'All in One and One for All: A Simple yet Effective Method towards Cross-domain Graph Pretraining' published in KDD'2024
Install / Use
This is the official implementation of the following paper:
All in One and One for All: A Simple yet Effective Method towards Cross-domain Graph Pretraining [Paper], KDD'24
Haihong Zhao*, Aochuan Chen*, Xiangguo Sun*†, Hong Cheng, Jia Li†
Setup Environment
1. cd ./environment
2. conda env create -f environment.yml
Run Script
pretrain.json is the configuration file for the pretraining phase; it specifies options such as the selected pretraining method, the number of pretraining epochs, etc.
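The exact schema of pretrain.json depends on the repository; the field names below are a hypothetical sketch of the kind of options described above (pretraining method, epochs), not the actual file:

```json
{
  "pretext": "graphcl",
  "epochs": 100,
  "learning_rate": 0.001
}
```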
iterate_supplementary_on_pretrain_with_transferring.sh is the entry script for executing the GCOPE framework. It exposes several options, including:
- the backbone model (e.g., fagcn, gcn, gat, bwgnn);
- the target downstream transferring dataset (e.g., photo);
- few_shot, i.e., the number of training samples;
- backbone_tuning (1 to finetune the GNN backbone, 0 to freeze it);
- learning_rates.
Concretely, once the environment is installed successfully, you can run the following commands directly to evaluate our proposed GCOPE:
chmod +x iterate_supplementary_on_pretrain_with_transferring.sh
./iterate_supplementary_on_pretrain_with_transferring.sh
To change the backbone or other settings, modify pretrain.json or the parameters in iterate_supplementary_on_pretrain_with_transferring.sh.
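The variable names below are a hypothetical sketch of how such parameters typically appear near the top of a shell driver script like this one; check the actual script for the real names before editing:

```shell
#!/bin/sh
# Hypothetical parameter block: illustrates the options described above,
# not the script's actual variable names.
backbone=fagcn          # GNN backbone: fagcn, gcn, gat, or bwgnn
target_dataset=photo    # downstream transferring dataset
few_shot=1              # number of training samples
backbone_tuning=1       # 1 = finetune the GNN backbone, 0 = freeze it
learning_rate=1e-3

echo "GCOPE run: backbone=${backbone}, dataset=${target_dataset}, shots=${few_shot}"
```

Editing these assignments (rather than passing flags) is the usual way such iterate-style scripts sweep over configurations.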