

OpenTLP

This repository is the open-source project of the survey paper "Temporal Link Prediction: A Unified Framework, Taxonomy, and Review", which has been accepted by ACM Computing Surveys. It refactors or implements some representative temporal link prediction (TLP, a.k.a. dynamic link prediction) methods (especially those that do not provide official source code) based on the unified encoder-decoder framework and terminology (regarding task settings, taxonomy, etc.) introduced in our survey paper. In addition, this repository summarizes some other open-source projects regarding TLP. We will keep updating this repository to include other (SOTA or classic) TLP methods, task settings, dynamic graph datasets, etc.

Note that this repository is not the official implementation of the related methods. Some of the implemented TLP approaches may also need careful parameter tuning to achieve their best performance on different datasets and task settings.
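To make the unified encoder-decoder view concrete, here is a minimal numpy sketch: an encoder maps a sequence of historical snapshots to node embeddings, and a decoder maps the embeddings to a predicted adjacency matrix. The exponentially weighted aggregation, spectral encoder, and inner-product decoder below are illustrative stand-ins, not any method reviewed in the paper.

```python
import numpy as np

def encode(snapshots, dim=2, decay=0.5):
    """Encoder: collapse a snapshot sequence into node embeddings.

    Here: an exponentially weighted sum of adjacency matrices followed by a
    truncated eigendecomposition. Real TLP encoders (e.g., NMF-, RNN-, or
    GNN-based) replace this step.
    """
    T = len(snapshots)
    agg = sum(decay ** (T - 1 - t) * a for t, a in enumerate(snapshots))
    vals, vecs = np.linalg.eigh(agg)
    top = np.argsort(vals)[::-1][:dim]  # largest eigenvalues first
    return vecs[:, top] * np.sqrt(np.clip(vals[top], 0.0, None))

def decode(z):
    """Decoder: map node embeddings back to a predicted adjacency matrix."""
    return z @ z.T

# Toy ESSD input: three 4-node snapshots sharing a persistent edge (0, 1).
snaps = [np.zeros((4, 4)) for _ in range(3)]
for a in snaps:
    a[0, 1] = a[1, 0] = 1.0
pred = decode(encode(snaps))  # persistent pair (0, 1) scores highest
```

Different methods then differ only in which encoder, decoder, and loss fill these slots.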

Abstract

Dynamic graphs serve as a generic abstraction and description of the evolutionary behaviors of various complex systems (e.g., social networks and communication networks). Temporal link prediction (TLP) is a classic yet challenging inference task on dynamic graphs, which predicts possible future linkage based on historical topology. The predicted future topology can be used to support some advanced applications on real-world systems (e.g., resource pre-allocation) for better system performance. This survey provides a comprehensive review of existing TLP methods. Concretely, we first give the formal problem statements and preliminaries regarding data models, task settings, and learning paradigms that are commonly used in related research. A hierarchical fine-grained taxonomy is further introduced to categorize existing methods in terms of their data models, learning paradigms, and techniques. From a generic perspective, we propose a unified encoder-decoder framework to formulate all the methods reviewed, where different approaches only differ in terms of some components of the framework. Moreover, we envision serving the community with an open-source project OpenTLP that refactors or implements some representative TLP methods using the proposed unified framework and summarizes other public resources. Finally, we discuss advanced topics in recent research and highlight possible future directions.

Citing

If you find this project useful for your research, please cite our survey paper.

@article{qin2023temporal,
  title={Temporal link prediction: A unified framework, taxonomy, and review},
  author={Qin, Meng and Yeung, Dit-Yan},
  journal={ACM Computing Surveys},
  volume={56},
  number={4},
  pages={1--40},
  year={2023},
  publisher={ACM New York, NY, USA}
}

If you have any questions regarding this repository, you can contact the author via mengqin_az@foxmail.com.

Notations

  • ESSD - Evenly-Sampled Snapshot Sequence Description, a.k.a. Discrete-Time Dynamic Graph (DTDG) in other literature.

  • UESD - Unevenly-Sampled Edge Sequence Description, a.k.a. Continuous-Time Dynamic Graph (CTDG) in other literature.

(We argue that the term CTDG does not precisely describe the corresponding data model. Although the time index associated with each edge is defined on a continuous domain in the second data model, the edge sequence used to encode the dynamic topology is still discrete, so the term "continuous" may be ambiguous in some cases. Therefore, we use ESSD & UESD instead of DTDG & CTDG.)

  • DI - Direct Inference.

  • OTI - Online Training & Inference.

  • OTOG - Offline Training & Online Generalization.

  • HQ - Able to Derive High-Quality Prediction Results for Weighted TLP.

  • D/L-Dep - The original version of a method cannot support weighted TLP, but it can be easily extended to such a setting by replacing the decoder or loss with an appropriate one.
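The ESSD/UESD distinction can be illustrated with a toy example (the names and the conversion helper below are illustrative, not from this repository): a UESD is an edge sequence with continuous timestamps, while an ESSD coarsens it into fixed-interval snapshots.

```python
# UESD: an unevenly-sampled edge sequence of (source, target, timestamp)
# triples, with timestamps on a continuous domain.
uesd = [
    (0, 1, 0.3),
    (1, 2, 0.9),
    (0, 1, 1.4),
    (2, 3, 2.7),
]

def to_essd(edge_seq, num_nodes, window):
    """Coarsen a UESD edge sequence into an ESSD snapshot sequence by
    bucketing edges into fixed-length time windows."""
    horizon = max(t for _, _, t in edge_seq)
    num_snaps = int(horizon // window) + 1
    snaps = [[[0] * num_nodes for _ in range(num_nodes)]
             for _ in range(num_snaps)]
    for u, v, t in edge_seq:
        k = int(t // window)          # snapshot index for this edge
        snaps[k][u][v] = snaps[k][v][u] = 1  # undirected, unweighted
    return snaps

essd = to_essd(uesd, num_nodes=4, window=1.0)
# essd[0] holds the edges with t in [0, 1): here (0, 1) and (1, 2).
```

The conversion is lossy (exact timestamps are discarded), which is why methods designed for one data model do not automatically transfer to the other.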

Implemented or Refactored TLP Methods

We implement some representative TLP methods using MATLAB and PyTorch, with the corresponding code and data under the directories './Matlab' and './Python', respectively.

For each method implemented in MATLAB, [method_name].m provides the functions that define the encoder and decoder, where the loss function is derived during the optimization of the encoder. [method_name]_demo.m demonstrates how to use these functions. To run the demonstration code, please first unzip data.zip under './Matlab/data'.

For each method implemented in PyTorch, a set of classes defining the encoder, decoder, and loss function is put under the directory './Python/[method_name]'. Furthermore, [method_name].py demonstrates how to use these classes. To run the demonstration code, please first unzip data.zip under './Python/data'. In particular, these PyTorch implementations can be sped up via GPUs.
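The encoder/decoder/loss decomposition can be sketched as follows (a numpy stand-in with hypothetical class names; the actual classes under './Python/[method_name]' differ per method and are built on PyTorch modules):

```python
import numpy as np

class Encoder:
    """Maps a snapshot sequence to node embeddings. A fixed random linear
    projection of the time-averaged adjacency stands in for the learned
    RNN/GNN encoders used by the actual methods."""
    def __init__(self, num_nodes, dim, seed=0):
        rng = np.random.default_rng(seed)
        self.weight = 0.1 * rng.standard_normal((num_nodes, dim))

    def forward(self, snapshots):
        return snapshots.mean(axis=0) @ self.weight

class Decoder:
    """Maps node embeddings to a predicted adjacency matrix."""
    def forward(self, z):
        return z @ z.T

def mse_loss(pred, target):
    """Reconstruction loss against the next snapshot to be predicted."""
    return float(np.mean((pred - target) ** 2))

# Wiring, in the spirit of a [method_name].py demo script:
snaps = np.zeros((3, 4, 4))
snaps[:, 0, 1] = snaps[:, 1, 0] = 1.0  # persistent edge (0, 1)
encoder, decoder = Encoder(num_nodes=4, dim=2), Decoder()
prediction = decoder.forward(encoder.forward(snaps))
loss = mse_loss(prediction, snaps[-1])  # would drive gradient updates
```

In the PyTorch versions, the loss would be backpropagated through both components, and moving the modules and tensors to a GPU gives the speedup mentioned above.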

Details of the implemented TLP methods are summarized as follows.

| Methods | Publication Venues | Data Models | Paradigms | Level | Attributes | Weighted TLP | Language |
| ---- | ---- | ---- | ---- | ---- | ---- | ---- | ---- |
| CRJMF [1] | CIKM 2011 | ESSD | OTI | 1 | Static | Able | MATLAB |
| GrNMF [2] | Physica A 2018 | ESSD | OTI | 1 | N/A | Able | MATLAB |
| DeepEye [3] | BDMA 2018 | ESSD | OTI | 1 | N/A | Able | MATLAB |
| AM-NMF [4] | SIGCOMM 2018 NetAI WKSP | ESSD | OTI | 1 | N/A | Able | MATLAB |
| TMF [5] | WSDM 2017 | ESSD | OTI | 1 | N/A | Able | Python |
| LIST [6] | IJCAI 2017 | ESSD | OTI | 1 | N/A | Able | Python |
| Dyngraph2vec [7] | KBS 2020 | ESSD | OTOG | 1 | N/A | Able | Python |
| DDNE [8] | IEEE Access 2018 | ESSD | OTOG | 1 | N/A | D/L-Dep | Python |
| E-LSTM-D [9] | Trans on SMC: Sys 2019 | ESSD | OTOG | 1 | N/A | Able | Python |
| GCN-GAN [10] | InfoCom 2019 | ESSD | OTOG | 1 | N/A | Able (HQ) | Python |
| NetworkGAN [11] | Trans on Cyb 2019 | ESSD | OTOG | 1 | N/A | Able (HQ) | Python |
| STGSN [12] | KBS 2021 | ESSD | OTOG | 2 | Dynamic | Able | Python |

Other Open Source Projects Regarding TLP

| Methods | Publication Venues | Data Models | Paradigms | Level | Attributes | Weighted TLP |
| ---- | ---- | ---- | ---- | ---- | ---- | ---- |
| TLSI [13] (Code) | TKDE 2016 | ESSD | OTI | 1 | N/A | Able |
| MLjFE [14] (Code) | Pattern Recognition 2022 | ESSD | OTI | 1 | N/A | Able |
| E-LSTM-D [9] (Code) | Trans on SMC: Sys 2019 | ESSD | OTOG | 1 | N/A | Able |
| EvolveGCN [15] (Code) | AAAI 2020 | ESSD | OTOG | 2 | Dynamic | D/L-Dep |
| CTDNE [16] (Code) | WWW 2018 Companion | UESD | OTOG | 1 | N/A | Unable |
| M2DNE [17] (Code) | CIKM 2019 | UESD | OTOG | 1 | N/A | Unable |
| DyRep [18] (Code) | ICLR 2019 | UESD | OTOG | 2 | Dynamic | Unable |
| TGAT [19] (Code) | ICLR 2020 | UESD | OTOG | 2 | Dynamic | Unable |
| CAW [20] (Code) | ICLR 2021 | UESD | OTOG | 2 | Dynamic | Unable |
| DySAT [[21]](https://dl.acm.org/doi/pdf/10.1145/3336191.3371845?casa_toke | | | | | | |
