<!-- * @Author: Conghao Wong * @Date: 2023-08-21 15:58:54 * @LastEditors: Conghao Wong * @LastEditTime: 2024-09-29 09:42:17 * @Description: file content * @Github: https://cocoon2wong.github.io * Copyright 2023 Conghao Wong, All Rights Reserved. -->

SocialCircle

This is the official code (TensorFlow 2 version) for "SocialCircle: Learning the Angle-based Social Interaction Representation for Pedestrian Trajectory Prediction" (CVPR 2024).

We also provide the corresponding PyTorch code (beta version for now) in the TorchVersion(beta) branch. Please note that model weights trained with one backend (TensorFlow 2 or PyTorch) cannot be used with the other version of the code. The weights released on this page were trained under TensorFlow 2; for weights trained with PyTorch, please refer to this page.

2024.06.06 Update: The code repository for our subsequent work "SocialCircle+: Learning the Angle-based Conditioned Interaction Representation for Pedestrian Trajectory Prediction" is now available at SocialCirclePlus. If you are interested in our work, feel free to try it! It is used in exactly the same way as the SocialCircle repository and is compatible with the PyTorch version of the SocialCircle model weights.

Authors' Note

This work is the first part of our Echolocation Trilogy. It focuses on how to describe and locate echoes from agents spatially. The third work in this series, Reverberation, is now available on arXiv.

Here are all the repositories involved in our trilogy:

  • Part I, Where do the echoes come from?: SocialCircle (CVPR 2024) and SocialCirclePlus (Journal, Under Review);
  • Part II, How do echoes interact with each other?: Resonance (ICCV 2025);
  • Part III, How long do the echoes last?: Reverberation (Journal, Under Review).

Note that these repositories share the same training engine, and their weight files are compatible with each other: in release order, each later repository can load the weights of the earlier releases, so the Reverberation repository is recommended since it is compatible with all previous models. You can copy only the core model folders (e.g., SocialCircle, Re, Rev) into a repository's root path (i.e., the folder where qpid is located), then train and test the models via main.py.
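As an illustration of this copy-and-reuse workflow, the folder copy could be scripted with Python's shutil (a sketch only; the helper name and the assumption of two local checkouts side by side are ours, and folder names come from the list above):

```python
import shutil
from pathlib import Path

def copy_model_folder(src_repo: str, dst_repo: str, model_folder: str) -> Path:
    """Copy a core model folder (e.g. 'SocialCircle') from one repo checkout
    into another repo's root path (the folder where `qpid` is located)."""
    src = Path(src_repo) / model_folder
    dst = Path(dst_repo) / model_folder
    if not src.is_dir():
        raise FileNotFoundError(f"Model folder not found: {src}")
    # dirs_exist_ok allows re-copying over a previous copy (Python 3.8+).
    shutil.copytree(src, dst, dirs_exist_ok=True)
    return dst
```

After copying, the models in the destination repo are trained and tested via its main.py as usual.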

Get Started

You can clone this repository by the following command:

git clone https://github.com/cocoon2wong/SocialCircle.git

Then, run the following command to initialize all submodules:

git submodule update --init --recursive

Requirements

The codes are developed with Python 3.9. Additional packages used are included in the requirements.txt file.

{: .box-warning} Warning: We recommend installing all required Python packages in a virtual environment (such as a conda environment); otherwise, package version conflicts could cause problems.

Run the following command to install the required packages in your Python environment:

pip install -r requirements.txt

Dataset Prepare and Process

ETH-UCY, SDD, NBA, nuScenes

{: .box-warning} Warning: If you want to validate SocialCircle models on these datasets, make sure you have obtained this repository via git clone and that all git submodules have been properly initialized via git submodule update --init --recursive.

You can run the following commands to prepare dataset files that have been validated in our paper:

  1. Run the Python scripts inside the dataset_original folder:

    cd dataset_original
    
    • For ETH-UCY and SDD, run

      python main_ethucysdd.py
      
    • For NBA or nuScenes, download their original dataset files, put them into the paths listed in dataset_original/main_nba.py or dataset_original/main_nuscenes.py, then run

      python main_nba.py
      python main_nuscenes.py
      

      (You can also download the processed dataset files manually from here, and put them into dataset_processed and dataset_configs folders.)

  2. Back to the repo folder and create soft links:

    cd ..
    ln -s dataset_original/dataset_processed ./
    ln -s dataset_original/dataset_configs ./
    

Dataset Corrections: The univ13 split (ETH-UCY) takes univ and univ3 as test sets, and the other sets {eth, hotel, unive, zara1, zara2, zara3} as training sets. In contrast, the univ split only uses univ for testing models. The results reported in this conference paper are tested under the univ split. Following most current approaches, we have also tested our models and report results under the new univ13 split in the corresponding journal-expanded paper SocialCircle+ (code repo here, paper available on arXiv). Correspondingly, some SocialCircle results have been corrected; please check them in the SocialCirclePlus repo with the newly trained weights in the weights repo (postfixed with univ13).
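The composition of the two splits described above can be summarized in plain Python (an illustration only; scene names are taken from the paragraph above, and this is not the actual .plist config format):

```python
# Test/train composition of the two ETH-UCY "univ" splits.
SPLITS = {
    # The conference-paper split: only univ is used for testing.
    "univ": {
        "test": {"univ"},
    },
    # The journal-paper split: univ and univ3 are both held out.
    "univ13": {
        "test": {"univ", "univ3"},
        "train": {"eth", "hotel", "unive", "zara1", "zara2", "zara3"},
    },
}
```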

Click the following button to learn more about how to process these dataset files.

<div style="text-align: center;"> <a class="btn btn-colorful btn-lg" href="https://cocoon2wong.github.io/Project-Luna/howToUse/">💡 Dataset Guidelines</a> </div>

Prepare Your New Datasets

Before training SocialCircle models on your own dataset, you should add your dataset information. See this document for details.

Pre-Trained Model Weights and Evaluation

We have provided our pre-trained model weights to help you quickly evaluate the SocialCircle models' performance.

Click the following buttons to download our model weights. We recommend that you download the weights and place them in the weights/SocialCircle folder.

<div style="text-align: center;"> <a class="btn btn-colorful btn-lg" href="https://github.com/cocoon2wong/SocialCircle/releases">⬇️ Download Weights (TensorFlow 2)</a> <a class="btn btn-colorful btn-lg" href="https://github.com/cocoon2wong/Project-Monandaeg/tree/SocialCircle">⬇️ Download Weights (PyTorch)</a> </div>

{: .box-warning} Warning: The TensorFlow 2 version of the code only supports weights trained with TensorFlow 2, and the PyTorch version only supports weights trained with PyTorch. Please download the correct weights file, or the program will not run correctly.

You can start evaluating models by

python main.py --sc SOME_MODEL_WEIGHTS

Here, SOME_MODEL_WEIGHTS is the path of the weights folder, for example, weights/SocialCircle/evsc_P8_sdd.
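To evaluate several downloaded weights folders in one go, the command above can be generated and run with Python's subprocess module (a hypothetical helper we sketch here, not part of the repo; only the `--sc` flag and the command shape come from the text above):

```python
import subprocess
from typing import List

def build_eval_command(weights_dir: str) -> List[str]:
    """Build the evaluation command for one weights folder,
    e.g. 'weights/SocialCircle/evsc_P8_sdd'."""
    return ["python", "main.py", "--sc", weights_dir]

def evaluate_all(weight_dirs: List[str]) -> None:
    """Run the evaluation command once per weights folder,
    stopping on the first failure."""
    for d in weight_dirs:
        subprocess.run(build_eval_command(d), check=True)
```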

Training

You can start training a SocialCircle model via the following command:

python main.py --model MODEL_IDENTIFIER --split DATASET_SPLIT

Here, MODEL_IDENTIFIER is the identifier of the model. The following identifiers are supported in the current code:

  • The basic transformer model for trajectory prediction:
    • trans (named the Transformer in the paper);
    • transsc (SocialCircle variation Transformer-SC).
  • MSN (🔗homepage):
    • msna (original model);
    • msnsc (SocialCircle variation).
  • V^2-Net (🔗homepage):
    • va (original model);
    • vsc (SocialCircle variation).
  • E-V^2-Net (🔗homepage):
    • eva (original model);
    • evsc (SocialCircle variation).

DATASET_SPLIT is the identifier of the dataset or split used for training, i.e., the name of the split file in dataset_configs; for example, eth is the identifier of the split defined in dataset_configs/ETH-UCY/eth.plist. It accepts:

  • ETH-UCY: {eth, hotel, univ, zara1, zara2};
  • SDD: sdd;
  • NBA: nba50k;
  • nuScenes: {nuScenes_v1.0, nuScenes_ov_v1.0};

For example, you can start training the E-V^2-Net-SC model by

python main.py --model evsc --split zara1

You can also specify other needed args, like the learning rate --lr, batch size --batch_size, etc. See detailed args in the Args Used Section.
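To sweep several model variants and splits, the training command above can be generated programmatically. A sketch (the identifiers are those listed above; whether every model/split pairing is worth running is up to you, and this only builds the commands rather than executing them):

```python
from itertools import product
from typing import Iterable, List

# Identifiers from the lists above (SocialCircle variants end in 'sc').
MODELS = ["trans", "transsc", "msna", "msnsc", "va", "vsc", "eva", "evsc"]
SPLITS = ["eth", "hotel", "univ", "zara1", "zara2", "sdd", "nba50k"]

def training_commands(models: Iterable[str],
                      splits: Iterable[str]) -> Iterable[List[str]]:
    """Yield one `python main.py` training command per (model, split) pair."""
    for model, split in product(models, splits):
        yield ["python", "main.py", "--model", model, "--split", split]
```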

In addition, the simplest way to reproduce our results is to copy all the training args we used for the provided weights. For example, you can start training E-V^2-Net-SC on zara1 by:

python main.py --restore_args weights/SocialCircle/evsczara1

Toy Example

You can run the following script to learn how the proposed SocialCircle works in an interactive way:

python scripts/socialcircle_toy_example.py

Set the positions of the manual neighbor to see the model's outputs, like:

<div style="text-align: center;"> <img style="width: 100%;" src="./img/toy_example.png"> </div>

Args Used

Please specify your customized args when training or testing your model in the following way:

python main.py --ARG_KEY1 ARG_VALUE1 --ARG_KEY2 ARG_VALUE2 -SHORT_ARG_KEY3 ARG_VALUE3 ...

where ARG_KEY is the name of an arg, and ARG_VALUE is its corresponding value. All args and their usages are listed below.
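The flat `--ARG_KEY ARG_VALUE` convention can be illustrated with a minimal parser (a simplified stand-in for the program's actual arg handling, which also covers types, defaults, and the argtype rules described below):

```python
def parse_args(argv):
    """Parse a flat `--KEY VALUE` / `-SHORT_KEY VALUE` list into a dict.
    A minimal illustration of the convention only."""
    args = {}
    it = iter(argv)
    for token in it:
        if token.startswith("-"):
            # Strip leading dashes and take the next token as the value.
            args[token.lstrip("-")] = next(it, None)
    return args
```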

About the argtype:

  • Args with argtype=static can not be changed once after training. When testing the model, the program will not change these args.