Pixel2Cancer

[MICCAI 2024] Cellular Automata for Tumor Development - Realistic Synthetic Tumors in Liver, Pancreas, and Kidney


<h1 align="center">Pixel2Cancer</h1> <div align="center">

visitors GitHub Repo stars <a href="https://twitter.com/bodymaps317"> <img src="https://img.shields.io/twitter/follow/BodyMaps?style=social" alt="Follow on Twitter" /> </a><br/> Subscribe to us: https://groups.google.com/u/2/g/bodymaps

</div>

This repository provides the code and checkpoints for our novel tumor synthesis approach, Pixel2Cancer, which can simulate tumor development within organs with realistic texture, shape, and interactions with other tissues.
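The core idea of a cellular automaton growing a tumor by local rules can be illustrated with a toy 2-D sketch. Note this is a simplification for intuition only, not the repository's implementation: the paper's 3-D automaton additionally models tumor density, interaction with surrounding tissue, and mass effect.

```python
import numpy as np

def grow_tumor(grid_size=32, steps=50, p_grow=0.3, seed=0):
    """Toy 2-D cellular automaton: a single tumor seed expands into
    neighboring cells with probability p_grow at each step."""
    rng = np.random.default_rng(seed)
    state = np.zeros((grid_size, grid_size), dtype=bool)
    state[grid_size // 2, grid_size // 2] = True  # initial tumor cell
    for _ in range(steps):
        # healthy cells adjacent to the tumor (4-neighborhood)
        frontier = (
            np.roll(state, 1, axis=0) | np.roll(state, -1, axis=0) |
            np.roll(state, 1, axis=1) | np.roll(state, -1, axis=1)
        ) & ~state
        # each frontier cell turns tumorous with probability p_grow
        state |= frontier & (rng.random(state.shape) < p_grow)
    return state

mask = grow_tumor()
print(mask.sum())  # number of tumor cells after growth
```

The same growth-by-local-rules principle, extended to 3-D CT volumes with organ-specific constraints, is what produces the realistic synthetic tumors described in the paper.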

Simulation of Tumor Growth

Paper

<b>From Pixel to Cancer: Cellular Automata in Computed Tomography</b> <br/> Yuxiang Lai<sup>1,2</sup>, Xiaoxi Chen<sup>3</sup>, Angtian Wang<sup>1</sup>, Alan L. Yuille<sup>1</sup>, and Zongwei Zhou<sup>1,*</sup> <br/> <sup>1 </sup>Johns Hopkins University <br/> <sup>2 </sup>Southeast University, <br/> <sup>3 </sup>University of Illinois Urbana-Champaign <br/> International Conference on Medical Image Computing and Computer Assisted Intervention (MICCAI 2024; Early Accept) <br/> paper | code

We have summarized publications related to tumor synthesis in Awesome Synthetic Tumors.

Model

You can download the trained model files from the table below.

| Organ    | Tumor | Model             | Pre-trained? | Download |
|----------|-------|-------------------|--------------|----------|
| liver    | real  | unet              | no           | link     |
| liver    | real  | swin_unetrv2_base | no           | link     |
| liver    | synt  | unet              | no           | link     |
| liver    | synt  | swin_unetrv2_base | no           | link     |
| pancreas | real  | unet              | no           | link     |
| pancreas | real  | swin_unetrv2_base | no           | link     |
| pancreas | synt  | unet              | no           | link     |
| pancreas | synt  | swin_unetrv2_base | no           | link     |
| kidney   | real  | unet              | no           | link     |
| kidney   | real  | swin_unetrv2_base | no           | link     |
| kidney   | synt  | unet              | no           | link     |
| kidney   | synt  | swin_unetrv2_base | no           | link     |

Where to put those model files?

It is suggested that you put them in a folder named `runs/your_model_name`, which is also the path where models are saved during training. For example, `runs/synt.no_pretrain.unet/model.pt` for the non-pretrained U-Net model trained with synthetic tumors.
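Under that layout, a downloaded checkpoint could be moved into place like this (the paths are illustrative; `model.pt` stands for the file you downloaded from the table above):

```shell
# create the directory the training/evaluation scripts expect
mkdir -p runs/synt.no_pretrain.unet

# placeholder standing in for the downloaded checkpoint
touch model.pt
mv model.pt runs/synt.no_pretrain.unet/model.pt
```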

You can download other materials from these links:

All other checkpoints: link

Data Download

(1). 🚍 Public datasets:

Below are the public dataset download links for different organs used in medical image analysis tasks:

| 🧠 Organ | 🔗 Download Link          | 📖 Dataset Name |
|----------|---------------------------|-----------------|
| Liver    | Download LiTS Dataset     | 04_LiTS         |
| Kidney   | Download KiTS Dataset     | 05_KiTS         |
| Pancreas | Download Pancreas Dataset | Task07_Pancreas |

(2). 🚗 Our private 9K-scan AbdomenAtlas 1.1 dataset:

The release of AbdomenAtlas 1.0 can be found at This Link.

(3). Where to put those data files:

It is suggested that you put the downloaded dataset folders under `your_data_path`. For example, put the liver dataset `04_LiTS` at `Pixel2Cancer/your_data_path/04_LiTS`.
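Assuming all three public datasets from the table above are downloaded, the resulting layout would look roughly like this (dataset folder names as listed above; everything else illustrative):

```
Pixel2Cancer/
└── your_data_path/
    ├── 04_LiTS/
    ├── 05_KiTS/
    └── Task07_Pancreas/
```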

0. Installation

```shell
git clone https://github.com/MrGiovanni/Pixel2Cancer.git
cd Pixel2Cancer/

# download pre-trained models
wget https://github.com/Project-MONAI/MONAI-extra-test-data/releases/download/0.8.1/model_swinvit.pt
```

🚨 See detailed installation instructions to create an environment and obtain requirements.

1. Train segmentation models using synthetic tumors

```shell
datapath=your_data_path

# UNET (no.pretrain)
CUDA_VISIBLE_DEVICES=0,1,2,3 python -W ignore main.py --optim_lr=4e-4 --batch_size=2 --lrschedule=warmup_cosine --optim_name=adamw --model_name=unet --val_every=200 --max_epochs=2000 --save_checkpoint --workers=0 --noamp --distributed --dist-url=tcp://127.0.0.1:12235 --cache_num=200 --val_overlap=0.5 --syn --logdir="runs/synt.no_pretrain.unet" --train_dir $datapath --val_dir $datapath --json_dir datafolds/healthy.json

# Swin-UNETR-Base (pretrain)
CUDA_VISIBLE_DEVICES=0,1,2,3 python -W ignore main.py --optim_lr=4e-4 --batch_size=2 --lrschedule=warmup_cosine --optim_name=adamw --model_name=swin_unetrv2 --swin_type=base --val_every=200 --max_epochs=2000 --save_checkpoint --workers=0 --noamp --distributed --dist-url=tcp://127.0.0.1:12231 --cache_num=200 --val_overlap=0.5 --syn --logdir="runs/synt.pretrain.swin_unetrv2_base" --train_dir $datapath --val_dir $datapath --json_dir datafolds/healthy.json --use_pretrained

# Swin-UNETR-Base (no.pretrain)
CUDA_VISIBLE_DEVICES=0,1,2,3 python -W ignore main.py --optim_lr=4e-4 --batch_size=2 --lrschedule=warmup_cosine --optim_name=adamw --model_name=swin_unetrv2 --swin_type=base --val_every=200 --max_epochs=2000 --save_checkpoint --workers=0 --noamp --distributed --dist-url=tcp://127.0.0.1:12231 --cache_num=200 --val_overlap=0.5 --syn --logdir="runs/synt.no_pretrain.swin_unetrv2_base" --train_dir $datapath --val_dir $datapath --json_dir datafolds/healthy.json

# Swin-UNETR-Small (no.pretrain)
CUDA_VISIBLE_DEVICES=0,1,2,3 python -W ignore main.py --optim_lr=4e-4 --batch_size=2 --lrschedule=warmup_cosine --optim_name=adamw --model_name=swin_unetrv2 --swin_type=small --val_every=200 --max_epochs=2000 --save_checkpoint --workers=0 --noamp --distributed --dist-url=tcp://127.0.0.1:12233 --cache_num=200 --val_overlap=0.5 --syn --logdir="runs/synt.no_pretrain.swin_unetrv2_small" --train_dir $datapath --val_dir $datapath --json_dir datafolds/healthy.json

# Swin-UNETR-Tiny (no.pretrain)
CUDA_VISIBLE_DEVICES=0,1,2,3 python -W ignore main.py --optim_lr=4e-4 --batch_size=2 --lrschedule=warmup_cosine --optim_name=adamw --model_name=swin_unetrv2 --swin_type=tiny --val_every=200 --max_epochs=2000 --save_checkpoint --workers=0 --noamp --distributed --dist-url=tcp://127.0.0.1:12234 --cache_num=200 --val_overlap=0.5 --syn --logdir="runs/synt.no_pretrain.swin_unetrv2_tiny" --train_dir $datapath --val_dir $datapath --json_dir datafolds/healthy.json
```

⚠️ You may have to modify:

- `--logdir`: the path where the trained model checkpoint is saved
- `--json_dir`: the path to the `.json` file that defines the dataset

🚨 [ IMPORTANT! ] About .json file location:

The `.json` file is the configuration file for loading the data. For example, to train with a mixed pancreas dataset that contains both synthetic and real tumors, configure `--json_dir datafolds/mix_pancreas.json`. The corresponding dataset path will then be `your_data_path/10_Decathlon/Task07_Pancreas`, as defined in the `.json` file.

💡💡💡 Please check your `.json` file before training.
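For orientation, such a datafold file might look roughly like the following. This is a hypothetical sketch assuming a MONAI-style decathlon datalist; the exact keys and file names are defined by the repository's own `datafolds/*.json` files, and the paths below are illustrative only:

```json
{
  "training": [
    {
      "image": "10_Decathlon/Task07_Pancreas/imagesTr/pancreas_001.nii.gz",
      "label": "10_Decathlon/Task07_Pancreas/labelsTr/pancreas_001.nii.gz"
    }
  ],
  "validation": [
    {
      "image": "10_Decathlon/Task07_Pancreas/imagesTr/pancreas_002.nii.gz",
      "label": "10_Decathlon/Task07_Pancreas/labelsTr/pancreas_002.nii.gz"
    }
  ]
}
```

The paths inside the file are resolved relative to `your_data_path`, which is why the `.json` file and the on-disk dataset layout must agree.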

2. Train segmentation models using real tumors (for comparison)

```shell
datapath=your_data_path

# UNET (no.pretrain)
CUDA_VISIBLE_DEVICES=0,1,2,3 python -W ignore main.py --optim_lr=4e-4 --batch_size=2 --lrschedule=warmup_cosine --optim_name=adamw --model_name=unet --val_every=200 --val_overlap=0.5 --max_epochs=2000 --save_checkpoint --workers=2 --noamp --distributed --dist-url=tcp://127.0.0.1:12235 --cache_num=
```