BrainCAP
Analysis Toolkit to investigate co-activation patterns in functional Magnetic Resonance Imaging (fMRI)
The analysis of moment-to-moment changes in co-activation patterns (CAPs) in functional MRI (fMRI) has been useful for studying dynamic properties of neural activity. This method clusters fMRI time frames into several recurrent spatial patterns within and across subjects. Studies have also quantified properties of the temporal organization of CAPs, such as fractional occupancy and dwell time. Co-activation analyses are computationally intensive, requiring the clustering of high-dimensional data concatenated over subjects. Further, although studying CAPs involves a variety of analytic choices, the field lacks a unified open-source platform that supports the robust feature selection required for reproducible mapping between brain and behavioral measurements. We are currently developing BrainCAP, an open-source Python-based toolkit for quantifying CAPs from fMRI data in cross-sectional and longitudinal studies.
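At its core, this kind of CAP analysis amounts to k-means clustering of individual time frames followed by simple run-length statistics on the resulting state sequence. The sketch below illustrates the idea with plain numpy; the function names (`cluster_caps`, `fractional_occupancy`, `dwell_time`) are illustrative only and are not BrainCAP's actual API.

```python
import numpy as np

def cluster_caps(frames, k, n_iter=100):
    """Cluster fMRI time frames (T x V array: frames x voxels/parcels)
    into k co-activation patterns with a basic k-means.
    Deterministic farthest-point initialization keeps the sketch reproducible."""
    centroids = frames[[0]].copy()
    for _ in range(k - 1):
        # Pick the frame farthest from all current centroids as the next seed.
        d = ((frames[:, None, :] - centroids[None]) ** 2).sum(-1).min(axis=1)
        centroids = np.vstack([centroids, frames[np.argmax(d)]])
    for _ in range(n_iter):
        # Assign each frame to its nearest centroid (its CAP label).
        labels = ((frames[:, None, :] - centroids[None]) ** 2).sum(-1).argmin(axis=1)
        new = np.array([frames[labels == j].mean(axis=0) if np.any(labels == j)
                        else centroids[j] for j in range(k)])
        if np.allclose(new, centroids):
            break
        centroids = new
    return centroids, labels

def fractional_occupancy(labels, k):
    """Fraction of time frames assigned to each CAP; sums to 1."""
    return np.bincount(labels, minlength=k) / labels.size

def dwell_time(labels, k):
    """Mean consecutive run length (in frames) per CAP; 0 if a CAP never occurs."""
    runs = [[] for _ in range(k)]
    start = 0
    for t in range(1, len(labels) + 1):
        if t == len(labels) or labels[t] != labels[start]:
            runs[labels[start]].append(t - start)
            start = t
    return np.array([np.mean(r) if r else 0.0 for r in runs])
```

Dwell time here is in units of frames; multiplying by the repetition time (TR) converts it to seconds. A production pipeline would z-score each subject's time series and concatenate frames across subjects before clustering, which is where the computational cost noted above arises.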
This repository serves as the main branch for the ongoing development and enhancement of BrainCAP.

Important Notes
- BrainCAP has not yet been officially released; the developer team is working on the first release.
- The current version of this repository is specifically tailored for research environments with access to Yale University’s High Performance Computing (HPC) cluster. Future versions will allow the use of various job schedulers, in addition to Slurm.
Branches
main
The main branch contains the latest developments and optimizations for BrainCAP. This code is optimized for use on the Yale University High-Performance Computing (HPC) cluster. If you are looking to reproduce the data and results from Lee et al. (2024), please refer to the archived version on Zenodo (see Citation below).
develop
The develop branch focuses on building an open-source software toolkit for BrainCAP. We aim to release the first version of this open-source toolkit by the end of 2025. Contributions, feedback, and collaboration are welcome to help shape the future of BrainCAP.
Citation
If you use BrainCAP in your research, please cite:
Kangjoo Lee, Jie Lisa Ji, Clara Fonteneau, Lucie Berkovitch, Masih Rahmati, Lining Pan, Grega Repovš, John H. Krystal, John D. Murray, and Alan Anticevic, Human brain state dynamics are highly reproducible and associated with neural and behavioral features, PLOS Biology 22(9): e3002808 (2024)
The specific version of the code used for Lee et al. (2024) is archived and available at Zenodo.
Maintainers
BrainCAP is currently maintained by:
- Kangjoo Lee, PhD<br> Assistant Professor, Department of Biomedical Engineering<br> Director, Computational Neuroimaging of Brain Disorders Laboratory<br> College of Engineering and Computing<br> Florida International University<br> 10555 W Flagler St, EC 2675, Miami, FL 33174<br> Phone: (305) 348-7340<br> Email: kalee@fiu.edu
- Samuel Brege, BS, Research Assistant<br> Email: samuel.brege@yale.edu
For inquiries, questions, or collaborations, please contact Dr. Lee.
Pipeline
We presented BrainCAP and our demo code at the software demonstration session of the 2025 meeting of the Organization for Human Brain Mapping (OHBM) in Brisbane, Australia. An overview of the BrainCAP pipeline is shown below.
