# NeuroPyxels: loading, processing and plotting Neuropixels data in Python <img src="https://raw.githubusercontent.com/m-beau/NeuroPyxels/master/images/NeuroPyxels_logo_final.png" width="150" title="Neuropyxels" alt="Neuropixels" align="right" vspace="50">
NeuroPyxels (npyx) is a Python library built for electrophysiologists using Neuropixels electrodes. This package stems from the needs of a pythonist who really did not want to transition to MATLAB to work with Neuropixels: it features a suite of core utility functions for loading, processing and plotting Neuropixels data.
❓ Any questions or issues? Create a GitHub issue to get support, or create a pull request. Alternatively, you can email us: maximebeaujeanroch047[at]gmail[dot]com. You can also use the Neuropixels slack workgroup.
- ⬇️ Installation
- 🤗 Support and citing
- 🔍️ Documentation
- 💡 Design philosophy
- 📁 Directory structure
- 👉 Common use cases
  - Load recording metadata
  - Load synchronization channel
  - Get good units from dataset
  - Load spike times from unit u
  - Load waveforms from unit u
  - Compute auto/crosscorrelogram between 2 units
  - Plot waveform and crosscorrelograms of unit u
  - Preprocess your waveforms and spike trains
  - Plot chunk of raw data with overlaid units
  - Plot peri-stimulus time histograms across neurons and conditions
  - Merge datasets acquired on two probes simultaneously
- ⭐ Bonus: matplotlib plot prettifier (mplp)
## ⬇️ Installation
We recommend using a conda environment. Pre-existing packages on a python installation might be incompatible with npyx and break your installation. You can find instructions on setting up a conda environment here.
Using uv (recommended - faster):

```bash
conda create -n my_env python=3.13 # python <=3.13 supported
conda activate my_env
uv pip install npyx # faster alternative to pip install npyx
# optionally (see 'Dealing with cupy' section below):
conda install -c conda-forge cupy cudatoolkit=11.0 # cupy isn't available for apple silicon macbooks
# test installation:
python -c 'import npyx' # should not return any error
```
Using pip (also works, but slower):

```bash
conda create -n my_env python=3.13 # python <=3.13 supported
conda activate my_env
pip install npyx
# optionally (see 'Dealing with cupy' section below):
conda install -c conda-forge cupy cudatoolkit=11.0
# test installation:
python -c 'import npyx' # should not return any error
```
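If the `python -c 'import npyx'` check fails, the following minimal stdlib sketch (it assumes nothing about npyx beyond the package name) reports whether the interpreter you are actually running can see npyx, and from where:

```python
import importlib.util
import sys

# locate npyx from the currently active interpreter, without importing it
spec = importlib.util.find_spec("npyx")
if spec is None:
    print(f"npyx NOT visible to {sys.executable} - is the right conda env active?")
else:
    print(f"npyx found at {spec.origin}")
```

Using `find_spec` rather than a bare `import` avoids triggering npyx's own import-time errors, so the output isolates "not installed in this environment" from other failure modes.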
<details>
<summary>Advanced installation</summary>
- If you want the very latest version:

  ```bash
  conda create -n my_env python=3.13
  conda activate my_env
  pip install git+https://github.com/m-beau/NeuroPyxels@master
  # optionally (see 'Dealing with cupy' section below):
  conda install -c conda-forge cupy cudatoolkit=11.0
  # test installation:
  python -c 'import npyx' # should not return any error
  ```

- If you want to edit npyx locally and eventually contribute:

  💡 Tip: in an ipython/jupyter session, use `%load_ext autoreload` then `%autoreload 2` to make your local edits active in your session without having to restart your kernel. Amazing for development.

  ```bash
  conda create -n my_env python=3.13
  conda activate my_env
  cd path/to/save_dir # any directory where your code will be accessible by your editor and safe. NOT downloads folder.
  git clone https://github.com/m-beau/NeuroPyxels
  cd NeuroPyxels
  pip install -e . # the editable install creates an egg link to save_dir, so you do not need to reinstall the package each time you edit it (e.g. after pulling from github).
  # optionally (see 'Dealing with cupy' section below):
  conda install -c conda-forge cupy cudatoolkit=11.0
  # test installation:
  python -c 'import npyx' # should not return any error
  ```

  and pull every now and then:

  ```bash
  cd path/to/save_dir/NeuroPyxels
  git pull # and that's it, thanks to the egg link no need to reinstall the package!
  ```

</details>
### Dealing with cupy (GPU shenanigans)
To run some preprocessing functions, you will need NVIDIA drivers and the cuda-toolkit installed on your computer. This is a notorious source of bugs. To test your CUDA installation, run the following:

```bash
nvidia-smi # should show how much your GPU is being used right now
nvcc --version # nvcc is the CUDA compiler; this prints its version
```
If it doesn't work, try up/downgrading the version of cudatoolkit installed:
```bash
# check the current version
conda activate my_env
conda list cudatoolkit

# e.g. install version 10.0
conda activate my_env
conda remove cupy cudatoolkit
conda install -c conda-forge cupy cudatoolkit=10.0
```
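Before re-running any GPU preprocessing, you can also check from Python whether cupy is importable in the active environment. A minimal sketch (purely stdlib, touches no GPU):

```python
import importlib.util

# True if cupy is installed in the active environment, without importing it
# (find_spec avoids triggering CUDA initialization errors at check time)
has_cupy = importlib.util.find_spec("cupy") is not None
print("cupy importable:", has_cupy)
```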
### Test installation
You can use the built-in unit testing function `test_npyx` to make sure that npyx core functions run smoothly, all at once.
```python
from npyx.testing import test_npyx

# any spike sorted recording compatible with phy
# (e.g. kilosort output)
dp = 'datapath/to/myrecording'
test_npyx(dp)

# if any test fails, re-run with the following to print the error log,
# and try to fix it or post an issue on github:
test_npyx(dp, raise_error=True)
```
<span style="color:#1F45FC">
--- npyx version 2.3.4 unit testing initiated, on directory /media/maxime/AnalysisSSD/test_dataset_artefact... <br>
--- Successfully ran 'read_metadata' from npyx.inout. <br>
--- Successfully ran 'get_npix_sync' from npyx.inout. <br>
--- Successfully ran 'get_units' from npyx.gl. <br>
--- Successfully ran 'ids' from npyx.spk_t. <br>
--- Successfully ran 'trn' from npyx.spk_t. <br>
--- Successfully ran 'trn_filtered' from npyx.spk_t. <br>
--- Successfully ran 'wvf' from npyx.spk_wvf. <br>
--- Successfully ran 'wvf_dsmatch' from npyx.spk_wvf. <br>
--- Successfully ran 'get_peak_chan' from npyx.spk_wvf. <br>
--- Successfully ran 'templates' from npyx.spk_wvf. <br>
--- Successfully ran 'ccg' from npyx.corr. <br>
--- Successfully ran 'plot_wvf' from npyx.plot. <br>
--- Successfully ran 'plot_ccg' from npyx.plot. <br>
--- Successfully ran 'plot_raw' from npyx.plot. <br>
</span>

(bunch of plots...)
<details>
<summary>:warning: Known installation issues</summary>
- Cannot import numba.core hence cannot import npyx <br/>
  Older versions of numba did not feature the `.core` submodule. If you get this error, you are probably running a version of numba that is too old. Make sure that you have installed npyx in a fresh conda environment. If you still get the error, check that numba is not also installed in your root environment:

  ```bash
  pip uninstall numba # run this from your root environment first
  conda activate my_env
  pip uninstall numba
  pip install numba
  ```
- Core dumped when importing <br/>
  This seems to be an issue related to PyQt5, which is required by opencv (opencv-python). Solution (from post):

  ```bash
  # activate npyx environment first
  pip uninstall PyQt5
  pip uninstall opencv-python
  pip install opencv-python
  # pip install other missing dependencies
  ```
  Full log:

  ```
  In [1]: from npyx import *
  In [2]: QObject::moveToThread: Current thread (0x5622e1ea6800) is not the object's thread (0x5622e30e86f0).
  Cannot move to target thread (0x5622e1ea6800)
  qt.qpa.plugin: Could not load the Qt platform plugin "xcb" in "/home/maxime/miniconda3/envs/npyx/lib/python3.7/site-packages/cv2/qt/plugins" even though it was found.
  This application failed to start because no Qt platform plugin could be initialized. Reinstalling the application may fix this problem.
  Available platform plugins are: xcb, eglfs, linuxfb, minimal, minimalegl, offscreen, vnc, wayland-egl, wayland, wayland-xcomposite-egl, wayland-xcomposite-glx, webgl.
  Aborted (core dumped)
  ```
<br/>
- I think I installed everything properly, but npyx is not found if I run `python -c "import npyx"`! <br/>
  Typically:

  ```
  Traceback (most recent call last):
    File "<stdin>", line 1, in <module>
  ModuleNotFoundError: No module named 'npyx'
  ```

  Make sure that the python installation you are using is indeed the one of your new environment. <br/>
  To do so, in your terminal, run `which python` on linux/mac or `where python` on windows: the output should be the path to the right environment, e.g. "/home/.../anaconda/envs/npyx/bin/python". If it isn't, try to deactivate/reactivate your conda environment, or make sure you do not have
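  The same check can be done from inside Python itself; a minimal sketch (the `envs` path component is a conda-specific assumption):

  ```python
  import sys

  # the interpreter actually running this code; should point into your
  # conda environment, e.g. /home/.../anaconda/envs/npyx/bin/python
  print(sys.executable)
  print("looks like a conda env:", "envs" in sys.executable)
  ```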
