# PySPEDAS
The Python-based Space Physics Environment Data Analysis Software (PySPEDAS) framework supports multi-mission, multi-instrument retrieval, analysis, and visualization of heliophysics time series data.
## Projects Supported
- Advanced Composition Explorer (ACE)
- Akebono
- Arase (ERG)
- Cluster
- Colorado Student Space Weather Experiment (CSSWE)
- Communications/Navigation Outage Forecasting System (C/NOFS)
- Deep Space Climate Observatory (DSCOVR)
- Dynamics Explorer 2 (DE2)
- Equator-S
- Fast Auroral Snapshot Explorer (FAST)
- Geotail
- Geostationary Operational Environmental Satellite (GOES)
- Imager for Magnetopause-to-Aurora Global Exploration (IMAGE)
- Kyoto Dst Index
- LANL
- Mars Atmosphere and Volatile Evolution (MAVEN)
- Magnetic Induction Coil Array (MICA)
- Magnetospheric Multiscale (MMS)
- OMNI
- Polar Orbiting Environmental Satellites (POES)
- Polar
- Parker Solar Probe (PSP)
- Solar & Heliospheric Observatory (SOHO)
- Solar Orbiter (SOLO)
- Solar Terrestrial Relations Observatory (STEREO)
- Space Technology 5 (ST5)
- Spherical Elementary Currents (SECS)
- Swarm
- Time History of Events and Macroscale Interactions during Substorms (THEMIS)
- Two Wide-Angle Imaging Neutral-Atom Spectrometers (TWINS)
- Ulysses
- Van Allen Probes (RBSP)
- Wind
## Requirements
Python 3.10+ is required.
We recommend Anaconda, which gives you a relatively easy way to manage your Python installations and comes with a suite of packages useful for scientific data analysis. Step-by-step instructions for installing Anaconda on Windows, macOS, and Linux can be found on the Anaconda website.
Anaconda is not a requirement: if you prefer to download and install Python directly from python.org and manage your own virtual environments, PySPEDAS will work just fine. However, some PySPEDAS dependencies can be hard to install cleanly via `pip`, and it is nice to have the option of trying `conda install` instead, which is one reason we recommend installing Python with Anaconda.
Most people prefer to use a Python IDE like PyCharm, Visual Studio Code, Spyder, etc. to do their Python programming. Each IDE has its own way of setting up Python environments for your projects, so please consult the documentation for your preferred tool set.
## Installation

### Virtual Environment
To avoid potential dependency conflicts with other Python packages, we suggest creating a virtual environment for PySPEDAS. You can create a virtual environment in your terminal with:

```bash
python -m venv pyspedas
```

To enter your virtual environment, run the `activate` script:

Windows:

```bash
.\pyspedas\Scripts\activate
```

macOS and Linux:

```bash
source pyspedas/bin/activate
```
### Using Jupyter notebooks with your virtual environment
To get virtual environments working with Jupyter, type the following inside the virtual environment (here, "pyspedas" is the name of your virtual environment):

```bash
pip install ipykernel
python -m ipykernel install --user --name pyspedas --display-name "Python (PySPEDAS)"
```

Then, once you open a notebook, go to "Kernel", then "Change kernel", and select the kernel named "Python (PySPEDAS)".
### Install
As of PySPEDAS 2.0, a number of PySPEDAS dependencies are optional. This can make installation easier on platforms that lack binary wheels for certain dependencies. There are also additional dependencies that are only needed for developing and maintaining PySPEDAS itself.
PySPEDAS supports Windows, macOS, and Linux. To get started, install the `pyspedas` package from PyPI:

```bash
pip install pyspedas
```
### Upgrade

To upgrade to the latest version of PySPEDAS:

```bash
pip install pyspedas --upgrade
```
## Local Data Directories
The recommended way of setting your local data directory is the `SPEDAS_DATA_DIR` environment variable. `SPEDAS_DATA_DIR` acts as a root data directory for all missions, and it is also honored by IDL SPEDAS (if you are running a recent copy of the bleeding edge).

Mission-specific data directories (e.g., `MMS_DATA_DIR` for MMS, `THM_DATA_DIR` for THEMIS) can also be set; these override `SPEDAS_DATA_DIR`.
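As an illustration, these variables can also be set from Python before `pyspedas` is first imported (the paths below are hypothetical; choose your own):

```python
import os

# Root data directory for all missions (hypothetical path).
# Set this before importing pyspedas, or export it in your shell profile.
os.environ['SPEDAS_DATA_DIR'] = os.path.join(os.path.expanduser('~'), 'spedas_data')

# A mission-specific directory overrides the root for that mission only:
# THEMIS data would be stored here instead of under SPEDAS_DATA_DIR.
os.environ['THM_DATA_DIR'] = os.path.join(os.path.expanduser('~'), 'themis_data')
```

Exporting the variables in your shell profile instead makes the setting persistent across sessions and visible to IDL SPEDAS as well.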
## Cloud Repositories
`SPEDAS_DATA_DIR` and the mission-specific data directories can also be set to the URI of a cloud repository (e.g., an S3 bucket). If a data directory is set to a URI, files will be downloaded from the data server to that URI location, and the data will then be streamed from the URI without needing to download the files locally.

To successfully access the specified cloud repository, you must first set up read and write permissions for it yourself. Refer [here](https://docs.aws.amazon.com/cli/v1/userguide/cli-configure-files.html) for how to prepare your AWS configuration and credentials.
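For example, pointing the root data directory at an S3 bucket might look like the following (the bucket name is hypothetical, and valid AWS credentials with read/write access to it are assumed):

```shell
# Hypothetical bucket; requires AWS credentials with read/write access,
# configured e.g. via `aws configure`.
export SPEDAS_DATA_DIR=s3://my-spedas-bucket/data
```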
## Usage
You can load data into tplot variables by calling `pyspedas.projects.mission.instrument()`, e.g.,

To load and plot 1 day of THEMIS FGM data for probe 'd':

```python
import pyspedas
from pyspedas import tplot

thm_fgm = pyspedas.projects.themis.fgm(trange=['2015-10-16', '2015-10-17'], probe='d')

tplot(['thd_fgs_gse', 'thd_fgs_gsm'])
```
The above example used the fully qualified load routine name `pyspedas.projects.themis.fgm`. It is also possible to use abbreviated names by importing them from the appropriate mission module:
To load and plot 2 minutes of MMS burst mode FGM data:

```python
from pyspedas.projects.mms import fgm
from pyspedas import tplot

mms_fgm = fgm(trange=['2015-10-16/13:05:30', '2015-10-16/13:07:30'], data_rate='brst')

tplot(['mms1_fgm_b_gse_brst_l2', 'mms1_fgm_b_gsm_brst_l2'])
```
Note: by default, PySPEDAS loads all data contained in the CDFs found within the requested time range, which can include data outside of your requested `trange`. To remove data outside of your requested `trange`, set the `time_clip` keyword to `True`.
To load and plot 6 hours of PSP SWEAP/SPAN-i data:

```python
import pyspedas
from pyspedas import tplot

spi_vars = pyspedas.projects.psp.spi(trange=['2018-11-5', '2018-11-5/06:00'], time_clip=True)

tplot(['DENS', 'VEL', 'T_TENSOR', 'TEMP'])
```
To download 5 days of STEREO magnetometer data (but not load them into tplot variables):

```python
import pyspedas

stereo_files = pyspedas.projects.stereo.mag(trange=['2013-11-1', '2013-11-6'], downloadonly=True)
```
## Standard Load Routine Options

- `trange`: two-element list specifying the time range of interest. This keyword accepts a wide range of formats
- `time_clip`: if set, clip the variables to the exact time range specified by the `trange` keyword
- `suffix`: string specifying a suffix to append to the loaded variables
- `varformat`: string specifying which CDF variables to load; accepts the wild cards `*` and `?`
- `varnames`: string specifying which CDF variables to load (exact names)
- `get_support_data`: if set, load the support variables from the CDFs
- `downloadonly`: if set, download the files but do not load them into tplot
- `no_update`: if set, only load the data from the local cache
- `notplot`: if set, load the variables into dictionaries containing numpy arrays (instead of creating the tplot variables)
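To make the options above concrete, here is a sketch that collects several of them into a dict that could be passed to a load routine via `**` (the keyword values are illustrative, and the pyspedas call is left commented out so the snippet stands alone without a network connection):

```python
# Standard keywords accepted by most PySPEDAS load routines, gathered into a
# dict so they can be passed to any routine with **, e.g.
# pyspedas.projects.themis.fgm(probe='d', **load_options)
load_options = {
    'trange': ['2015-10-16', '2015-10-17'],  # two-element time range
    'time_clip': True,          # clip variables to exactly this trange
    'suffix': '_demo',          # appended to each loaded variable name
    'varformat': '*fgs*',       # wildcard filter on CDF variable names
    'get_support_data': False,  # skip the CDF support variables
    'downloadonly': False,      # load into tplot variables after downloading
    'no_update': False,         # False = check the server for newer files
    'notplot': False,           # False = create tplot variables as usual
}

# import pyspedas
# loaded_vars = pyspedas.projects.themis.fgm(probe='d', **load_options)
```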
## Examples

Please see the example Jupyter notebooks for more examples of using PySPEDAS.