Gemini3d

Ionospheric fluid electrodynamic model


The GEMINI model (Geospace Environment Model of Ion-Neutral Interactions) is a three-dimensional ionospheric fluid-electrodynamic model written (mostly) in object-oriented Fortran (2008 standard and newer). GEMINI is used for various scientific studies including:

  • effects of auroras on the terrestrial ionosphere
  • natural hazard effects on the space environment
  • effects of ionospheric fluid instabilities on radio propagation

The detailed mathematical formulation of GEMINI is given in GEMINI-docs. Subroutine-level documentation of individual program units is generated from inline source code comments and rendered as web pages. GEMINI uses generalized orthogonal curvilinear coordinates and has been tested with dipole and Cartesian coordinates.

Generally, the Git main branch holds the current development version and is the best place to start, while more thoroughly tested releases are made regularly. Releases corresponding to published results are generally noted in the corresponding journal articles.

Bug Reporting

The GEMINI development team values input from our user community, particularly in the form of error reports. These allow us to ensure that the code functions properly for a wider range of conditions, platforms, and use cases than we are able to test directly.

Please open a GitHub Issue if you experience difficulty with GEMINI. Try to provide as much detail as possible so we can try to reproduce your error.

Platforms

Gemini is intended to be OS / CPU architecture / platform / compiler agnostic. Operating system support includes Linux, macOS, and Windows. CPU architecture support includes Intel, AMD, ARM, IBM POWER, Cray, and more. GEMINI can run on hardware ranging from a Raspberry Pi to a laptop to a high-performance computing (HPC) cluster. Generally speaking, one can run large 2D or modest-resolution 3D simulations (fewer than 10 million grid points) on a quad-core workstation, with some patience.

For large 3D simulations (many tens to hundreds of millions of grid points), GEMINI is best run in a cluster environment or on a very "large" multi-core workstation (e.g. 16 or more cores). Runtime depends heavily on the grid spacing used, which determines the time step needed to ensure stability. For example, we have found that a 20M grid point simulation takes about 4 hours on 72 Xeon E5 cores, while 200M grid point simulations can take up to a week on 256 cores. It has generally been found that acceptable performance requires > 1 GB memory per core; moreover, a large amount of storage (hundreds of GB to several TB) is needed to store results from large simulations.
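As a rough sizing aid, the thresholds above (about 10 million grid points for a workstation, at least 1 GB of memory per core beyond that) can be checked against a planned grid; the grid dimensions below are hypothetical:

```shell
# Hypothetical grid dimensions; thresholds follow the guidance above.
lx1=256; lx2=256; lx3=256
npts=$((lx1 * lx2 * lx3))
echo "total grid points: $npts"
if [ "$npts" -lt 10000000 ]; then
  echo "small enough for a multi-core workstation"
else
  echo "plan for a cluster, budgeting at least 1 GB memory per MPI core"
fi
```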

Quick start

Building Gemini3D and running its self-tests takes about 10 minutes on a laptop. Gemini3D uses several external libraries that are built in a required one-time procedure. Once initially set up, Gemini3D works "offline", that is, without internet access.

Requirements:

  • C, C++ and Fortran compiler. See compiler help for optional further details.
    • GCC ≥ 9 with OpenMPI or MPICH
    • Clang with OpenMPI
    • Intel oneAPI
    • Cray with GCC or Intel oneAPI backend
  • Python and/or MATLAB for scripting front- and back-ends
  • CMake: if your CMake is too old, download a newer binary or run python -m pip install cmake
  • MPI: any of OpenMPI, IntelMPI, MPICH, MS-MPI. See MPI help if needed. Without MPI, Gemini3D uses one CPU core only, which runs much more slowly than with MPI.
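As a sketch, you can check the installed CMake against a minimum version before configuring; the 3.20 floor below is an assumption, and the real minimum is set in Gemini3D's CMakeLists.txt:

```shell
# Compare installed CMake against an assumed minimum using sort -V
# (GNU-style version sort; available on typical Linux systems).
need=3.20
have=$(cmake --version 2>/dev/null | awk 'NR==1 {print $3}')
if [ -z "$have" ]; then
  echo "cmake not found; try: python -m pip install cmake" >&2
elif [ "$(printf '%s\n%s\n' "$need" "$have" | sort -V | head -n1)" != "$need" ]; then
  echo "cmake $have is too old (need >= $need); try: python -m pip install cmake" >&2
else
  echo "cmake $have is new enough"
fi
```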

Gemini3D setup

Obtain the Gemini3D source code:

git clone --recurse-submodules https://github.com/gemini3d/gemini3d.git

Build the Gemini3D code

cd ./gemini3d

cmake -B build

cmake --build build --parallel

Non-default build options may be used. Gemini3D developer options enable features such as array bounds checking.
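For example, standard CMake variables can select a debug configuration (these are generic CMake options; the options Gemini3D itself defines are listed in Readme_cmake, and whether a Debug build enables run-time checks such as array bounds checking depends on the project's compiler flags):

```shell
cmake -B build -DCMAKE_BUILD_TYPE=Debug
cmake --build build --parallel
```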

GEMINI has self tests that compare the output from a "known" test problem to a reference output. To verify your GEMINI build, run the self-tests.

ctest --test-dir build

To retrieve Git updates from other developers do:

git pull

git submodule update --init --recursive

Offline HPC batch CTest

Note: some HPC systems only have internet when on a login node, but cannot run MPI simulations on the login node. Batch sessions, including interactive, may be offline. To run CTest in such an environment, download the data once from the login node:

ctest --test-dir build --preset download

then from an interactive batch session, run the tests:

ctest --test-dir build --preset offline
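For example, under SLURM (the directives and resource requests here are illustrative; adapt them to your site), run the download preset on the login node once, then submit a batch script like:

```shell
#!/bin/bash
# Hypothetical SLURM batch script: runs the pre-downloaded tests offline.
#SBATCH --ntasks=16
#SBATCH --time=00:30:00
ctest --test-dir build --preset offline
```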

GEMINI Numerical Library Dependencies

For various numerical solutions Gemini relies on:

  • LAPACK
  • scalapack
  • MUMPS

For file input/output we also use:

  • HDF5 (via the h5fortran interface)

Running GEMINI from a Shell Environment

For basic operations, the GEMINI main program simply needs to be run from the command line with arguments corresponding to the number of processes to be used for the simulation, the location of the input files, and where the output files are to be written:

mpiexec -np <number of processors>  build/gemini.bin <output directory>

for example:

mpiexec -np 4 build/gemini.bin ~/mysim3d/arecibo

GEMINI can also be run via the PyGemini scripting front end, python -m gemini3d.run, or via the executable gemini3d.run. Development of gemini3d.run was funded by NASA NNH19ZDA001N-HDEE grant 80NSSC20K0176.

Advanced Command Line Options

By default, only the current simulation time and a few other messages are shown to keep logs uncluttered. gemini.bin command line options include:

-d | -debug : print verbosely; this can produce hundreds of megabytes of text over a long simulation and is intended for advanced debugging.

-nooutput : do not write data to disk. This is for benchmarking file output overhead; since the simulation output is lost, this option is rarely used otherwise.

-manual_grid <# x2 images> <# x3 images> : forces the code to adopt a specific domain decomposition in x2 and x3 using the given integers. If not specified, the code attempts to find its own x2, x3 decomposition. The number of grid points in x2 and x3 must be evenly divisible by the number of user-specified images in each respective direction.

-dryrun : only run the first time step and do not write any files. This can be useful to diagnose issues not seen in unit tests, particularly issues with gridding. It completes in a few seconds, or under a minute for larger simulations, so it is worth running before queuing an HPC job.
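For instance, the -manual_grid choices satisfying the divisibility rule for a hypothetical 128 x 96 (x2 by x3) grid on 16 MPI images can be enumerated with a short script:

```shell
# Enumerate process-grid splits of np images that evenly divide the
# x2 and x3 grid dimensions (all numbers here are hypothetical).
lx2=128; lx3=96; np=16
count=0
for px2 in $(seq 1 $np); do
  [ $((np % px2)) -eq 0 ] || continue
  px3=$((np / px2))
  if [ $((lx2 % px2)) -eq 0 ] && [ $((lx3 % px3)) -eq 0 ]; then
    echo "-manual_grid $px2 $px3"
    count=$((count + 1))
  fi
done
```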

Running GEMINI through Scripting Environments

If you prefer to issue the GEMINI run command through a scripting environment you may do so (via python) in the following way:

  1. make a config.nml with desired parameters for an equilibrium sim.

  2. run the equilibrium sim:

    python -m gemini3d.run /path_to/config_eq.nml /path_to/sim_eq/
    
  3. create a new config.nml for the actual simulation and run

    python -m gemini3d.run /path_to/config.nml /path_to/sim_out/
    

Input file format

See Readme_input for details on how to prepare input data for GEMINI. Generally speaking, there are Python and MATLAB scripts available in the pygemini and mat_gemini repositories that will save generated data in the appropriate format.
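For orientation, a config.nml is a Fortran namelist file along these lines; the group and variable names below are illustrative placeholders, and Readme_input defines the actual format:

```
&base
ymd = 2013,2,20      ! simulation start date: year, month, day (placeholder names)
UTsec0 = 18000.0     ! start time in UT seconds
tdur = 300.0         ! simulation duration in seconds
dtout = 60.0         ! output cadence in seconds
/
```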

Loading and plotting output

GEMINI uses Python for essential interfaces, plotting, and analysis. MATLAB scripts relevant to GEMINI are in the mat_gemini repo.

Only the essential scripts needed to set up a simple example and plot the results are included in the main GEMINI repository. The Gemini-scripts and Gemini-examples repositories contain scripts used for various published and ongoing analyses.

See Readme_output for a description of how to load the simulation output files and the different variable names, meanings, and units.

Computing Magnetic Field Perturbations

An auxiliary program, magcalc.f90, can be used to compute magnetic field perturbations from a complete disturbance simulation. See Readme_magcalc for a full description of how this program works.

List of other associated Readmes

  1. Readme_output - information about data included in the output files of a GEMINI simulation
  2. Readme_input - information on how input files should be prepared and formatted
  3. Readme_compilers - details regarding various compilers
  4. Readme_cmake - CMake build options
  5. Readme_docs - information about model documentation
