TSGM
Generation and evaluation of synthetic time series datasets (plus augmentations, visualizations, and a collection of popular datasets). NeurIPS'24.
:jigsaw: Get Started
TSGM is an open-source framework for synthetic time series dataset generation and evaluation.
<div align="center"> <img src="./docs/_static/generation_process.gif"> </div>

The framework can be used for:
- creating synthetic datasets (see <a href="#hammer-generators">:hammer: Generators</a>),
- augmenting time series data (see <a href="#art-augmentations">:art: Augmentations</a>),
- evaluating synthetic data with respect to consistency, privacy, downstream performance, and more (see <a href="#chart_with_upwards_trend-metrics">:chart_with_upwards_trend: Metrics</a>),
- working with common time series datasets (TSGM provides easy access to more than 140 datasets; see <a href="#floppy_disk-datasets">:floppy_disk: Datasets</a>).
We provide:
- Documentation with a complete overview of the implemented methods,
- Tutorials that describe practical use cases of the framework.
Install TSGM
TSGM now supports Keras 3 with multiple backend options. Choose one of the following installation methods:
Option 1: With TensorFlow backend (default)
```bash
pip install tsgm[tensorflow]
```
Option 2: With PyTorch backend
```bash
pip install tsgm[torch]
```
Option 3: With JAX backend
```bash
pip install tsgm[jax]
```
Option 4: With all backends
```bash
pip install tsgm[all]
```
Option 5: Basic installation (you'll need to install a backend separately)
```bash
pip install tsgm
# Then install your preferred backend:
# For TensorFlow: pip install tensorflow tensorflow-probability
# For PyTorch: pip install torch torchvision
# For JAX: pip install jax jaxlib
```
Backend Configuration
Set your preferred Keras backend using the environment variable:
```bash
# For TensorFlow backend
export KERAS_BACKEND=tensorflow

# For PyTorch backend
export KERAS_BACKEND=torch

# For JAX backend
export KERAS_BACKEND=jax
```
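The backend can also be selected programmatically. A minimal sketch, assuming Keras 3's standard `KERAS_BACKEND` mechanism; note that the variable must be set before Keras is imported anywhere in the process:

```python
import os

# Select the Keras backend before any `import keras` runs;
# once Keras is imported, the backend cannot be switched.
os.environ["KERAS_BACKEND"] = "tensorflow"
```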
M1 and M2 chips:
To install TSGM on Apple M1 and M2 chips:
```bash
# Install with TensorFlow backend
pip install tsgm[tensorflow]

# Or install with PyTorch backend
pip install tsgm[torch]

# Or install with JAX backend (excellent performance on M1/M2)
pip install tsgm[jax]
```
Note for PyTorch users on M1/M2 chips: some operations may need to fall back to CPU on MPS devices. If you encounter MPS-related errors, set the environment variable:
```bash
export PYTORCH_ENABLE_MPS_FALLBACK=1
```
Note for JAX users: JAX performs well on M1/M2 chips and supports GPU acceleration. For optimal performance, consider installing JAX with Metal support:
```bash
pip install -U "jax[metal]"
```
Train your generative model
```python
import keras
import tsgm

# ... Define hyperparameters (seq_len, feature_dim, latent_dim, N_EPOCHS) ...
# `dataset` is a tensor of shape n_samples x seq_len x feature_dim

# The model zoo contains several prebuilt architectures; here we choose a conditional GAN
architecture = tsgm.models.architectures.zoo["cgan_base_c4_l1"](
    seq_len=seq_len, feat_dim=feature_dim,
    latent_dim=latent_dim, output_dim=0)
discriminator, generator = architecture.discriminator, architecture.generator

# Initialize the GAN with the selected discriminator and generator
gan = tsgm.models.cgan.GAN(
    discriminator=discriminator, generator=generator, latent_dim=latent_dim
)
gan.compile(
    d_optimizer=keras.optimizers.Adam(learning_rate=0.0003),
    g_optimizer=keras.optimizers.Adam(learning_rate=0.0003),
    loss_fn=keras.losses.BinaryCrossentropy(from_logits=True),
)
gan.fit(dataset, epochs=N_EPOCHS)

# Generate 100 synthetic samples
result = gan.generate(100)
```
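For a quick experiment, the hyperparameters and `dataset` above can be stubbed out with random data. The shapes below are hypothetical choices, not values prescribed by TSGM; any float tensor of shape `n_samples x seq_len x feature_dim` works:

```python
import numpy as np

# Hypothetical hyperparameters for the training sketch above
n_samples, seq_len, feature_dim, latent_dim = 1000, 64, 2, 16

# Stand-in training data: float32 tensor of shape n_samples x seq_len x feature_dim
dataset = np.random.default_rng(42).normal(
    size=(n_samples, seq_len, feature_dim)
).astype("float32")
print(dataset.shape)  # (1000, 64, 2)
```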
:anchor: Tutorials
- Introductory Tutorial: Getting started with TSGM
- Tutorial: Datasets in TSGM
- Tutorial: Time Series Augmentations
- Tutorial: Time Series Generation with VAEs
- Tutorial: Model Selection
- Tutorial: Multiple GPUs or TPU with TSGM
For more examples, see our tutorials.
:art: Augmentations
TSGM provides a number of time series augmentations.
| Augmentation | Class in TSGM | Reference |
| ------------- | ------------- | ------------- |
| Gaussian Noise / Jittering | `tsgm.augmentations.GaussianNoise` | - |
| Slice-And-Shuffle | `tsgm.augmentations.SliceAndShuffle` | - |
| Shuffle Features | `tsgm.augmentations.Shuffle` | - |
| Magnitude Warping | `tsgm.augmentations.MagnitudeWarping` | Data Augmentation of Wearable Sensor Data for Parkinson's Disease Monitoring using Convolutional Neural Networks |
| Window Warping | `tsgm.augmentations.WindowWarping` | Data Augmentation for Time Series Classification using Convolutional Neural Networks |
| DTW Barycentric Averaging | `tsgm.augmentations.DTWBarycentricAveraging` | A global averaging method for dynamic time warping, with applications to clustering |
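To illustrate what the simplest of these augmentations does, here is a standalone NumPy sketch of Gaussian noise (jittering): i.i.d. noise is added to every point of the series. The `jitter` function and its `sigma` default are hypothetical illustrations, not the TSGM API:

```python
import numpy as np

def jitter(x, sigma=0.05, rng=None):
    """Jittering augmentation: add i.i.d. Gaussian noise to each time point."""
    rng = np.random.default_rng(0) if rng is None else rng
    return x + rng.normal(scale=sigma, size=x.shape)

# One sine-wave sample of shape (n_samples, seq_len, feature_dim) = (1, 100, 1)
x = np.sin(np.linspace(0, 2 * np.pi, 100)).reshape(1, 100, 1)
x_aug = jitter(x)  # same shape, slightly perturbed values
```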
:hammer: Generators
TSGM implements several generative models for synthetic time series data.
| Method | Link to docs | Type | Notes |
| ------------- | ------------- | ------------- | ------------- |
| Structural Time Series | `sts.STS` | Data-driven | Well suited to modeling time series when prior knowledge is available (e.g., trend or seasonality). |
| GAN | `GAN` | Data-driven | A generic GAN implementation for time series generation; it can be customized with architectures for generators and discriminators. |
| WaveGAN | `GAN` | Data-driven | WaveGAN is the audio-synthesis model proposed in Adversarial Audio Synthesis. To use WaveGAN, set `use_wgan=True` when initializing the `GAN` class and use the `zoo["wavegan"]` architecture from the model zoo. |
| ConditionalGAN | `ConditionalGAN` | Data-driven | |
