# GoMLX: An Accelerated Machine Learning Framework For Go
## 📖 About GoMLX
<img align="right" src="docs/gomlx_gopher2.png" alt="GoMLX Gopher" width="220px"/>

GoMLX is an easy-to-use set of Machine Learning and generic math libraries and tools. It can be seen as a PyTorch/Jax/TensorFlow for Go.
It can be used to train, fine-tune, modify, and combine machine learning models. It provides all the tools to make that work easy: from a complete set of differentiable operators all the way to UI tools that plot metrics while training in a notebook.
It runs almost everywhere Go runs, using a pure Go backend. It runs even in the browser with WASM (see demo created with GoMLX). Likely, it will work in embedded devices as well (see Tamago).
It also supports a highly optimized backend engine based on OpenXLA, which uses just-in-time compilation to target CPUs, GPUs (Nvidia, and likely AMD ROCm, Intel, and Macs) and Google's TPUs. It also supports modern distributed execution (new, still being actively improved) for multi-TPU or multi-GPU setups using XLA Shardy, an evolution of GSPMD distribution.
It's the same engine that powers Google's Jax, TensorFlow, and PyTorch/XLA, and it matches their speed in many cases. Use this backend to train large models or with large datasets.
<div>

<p>It was developed to be a full-featured ML platform for Go: productionizable, and easy to experiment with ML ideas (see Long-Term Goals below).</p>

> [!TIP]
> - See our 🎓 tutorial 🎓
> - See Eli Bendersky's blog post "GoMLX: ML in Go without Python" (a bit outdated, but still useful)
> - A guided example for Kaggle Dogs vs Cats.
> - A simple GoMLX slide deck with small sample code.
It strives to be simple to read and reason about, leading the user to a correct and transparent mental model of what is going on (no surprises), aligned with the Go philosophy, at the cost of being more verbose at times.
It is also incredibly flexible and easy to extend and try non-conventional ideas: use it to experiment with new optimizer ideas, complex regularizers, funky multitasking, etc.
Documentation is kept up to date (if it is not well-documented, it is as if the code is not there), and error messages are useful (always with a stack-trace) and try to make it easy to solve issues.
</div>

## 🗺️ Overview
GoMLX is a full-featured ML framework, supporting various well-known ML components from the bottom to the top of the stack. But it is still only a slice of what a major ML library/framework like TensorFlow, Jax, or PyTorch provides.
Examples developed using GoMLX:
- 🚀 NEW 🚀 GPT-2: Demonstrates text generation using the new (experimental) transformer and generator packages.
- 🚀 NEW 🚀 Gemma 3 270M: Demonstrates ONNX-converted text generation (LLM) using the onnx-community/gemma-3-270m-it-ONNX model with GoMLX. It uses the `gomlx/onnx-gomlx` package to convert the model, and `gomlx/go-huggingface` to download the model and run the tokenizer.
- 🚀 NEW 🚀 BERT-base-NER: A BERT-base model fine-tuned for Named Entity Recognition. It's also an ONNX-converted model, from the dslim/bert-base-NER model on HuggingFace.
- 🚀 NEW 🚀 MixedBread Reranker v1: A cross-encoder reranking example; see the HuggingFace MixedBread Reranker v1 page. It uses the `gomlx/onnx-gomlx` package to convert the model, and `gomlx/go-huggingface` to download the model and run the tokenizer.
- Adult/Census model;
- How do KANs learn?;
- Cifar-10 demo;
- MNIST demo (library and command-line only)
- Dogs & Cats classifier demo;
- IMDB Movie Review demo;
- Diffusion model for Oxford Flowers 102 dataset (generates random flowers);
- Flow Matching Study Notebook based on Meta's "Flow Matching Guide and Code".
- GNN model for OGBN-MAG (experimental).
- Lastly, a trivial synthetic linear model, for those curious to see a bare-bones example.
- Neural Style Transfer 10-year Celebration: a demo of the original paper, written using GoMLX.
- Triplet Losses: various negative sampling strategies as well as various distance metrics.
- AlphaZero AI for the game of Hive: it uses a trivial GNN to evaluate positions on the board. It includes a WASM demo (runs GoMLX in the browser!) and a command-line UI to test your skills!
Highlights:

- Converting ONNX models to GoMLX with onnx-gomlx: both as an alternative to onnxruntime (leveraging XLA), and to further fine-tune models. See also go-huggingface to easily download ONNX model files from HuggingFace.
- Docker "gomlx_jupyterlab" with integrated JupyterLab and GoNB (a Go kernel for Jupyter notebooks).
- Three backends:
  - `xla`: OpenXLA backend for CPUs, GPUs, and TPUs. State-of-the-art as these things go. For linux/amd64, linux/arm64 (CPU) and darwin/arm64 (CPU) for now. Uses the go-xla Go version of the APIs.
  - `go`: a pure Go backend (no C/C++ dependencies): slower but very portable (compiles to WASM, Windows, etc.):
    - SIMD support is underway (see SIMD for Go and the under-development go-highway);
    - 🚀 NEW 🚀: added support for some fused operations and for some types of quantization, greatly improving performance in some cases.
    - See also GoMLX compiled to WASM to power the AI for a game of Hive.
  - 🚀 NEW 🚀 go-coreml: Go bindings to Apple's CoreML, supporting Metal acceleration.
- Autodiff: automatic differentiation (only gradients for now, no Jacobian).
- Context: automatic variable management for ML models.
- ML layers library with some of the most popular machine learning "layers": FFN layers,
various activation functions, layer and batch normalization, convolutions, pooling, dropout, Multi-Head-Attention (for transformer layers), LSTM, KAN (B-Splines, GR-KAN/KAT networks, Discrete-KAN, PiecewiseLinear KAN), PiecewiseLinear
