NNTikZ
A collection of TikZ diagrams for neural network and deep learning concepts. NNTikZ is designed to provide clean and consistent figures suitable for academic papers, lecture notes, theses, and presentations. All diagrams are open source, easy to customize, and written entirely in TikZ/LaTeX. Feel free to open an issue to suggest a diagram or submit a pull request with any contributions.
Some example diagrams are shown below:
| Diagram | Preview |
|---------|---------|
| Transformer | <img src="assets/transformer.png" alt="Transformer" width="300"/> |
| Multi-Head Attention | <img src="assets/multihead_attention.png" alt="Multi-Head Attention" width="300"/> |
| Neural Network | <img src="assets/neural_network.png" alt="Neural Network" width="300"/> |
| Attention Mechanism | <img src="assets/attention.png" alt="Attention Mechanism" width="300"/> |
| Gated Recurrent Unit (GRU) | <img src="assets/gru.png" alt="Gated Recurrent Unit" width="300"/> |
| RNN Encoder–Decoder (Sutskever et al.) | <img src="assets/rnn_encoder_decoder_sutskever.png" alt="RNN Encoder-Decoder" width="300"/> |
| Backpropagation Through Time (BPTT) | <img src="assets/rnn_backprop.png" alt="Backpropagation Through Time" width="300"/> |
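Since the diagrams are plain TikZ/LaTeX sources, they can be dropped into a document directly. As a minimal sketch (the filename `neural_network.tex` is assumed here; substitute the source file of the diagram you need), a diagram can be included in a figure like so:

```latex
\documentclass{article}
\usepackage{tikz}

\begin{document}

\begin{figure}[ht]
  \centering
  % Assumed filename; point this at the TikZ source of the diagram you want
  \input{neural_network.tex}
  \caption{A feedforward neural network.}
\end{figure}

\end{document}
```

Depending on the libraries a given diagram uses, additional `\usetikzlibrary{...}` lines may be required in the preamble.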
Citation
If you use NNTikZ in your research or project, please cite it as follows:
```bibtex
@misc{nntikz,
  author    = {Fraser Love},
  title     = {NNTikZ: TikZ Diagrams for Deep Learning and Neural Networks},
  year      = {2024},
  publisher = {GitHub},
  url       = {https://github.com/fraserlove/nntikz}
}
```
