
HebbianCNNPyTorch

Automatic Hebbian learning in multi-layer convolutional networks with PyTorch, by expressing Hebbian plasticity rules as gradients

Install / Use

/learn @ThomasMiconi/HebbianCNNPyTorch
About this skill

Quality Score

0/100

Supported Platforms

Universal

README

HebbianCNNPyTorch

This code demonstrates a very easy way to implement Hebbian learning in multi-layer convolutional networks with PyTorch (or other deep learning frameworks with automatic differentiation): just define a special loss whose gradient is equal to the Hebbian update.
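As a minimal sketch of the trick (our own illustration on a single linear layer, not code from the repository): because y_i = Σ_j w_ij x_j, the surrogate loss −Σ_i y_i.detach() · y_i has gradient −y_i x_j with respect to w_ij, so one SGD step applies the plain Hebbian update dw ~= xy.

```python
import torch

torch.manual_seed(0)
w = torch.randn(5, 3, requires_grad=True)  # 3 inputs, 5 outputs
x = torch.randn(3)
y = w @ x  # linear activations

# Surrogate loss: d(loss)/dw_ij = -y_i * x_j, so gradient *descent*
# performs the Hebbian update w += lr * y x^T
loss = -(y.detach() * y).sum()
loss.backward()

hebb = torch.outer(y.detach(), x)  # hand-computed Hebbian update
print(torch.allclose(w.grad, -hebb))  # → True
```

The `.detach()` is the key design choice: the post-synaptic activation is treated as a constant, so autodiff only differentiates through the second factor and reproduces the outer-product update.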

Ready-made expressions are available for plain Hebb's rule (dw ~= xy), Grossberg's Instar rule (dw ~= y(x-w)) and Oja's rule (dw ~= y(x-wy)). All code is available as Jupyter notebooks with PyTorch, ready to use on Google Colab.
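To illustrate how a more elaborate rule fits the same pattern, here is a hedged sketch (ours, for a single linear layer; the repository's ready-made expressions may differ) of a surrogate loss for Oja's rule: subtracting a weight-decay-like quadratic term makes the gradient equal to −y(x − wy).

```python
import torch

torch.manual_seed(1)
w = torch.randn(4, 6, requires_grad=True)  # 6 inputs, 4 outputs
x = torch.randn(6)
y = (w @ x).detach()  # activations treated as constants

# Surrogate loss: d(loss)/dw_ij = -(y_i x_j - y_i^2 w_ij),
# i.e. minus Oja's update, so SGD applies dw ~= y(x - w y)
loss = -((y * (w @ x)).sum() - 0.5 * ((y ** 2)[:, None] * w ** 2).sum())
loss.backward()

oja = torch.outer(y, x) - (y ** 2)[:, None] * w.detach()  # hand-computed
print(torch.allclose(w.grad, -oja))  # → True
```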

  • The simple version contains a simple but fully functional implementation. It is highly recommended to look at this version first.

  • The full version contains the code that generates the actual results described in the paper. It is more complex, mostly because it has a lot more options.

  • The checking code just verifies that the PyTorch-generated gradients are equal to (hand-computed) Hebbian updates, for the various rules.
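The kind of check the last bullet describes can be sketched as follows (our own example, assuming the Instar rule dw ~= y(x − w) on a single linear layer): compute the autodiff gradient of the surrogate loss, then assert it matches the hand-computed update.

```python
import torch

torch.manual_seed(2)
w = torch.randn(4, 6, requires_grad=True)
x = torch.randn(6)
y = (w @ x).detach()  # post-synaptic activations, held constant

# Surrogate loss: d(loss)/dw_ij = -(y_i x_j - y_i w_ij) = -y_i (x_j - w_ij)
loss = -((y * (w @ x)).sum() - 0.5 * (y[:, None] * w ** 2).sum())
loss.backward()

# Hand-computed Instar update, for comparison
instar = y[:, None] * (x[None, :] - w.detach())
assert torch.allclose(w.grad, -instar)
print("gradient matches hand-computed Instar update")
```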

For more details, see our preprint at https://arxiv.org/abs/2107.01729.

View on GitHub
GitHub Stars: 40
Category: Education
Updated: 1mo ago
Forks: 5

Languages

Jupyter Notebook

Security Score

75/100

Audited on Feb 19, 2026

No findings