NeuNet
<div align="center">
A complete neural network built from scratch with NumPy

Documentation • Quick Start • Features
</div>

Overview
NeuNet is a neural network implemented entirely from scratch in NumPy. The project demonstrates the fundamental building blocks of deep learning: dense layers, activation functions, loss functions, optimizers, and regularization.
Quick Start
Installation
```bash
git clone https://github.com/Abhinavexists/NeuNet.git
cd NeuNet
pip install numpy matplotlib plotly networkx pandas
```
Basic Usage
```python
from src.models.neural_network import NeuralNetwork
from src.layers.core import Dense
from src.layers.activations import ReLU, Softmax
from src.layers.regularization import BatchNormalization, Dropout
from src.layers.losses import CategoricalCrossentropy
from src.layers.dataset import create_data
from src.utils.metrics import calculate_accuracy

# Create synthetic dataset
X, Y = create_data(samples=100, classes=3, plot=True)

# Build the network
model = NeuralNetwork()
model.add(Dense(2, 128, learning_rate=0.002, optimizer='adam'))
model.add(BatchNormalization())
model.add(ReLU())
model.add(Dropout(0.1))
model.add(Dense(128, 64, learning_rate=0.002, optimizer='adam'))
model.add(ReLU())
model.add(Dense(64, 3, learning_rate=0.002, optimizer='adam'))
model.add(Softmax())

# Set loss function with L2 regularization
model.set_loss(CategoricalCrossentropy(regularization_l2=0.0001))

# Train with mini-batching and patience-based early stopping
model.train(X, Y, epochs=500, batch_size=32, patience=30, verbose=True)

# Evaluate: predict() returns class labels, predict_proba() returns
# per-class probabilities
predictions = model.predict(X)
probabilities = model.predict_proba(X)
accuracy = calculate_accuracy(Y, probabilities)
print(f"Final Accuracy: {accuracy:.4f}")
```
Features
Core Components
- Dense Layers: Fully connected layers with He initialization (see the sketch below, which also shows the Adam update)
- Activation Functions: ReLU, LeakyReLU, Sigmoid, Tanh, Softmax
- Loss Functions: Categorical crossentropy with L1/L2 regularization
- Optimizers: SGD with momentum, Adam optimizer with bias correction
- Regularization: Batch normalization, dropout, weight penalties
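As a rough illustration of how He initialization and the bias-corrected Adam update fit together, here is a minimal plain-NumPy sketch. The names (`DenseSketch`, `adam_step`) are illustrative only, not the library's actual API:

```python
import numpy as np

class DenseSketch:
    def __init__(self, n_in, n_out, learning_rate=0.002):
        # He initialization: scale weights by sqrt(2 / fan_in),
        # which keeps activation variance stable under ReLU
        self.weights = np.random.randn(n_in, n_out) * np.sqrt(2.0 / n_in)
        self.biases = np.zeros((1, n_out))
        self.lr = learning_rate
        # Adam state: first/second moment estimates and a step counter
        self.m_w = np.zeros_like(self.weights)
        self.v_w = np.zeros_like(self.weights)
        self.t = 0

    def adam_step(self, grad_w, beta1=0.9, beta2=0.999, eps=1e-8):
        self.t += 1
        # Exponential moving averages of the gradient and squared gradient
        self.m_w = beta1 * self.m_w + (1 - beta1) * grad_w
        self.v_w = beta2 * self.v_w + (1 - beta2) * grad_w ** 2
        # Bias correction compensates for the zero-initialized moments
        m_hat = self.m_w / (1 - beta1 ** self.t)
        v_hat = self.v_w / (1 - beta2 ** self.t)
        self.weights -= self.lr * m_hat / (np.sqrt(v_hat) + eps)
```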
Performance
- High Accuracy: 95%+ on synthetic datasets
- Fast Training: 3-5x faster convergence with the Adam optimizer than with plain SGD
- Robust: Patience-based early stopping prevents overfitting
- Stable: Numerical stability built into all operations (see the sketches after this list)
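The two sketches below illustrate, in plain NumPy, what the stability and early-stopping claims typically amount to. `train_one_epoch` is a hypothetical helper for the sake of the example, not part of NeuNet's API:

```python
import numpy as np

# 1. Numerically stable softmax: subtracting the row max before
#    exponentiating prevents overflow without changing the result.
def stable_softmax(z):
    shifted = z - np.max(z, axis=1, keepdims=True)
    exp = np.exp(shifted)
    return exp / np.sum(exp, axis=1, keepdims=True)

# 2. Patience-based early stopping: stop once the loss has not
#    improved for `patience` consecutive epochs.
def train_with_early_stopping(model, X, Y, epochs=500, patience=30):
    best_loss = np.inf
    stale = 0
    for epoch in range(epochs):
        loss = model.train_one_epoch(X, Y)  # hypothetical helper
        if loss < best_loss:
            best_loss, stale = loss, 0
        else:
            stale += 1
        if stale >= patience:
            print(f"Early stopping at epoch {epoch}")
            break
```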
Documentation
Our comprehensive documentation is available at abhinavexists.github.io/NeuNet
Documentation Sections
- Home: Project overview and quick navigation
- Implementation Guide: Detailed technical walkthrough
- Development Timeline: Evolution of the project
- Reference: Complete component documentation
Advanced Examples
Multi-layer Classification Network
```python
# Sophisticated architecture with all features
model = NeuralNetwork()

# Input processing with normalization
model.add(Dense(2, 128, learning_rate=0.002, optimizer='adam'))
model.add(BatchNormalization())
model.add(ReLU())
model.add(Dropout(0.1))

# Deep feature extraction
model.add(Dense(128, 64, learning_rate=0.002, optimizer='adam'))
model.add(BatchNormalization())
model.add(ReLU())
model.add(Dropout(0.1))

# Final classification
model.add(Dense(64, 32, learning_rate=0.002, optimizer='adam'))
model.add(ReLU())
model.add(Dense(32, 3, learning_rate=0.002, optimizer='adam'))
model.add(Softmax())

# Loss with L2 regularization
model.set_loss(CategoricalCrossentropy(regularization_l2=0.0001))

# Training with mini-batching and early stopping
model.train(X, Y, epochs=500, batch_size=32, patience=30, verbose=True)
```
Comprehensive Evaluation
```python
from src.utils.metrics import calculate_accuracy, confusion_matrix, precision_recall_f1

# X_test and y_true are a held-out feature matrix and its integer labels
predictions = model.predict(X_test)
probabilities = model.predict_proba(X_test)

# Calculate metrics
accuracy = calculate_accuracy(y_true, probabilities)
cm = confusion_matrix(y_true, predictions, num_classes=3)
precision, recall, f1 = precision_recall_f1(y_true, predictions, num_classes=3)

print(f"Accuracy: {accuracy:.4f}")
print(f"Precision: {precision}")
print(f"Recall: {recall}")
print(f"F1-score: {f1}")
```
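For reference, per-class precision, recall, and F1 can be read straight off the confusion matrix. A minimal, self-contained NumPy version (independent of `src.utils.metrics`) might look like this:

```python
import numpy as np

def precision_recall_f1_from_cm(cm):
    # cm[i, j] counts samples with true class i predicted as class j
    tp = np.diag(cm).astype(float)
    precision = tp / np.maximum(cm.sum(axis=0), 1)  # per predicted class
    recall = tp / np.maximum(cm.sum(axis=1), 1)     # per true class
    f1 = 2 * precision * recall / np.maximum(precision + recall, 1e-12)
    return precision, recall, f1
```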
Interactive Visualization
```python
from src.utils.network_data import export_network
from src.utils.Visualization import network_visualization

# Export the structure of the first four Dense layers
dense_layers = [layer for layer in model.layers if hasattr(layer, 'weights')]
export_network(*dense_layers[:4])

# Create interactive visualization (opens an interactive HTML view)
fig = network_visualization("src/utils/network_data.json")
```
Contributing
We welcome contributions! Please see CONTRIBUTING.md for guidelines.
Performance Metrics
| Metric | Value |
|--------|-------|
| Classification Accuracy | 95%+ |
| Training Speed (vs SGD) | 3-5x faster |
| Components | 15+ core modules |
| Activation Functions | 5 implemented |
| Optimization Algorithms | 2 advanced |
| Regularization Techniques | 3 methods |
License
This project is licensed under the MIT License - see the LICENSE file for details.