
MatrixTransformer

MatrixTransformer is a sophisticated mathematical utility class that enables transformation, manipulation, and analysis of matrices between different matrix types.

Install / Use

/learn @fikayoAy/MatrixTransformer


MatrixTransformer: Engineering Intelligence Through Deterministic Matrix Relationships

Based on the paper: MatrixTransformer: A Unified Framework for Matrix Transformations
Read the full paper on Zenodo
Related project: QuantumAccel


Overview

MatrixTransformer is a deterministic AI framework that discovers and preserves structural relationships across high-dimensional data using mathematically grounded matrix operations rather than probabilistic approximations.

Contemporary AI often sacrifices transparency for performance. MatrixTransformer reimagines intelligence: not as a simulation of understanding, but as explicit, reproducible structure discovery.

"Not an AI to fear, but one you define."

Built on rigorous mathematical foundations, MatrixTransformer offers a lossless, structure-preserving, and explainable approach to AI and data modeling.


Installation

Requirements

Ensure you are using Python 3.8+ and have NumPy, SciPy, and optionally PyTorch installed.

Clone from GitHub and install from the wheel file

git clone https://github.com/fikayoAy/MatrixTransformer.git
cd MatrixTransformer
pip install dist/matrixtransformer-0.1.0-py3-none-any.whl


Install dependencies

pip install numpy scipy torch

Verify installation

import MatrixTransformer
print("MatrixTransformer installed successfully!")

Basic Usage

Initialize the transformer

import numpy as np
from MatrixTransformer import MatrixTransformer

# Create a transformer instance
transformer = MatrixTransformer()
# Access the matrices stored in the transformer
transformer.matrix
# Access the matrix type graph (covers 16 matrix types; see matrixtransformer.py)
transformer.matrix_graph

Transform a matrix to a specific type

# Create a sample matrix
matrix = np.random.randn(4, 4)

# Transform to symmetric matrix
symmetric_matrix = transformer.process_rectangular_matrix(matrix, 'symmetric')

# Transform to positive definite
positive_def = transformer.process_rectangular_matrix(matrix, 'positive_definite')
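For intuition, here is what a symmetric transformation typically does, sketched in plain NumPy. This is an illustrative stand-in using the classic symmetrization rule, not necessarily the library's internal implementation:

```python
import numpy as np

# Hypothetical illustration (not MatrixTransformer's internal code):
# the standard way to symmetrize a square matrix is to average it
# with its transpose.
def symmetrize(a: np.ndarray) -> np.ndarray:
    """Return the nearest symmetric matrix, (A + A.T) / 2."""
    return (a + a.T) / 2

rng = np.random.default_rng(0)
a = rng.standard_normal((4, 4))
s = symmetrize(a)
assert np.allclose(s, s.T)  # the result is exactly symmetric
```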

Convert between tensors and matrices

# Convert a 3D tensor to a 2D matrix representation
tensor = np.random.randn(3, 4, 5)
matrix_2d, metadata = transformer.tensor_to_matrix(tensor)

# Convert back to the original tensor
reconstructed_tensor = transformer.matrix_to_tensor(matrix_2d, metadata)
# The bidirectional tensor conversion preserves metadata, making reconstruction lossless
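The core idea behind a lossless round trip can be shown in plain NumPy: record the original shape as metadata, flatten to 2D, then reshape back. MatrixTransformer's actual metadata is richer than this hypothetical dictionary; the sketch only demonstrates why exact reconstruction is possible:

```python
import numpy as np

# Minimal sketch of lossless tensor <-> matrix conversion:
# store the original shape, reshape to 2D, reshape back.
tensor = np.arange(3 * 4 * 5, dtype=float).reshape(3, 4, 5)
metadata = {"original_shape": tensor.shape}       # hypothetical metadata layout
matrix_2d = tensor.reshape(tensor.shape[0], -1)   # a (3, 20) matrix
reconstructed = matrix_2d.reshape(metadata["original_shape"])
assert np.array_equal(tensor, reconstructed)      # bit-for-bit identical
```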

Combine matrices

# Combine two matrices using different strategies
matrix1 = np.random.randn(3, 3)
matrix2 = np.random.randn(3, 3)

# Weighted combination
combined = transformer.combine_matrices(
    matrix1, matrix2, mode='weighted', weight1=0.6, weight2=0.4
)

# Other combination modes
max_combined = transformer.combine_matrices(matrix1, matrix2, mode='max')
multiply_combined = transformer.combine_matrices(matrix1, matrix2, mode='multiply')

Add custom matrix types

def custom_magic_matrix_rule(matrix):
    """Transform a matrix to have 'magic square' properties."""
    n = matrix.shape[0]
    result = matrix.copy()
    target_sum = n * (n**2 + 1) / 2
    
    # Simplified implementation for demonstration
    # (For a real implementation, you would need proper balancing logic)
    row_sums = result.sum(axis=1)
    for i in range(n):
        result[i, :] *= (target_sum / max(row_sums[i], 1e-10))
    
    return result

# Add the new transformation rule
transformer.add_transform(
    matrix_type="magic_square",
    transform_rule=custom_magic_matrix_rule,
    properties={"equal_row_col_sums": True},
    neighbors=["diagonal", "symmetric"]
)

# Now use your custom transformation
magic_matrix = transformer.process_rectangular_matrix(matrix, 'magic_square')
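As a sanity check on the target sum used in the rule above: for a classic magic square containing the integers 1..n², every row and column sums to the magic constant n(n² + 1)/2. A quick verification with the well-known 3×3 magic square:

```python
import numpy as np

# The classic 3x3 magic square (entries 1..9).
n = 3
magic3 = np.array([[2, 7, 6],
                   [9, 5, 1],
                   [4, 3, 8]])
target = n * (n**2 + 1) / 2   # magic constant: 15 for n = 3

assert np.all(magic3.sum(axis=0) == target)  # every column sums to 15
assert np.all(magic3.sum(axis=1) == target)  # every row sums to 15
```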

Advanced Features

Hypercube decision space navigation

# Find optimal transformation path between matrix types
source_type = transformer._detect_matrix_type(matrix1)
target_type = 'positive_definite'
path, attention_scores = transformer._traverse_graph(matrix1, source_type=source_type)

# Apply path-based transformation
result = matrix1.copy()
for matrix_type in path:
    transform_method = transformer._get_transform_method(matrix_type)
    if transform_method:
        result = transform_method(result)

Documentation

Recent Updates

v0.1.1 - November 22, 2025

  • Added streaming/batch processing, memmap support, ANN option, and element-level metadata for find_hyperdimensional_connections (see CHANGELOG for details)

v0.1.0 - November 2025

  • Fixed angular tolerance bug in find_hyperdimensional_connections
  • Changed from dot-product-based to angle-based tolerance (1e-7 threshold was treating small non-zero angles as zero)
  • Improved distance metric calculations for hypersphere projections
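The tolerance fix described above can be illustrated numerically (this is an explanatory sketch, not the library's code): for nearly parallel unit vectors, 1 − u·v ≈ θ²/2, so a dot-product threshold of 1e-7 silently treats angles up to roughly 4.5e-4 radians as zero, while an angle-based threshold does not:

```python
import numpy as np

# A small but genuinely non-zero angle between two unit vectors.
theta = 1e-4
u = np.array([1.0, 0.0])
v = np.array([np.cos(theta), np.sin(theta)])

dot_gap = 1.0 - np.dot(u, v)   # ~theta^2 / 2 = 5e-9, below a 1e-7 threshold
angle = np.arccos(np.clip(np.dot(u, v), -1.0, 1.0))

assert dot_gap < 1e-7   # dot-product test wrongly reports "parallel"
assert angle > 1e-7     # angle-based test correctly detects the gap
```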

See CHANGELOG.md for detailed technical information.


Related Projects

  • QuantumAccel: A quantum-inspired system built on MatrixTransformer's transformation logic, modeling coherence, flow dynamics, and structure-evolving computations.

Citations

Hyperdimensional Connection Method

@misc{ayodele2025hyperdimensional,
  title={Hyperdimensional connection method - A Lossless Framework Preserving Meaning, Structure, and Semantic Relationships across Modalities. (A MatrixTransformer subsidiary)},
  author={Ayodele, Fikayomi},
  year={2025},
  doi={10.5281/zenodo.16051260},
  url={https://doi.org/10.5281/zenodo.16051260}
}

MatrixTransformer Framework

@misc{ayodele2025matrixtransformer,
  title={MatrixTransformer: A Unified Framework for Matrix Transformations},
  author={Ayodele, Fikayomi},
  year={2025},
  doi={10.5281/zenodo.15867279},
  url={https://zenodo.org/records/15867279}
}

Contact & Collaboration

Research Collaboration: Ayodeleanjola4@gmail.com | 2273640@swansea.ac.uk

Contribution Guidelines:

  • Bug reports welcome via Issues
  • Feature requests encouraged
  • Documentation improvements appreciated

MatrixTransformer: Intelligence, engineered — not hallucinated.

Paper Links:
Hyperdimensional Connection Method
MatrixTransformer Framework
