
NiaAML

A Python AutoML framework that automatically composes and optimizes machine-learning pipelines using nature-inspired algorithms.


<p align="center"><img src=".github/images/niaaml_logo.png" alt="NiaAML" title="NiaAML"/></p> <h1 align="center"> 🌳 NiaAML </h1> <p align="center"> <img alt="PyPI Version" src="https://img.shields.io/pypi/v/niaaml.svg" href="https://pypi.python.org/pypi/niaaml"> <img alt="PyPI - Python Version" src="https://img.shields.io/pypi/pyversions/niaaml.svg"> <img alt="PyPI - Downloads" src="https://img.shields.io/pypi/dm/niaaml.svg" href="https://pepy.tech/project/niaaml"> <a href="https://repology.org/project/python:niaaml/versions"> <img src="https://repology.org/badge/tiny-repos/python:niaaml.svg" alt="Packaging status"> </a> <img alt="Downloads" src="https://pepy.tech/badge/niaaml"> <img alt="GitHub license" src="https://img.shields.io/github/license/lukapecnik/niaaml.svg" href="https://github.com/lukapecnik/niaaml/blob/master/LICENSE"> <img alt="build" src="https://github.com/lukapecnik/niaaml/actions/workflows/test.yml/badge.svg"> <img alt="Coverage Status" src="https://coveralls.io/repos/github/lukapecnik/NiaAML/badge.svg?branch=master" href="https://coveralls.io/github/lukapecnik/NiaAML?branch=master"> <img alt="Documentation Status" src="https://readthedocs.org/projects/niaaml/badge/?version=latest" href="https://niaaml.readthedocs.io/en/latest/?badge=latest"> </p> <p align="center"> <img alt="GitHub commit activity" src="https://img.shields.io/github/commit-activity/w/lukapecnik/niaaml.svg"> <img alt="Average time to resolve an issue" src="http://isitmaintained.com/badge/resolution/lukapecnik/niaaml.svg" href="http://isitmaintained.com/project/lukapecnik/niaaml"> <img alt="Percentage of issues still open" src="http://isitmaintained.com/badge/open/lukapecnik/niaaml.svg" href="http://isitmaintained.com/project/lukapecnik/niaaml"> <img alt="GitHub contributors" src="https://img.shields.io/github/contributors/lukapecnik/niaaml.svg"> </p> <p align="center"> <img alt="DOI" src="https://zenodo.org/badge/289322337.svg" href="https://zenodo.org/badge/latestdoi/289322337"> 
<img alt="DOI" src="https://joss.theoj.org/papers/10.21105/joss.02949/status.svg" href="https://doi.org/10.21105/joss.02949"> </p> <p align="center"> <a href="https://github.com/firefly-cpp/NiaAML?tab=readme-ov-file#-installation">📦 Installation</a> • <a href="https://github.com/firefly-cpp/NiaAML?tab=readme-ov-file#-graphical-user-interface">💻 Graphical User Interface</a> • <a href="https://github.com/firefly-cpp/NiaAML?tab=readme-ov-file#-command-line-interface">🧑‍💻 Command Line Interface</a> • <a href="https://github.com/firefly-cpp/NiaAML?tab=readme-ov-file#-api">📮 API</a> • <a href="https://github.com/firefly-cpp/NiaAML?tab=readme-ov-file#-implemented-components">✨ Implemented Components</a> • <a href="https://github.com/firefly-cpp/NiaAML?tab=readme-ov-file#-optimization-process-and-parameter-tuning">💪 Optimization Process And Parameter Tuning</a> • <a href="https://github.com/firefly-cpp/NiaAML?tab=readme-ov-file#-examples">📓 Examples</a> • <a href="https://github.com/firefly-cpp/NiaAML?tab=readme-ov-file#-contributors">🫂 Contributors</a> • <a href="https://github.com/firefly-cpp/NiaAML?tab=readme-ov-file#-support">🙏 Support</a> • <a href="https://github.com/firefly-cpp/NiaAML?tab=readme-ov-file#-license">🔑 License</a> • <a href="https://github.com/firefly-cpp/NiaAML?tab=readme-ov-file#-cite-us">📄 Cite Us</a> </p>

NiaAML is a framework for Automated Machine Learning based on nature-inspired optimization algorithms. The framework is written fully in Python. The name NiaAML comes from the Automated Machine Learning method of the same name [1]. Its goal is to efficiently compose the best possible classification pipeline for the given task using the components provided as input. The components are divided into three groups: feature selection algorithms, feature transformation algorithms, and classifiers. The framework uses nature-inspired optimization algorithms to choose the best set of components for the classification pipeline and to optimize their hyperparameters. For the optimization process we use the <a href="https://github.com/NiaOrg/NiaPy">NiaPy framework</a>, a popular Python collection of nature-inspired algorithms. The NiaAML framework is easy to use and to customize or extend to suit your needs.

🆕📈 NiaAML now also supports regression tasks. The package still refers to regressors as "classifiers" to avoid introducing a breaking change to the API.

The NiaAML framework allows you not only to run full pipeline optimization, but also to use the implemented components, such as classifiers and feature selection algorithms, separately. It supports numerical and categorical features as well as missing values in datasets.

  • Free software: MIT license
  • Documentation: https://niaaml.readthedocs.io/en/latest/
  • Python versions: 3.9 | 3.10 | 3.11
  • Dependencies: click
  • Tested OS: Windows, Ubuntu, Fedora, Linux Mint and CentOS (it may also work on other systems)

NiaAML Architecture


📦 Installation

pip3

Install NiaAML with pip3:

pip3 install niaaml

In case you would like to try out the latest pre-release version of the framework, install it using:

pip3 install niaaml --pre

Fedora Linux

To install NiaAML on Fedora, use:

$ dnf install python-niaaml

Alpine Linux

To install NiaAML on Alpine Linux, enable the Community repository and use:

$ apk add py3-niaaml

Arch Linux

To install NiaAML on Arch Linux, use:

$ yay -Syyu python-niaaml

Nix

To install NiaAML with the Nix package manager, use:

$ nix-env -i python311Packages.niaaml

To enter a shell with the package already installed, use:

$ nix-shell -p python311Packages.niaaml

💻 Graphical User Interface

There is a simple Graphical User Interface for the NiaAML package available here.

🧑‍💻 Command Line Interface

We also provide a CLI for quick pipeline optimizations and inference from the terminal without the need to write custom scripts.

When you install the package as instructed above, you will already have access to the niaaml command with the sub-commands optimize and infer.

For usage information, add the --help flag:

niaaml --help

niaaml infer --help

An example invocation of optimize:

niaaml optimize example

📮 API

There is a simple API for remote work with NiaAML package available here.

✨ Implemented Components

Click here for a list of currently implemented components, divided into groups: classifiers, feature selection algorithms, and feature transformation algorithms. At the end you can also see a list of currently implemented fitness functions for the optimization process, categorical feature encoders, and missing-value imputers. All of the components are passed into the optimization process using their class names. Let's say we want to choose between the Adaptive Boosting, Bagging, and Multi Layer Perceptron classifiers, the Select K Best and Select Percentile feature selection algorithms, and the Normalizer as the feature transformation algorithm (a listed component may or may not end up being selected during the optimization process).

PipelineOptimizer(
    data=...,
    classifiers=['AdaBoost', 'Bagging', 'MultiLayerPerceptron'],
    feature_selection_algorithms=['SelectKBest', 'SelectPercentile'],
    feature_transform_algorithms=['Normalizer']
)

The PipelineOptimizer argument categorical_features_encoder is None by default. If your dataset contains any categorical features, you need to specify an encoder to use. The same goes for the imputer argument and features that contain missing values.

PipelineOptimizer(
    data=...,
    classifiers=['AdaBoost', 'Bagging', 'MultiLayerPerceptron'],
    feature_selection_algorithms=['SelectKBest', 'SelectPercentile'],
    feature_transform_algorithms=['Normalizer'],
    categorical_features_encoder='OneHotEncoder',
    imputer='SimpleImputer'
)

For a full example see the 📓 Examples section.

💪 Optimization Process And Parameter Tuning

The NiaAML optimization process consists of two types of optimization. The goal of the first type is to find an optimal set of components (feature selection algorithm, feature transformation algorithm, and classifier). The goal of the second type is then to find optimal parameters for the selected set of components. Each component has an attribute _params, which is a dictionary of parameters and their possible values.

self._params = dict(
    n_estimators = ParameterDefinition(MinMax(min=10, max=111), np.uint)
)

An individual in the first type of optimization is represented as a real-valued vector whose size equals the sum of the number of keys in all three dictionaries (the classifier's _params, the feature transformation algorithm's _params, and the feature selection algorithm's _params), and the value of each dimension lies in the range [0.0, 1.0]. The second type of optimization maps the real values from the individual's vector to the parameter definitions in those dictionaries. Each parameter's value can be defined as a range or as an array of values. In the first case, a value from the vector is mapped from one interval to another; in the second case, a value from the vector falls into one of the bins that represent an index into the array of possible parameter values.
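The two mapping rules can be sketched in plain Python. This is a simplified illustration of the idea, not NiaAML's actual implementation; the function names are hypothetical.

```python
def map_to_range(value, min_val, max_val):
    """Map a value from [0.0, 1.0] onto the parameter interval [min_val, max_val]."""
    return min_val + value * (max_val - min_val)

def map_to_bin(value, options):
    """Map a value from [0.0, 1.0] to one element of an array of possible values.

    The unit interval is split into len(options) equal bins; the value
    falls into exactly one of them, which selects an index into the array.
    """
    index = min(int(value * len(options)), len(options) - 1)  # guard against value == 1.0
    return options[index]

# A range-defined parameter, e.g. n_estimators in MinMax(min=10, max=111):
print(map_to_range(0.5, 10, 111))   # 60.5, cast to an integer type by the framework

# An array-defined parameter, e.g. a hypothetical kernel choice:
print(map_to_bin(0.7, ['linear', 'poly', 'rbf']))  # 'rbf'
```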

