# ActionAI 🤸

Real-Time Spatio-Temporally Localized Activity Detection by Tracking Body Keypoints
ActionAI is a Python library for training machine learning models to classify human actions. It is a generalization of our yoga smart personal trainer, which is included in this repo as an example.
<p align="center">
  <img src="https://github.com/smellslikeml/ActionAI/blob/master/assets/ActionAI_main.gif">
</p>

## Getting Started
These instructions will show how to prepare your image data, train a model, and deploy the model to classify human action from image samples. See deployment for notes on how to deploy the project on a live stream.
## Installation
Add the smellslikeml PPA and install with the following:

```bash
sudo add-apt-repository ppa:smellslikeml/ppa
sudo apt update

# Install with:
sudo apt-get install actionai
```

Make sure to configure the working directory with:

```bash
actionai configure
```
## Using the CLI
Organize your training data in subdirectories like the example below. The `actionai` CLI will automatically create a dataset from subdirectories of videos, where each subdirectory name is a category label.
```
.
└── dataset/
    ├── category_1/
    │   └── *.mp4
    ├── category_2/
    │   └── *.mp4
    ├── category_3/
    │   └── *.mp4
    └── ...
```
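As a sketch, you could scaffold the layout above like this. The category names `squat` and `lunge` are hypothetical placeholders; substitute your own action labels:

```shell
#!/bin/sh
# Create an example dataset tree; "squat" and "lunge" are hypothetical labels.
mkdir -p dataset/squat dataset/lunge

# Drop your clips into the matching category folder, e.g.:
# cp ~/videos/squat_01.mp4 dataset/squat/
ls dataset
```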
Then you can train a model with:

```bash
actionai train --data=/path/to/your/data/dir --model=/path/to/your/model/dir
```

And then run inference on a video with:

```bash
actionai predict --model=/path/to/your/model/dir --video=/path/to/your/video.mp4
```
View the default `config.ini` file included in this branch for additional configuration options. You can pass your own config file using the `--cfg` flag.
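For illustration only, a custom config file might look like the fragment below. The section and key names here are assumptions, not the actual defaults; check the bundled `config.ini` for the real keys:

```ini
; Hypothetical sketch -- consult the default config.ini for the real keys.
[default]
; working directory, as set up by `actionai configure`
work_dir = /path/to/your/workspace
```

You would then pass it to a command with the `--cfg` flag, e.g. `actionai train --cfg=/path/to/your/config.ini`.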
## Contributing

Please read CONTRIBUTING.md for details on our code of conduct and the process for submitting pull requests to us.

## License

This project is licensed under the GNU General Public License v3.0 - see the LICENSE.md file for details.