🤖 cogmit
AI-powered Git commit message generator using local Ollama models
cogmit is a beautiful CLI tool that analyzes your Git changes and generates intelligent commit message suggestions using local AI models via Ollama. No data leaves your machine - everything runs locally!
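Conceptually, each suggestion comes from sending your staged diff to Ollama's /api/generate endpoint. The sketch below is hypothetical (the prompt wording and pipeline are illustrative, not cogmit's actual code); the model and host are the defaults from the Setup section:

```shell
# Hypothetical sketch of the pipeline: staged diff -> prompt -> Ollama.
# In a real repo you would capture the diff with: diff=$(git diff --staged)
diff='+console.log(1);'
# Build the request body for Ollama's /api/generate endpoint.
# NB: real code must JSON-escape the diff; this sample contains no quotes.
payload=$(printf '{"model": "llama3.2:1b", "stream": false, "prompt": "Suggest a conventional commit message for this diff: %s"}' "$diff")
echo "$payload"
# With Ollama running locally, send it:
# curl -s http://localhost:11434/api/generate -d "$payload"
```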
✨ Features
- 🧠 Local AI: Uses Ollama for completely local AI processing
- 🎨 Beautiful UI: Interactive selection with arrow-key navigation using Bubble Tea
- ⚙️ Configurable: Customize model, host, and behavior settings
- 🔒 Privacy-First: All processing happens locally, no external API calls
- ⚡ Fast: Lightweight models like llama3.2:1b for quick responses
- 📝 Conventional Commits: Generates properly formatted commit messages
🚀 Quick Start

Prerequisites

- Ollama: Install and run Ollama locally

  # Install Ollama (if not already installed)
  curl -fsSL https://ollama.ai/install.sh | sh
  # Pull a lightweight model
  ollama pull llama3.2:1b
  # Start Ollama (if not running)
  ollama serve

- Go: Go 1.19 or later
Installation
Option 1: One-liner install (Recommended)
curl -fsSL https://raw.githubusercontent.com/nicoaudy/cogmit/main/install.sh | bash
Option 2: Manual installation
# Download the binary for your platform
# Linux (amd64)
wget https://github.com/nicoaudy/cogmit/releases/latest/download/cogmit-linux-amd64
chmod +x cogmit-linux-amd64
sudo mv cogmit-linux-amd64 /usr/local/bin/cogmit
# macOS (Intel)
wget https://github.com/nicoaudy/cogmit/releases/latest/download/cogmit-darwin-amd64
chmod +x cogmit-darwin-amd64
sudo mv cogmit-darwin-amd64 /usr/local/bin/cogmit
# macOS (Apple Silicon)
wget https://github.com/nicoaudy/cogmit/releases/latest/download/cogmit-darwin-arm64
chmod +x cogmit-darwin-arm64
sudo mv cogmit-darwin-arm64 /usr/local/bin/cogmit
# Windows (amd64)
# Download cogmit-windows-amd64.exe and add to PATH
Option 3: Build from source
# Clone the repository
git clone https://github.com/nicoaudy/cogmit.git
cd cogmit
# Build the binary
go build -o cogmit .
# Install globally (optional)
sudo mv cogmit /usr/local/bin/
Setup
Configure cogmit with your preferences:
cogmit setup
This will prompt you for:
- Ollama host (default: http://localhost:11434)
- Model name (default: llama3.2:1b)
- Number of suggestions (default: 3)
- Auto-commit behavior (default: false)
📖 Usage
Basic Usage
# Generate commit messages for staged changes
cogmit
# Or explicitly use the generate command
cogmit generate
Workflow Example
# 1. Make some changes to your code
echo "console.log('Hello, World!');" > hello.js
# 2. Stage your changes
git add hello.js
# 3. Generate commit messages
cogmit
You'll see an interactive interface like this:
🔍 Analyzing changes...
📋 Found staged changes
🤖 Generating commit messages using llama3.2:1b...
✨ Generated 3 commit message suggestions

🤔 Choose a commit message:

> feat: add hello world console log
  fix: add missing console.log statement
  chore: add hello.js file

↑/↓ or k/j: navigate • enter: select • e: edit • q: quit
Interactive Controls
- ↑/↓ or k/j: Navigate between options
- Enter: Select the highlighted option
- e: Edit the highlighted option (future feature)
- q: Quit without selecting
⚙️ Configuration
Configuration is stored in ~/.config/cogmit/config.json:
{
"model": "llama3.2:1b",
"ollama_host": "http://localhost:11434",
"num_suggestions": 3,
"auto_commit": false
}
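To change a setting without re-running cogmit setup, you can edit the file directly. The sketch below writes a sample config to a temporary path so it can run anywhere; for real use the path is ~/.config/cogmit/config.json:

```shell
# Sample config written to a temp path; edit ~/.config/cogmit/config.json for real use
cfg=/tmp/cogmit-config.json
cat > "$cfg" <<'EOF'
{
  "model": "llama3.2:3b",
  "ollama_host": "http://localhost:11434",
  "num_suggestions": 5,
  "auto_commit": false
}
EOF
# Sanity-check that the file is valid JSON before cogmit reads it
python3 -m json.tool "$cfg" > /dev/null && echo "config OK"
```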
Recommended Models
For the best balance of speed and quality:
- llama3.2:1b - Fastest, good for simple changes
- llama3.2:3b - Better quality, still very fast
- llama3.1:8b - Best quality, slower but still reasonable
🛠️ Development
Project Structure
cogmit/
├── cmd/
│   ├── root.go        # Main command entrypoint
│   ├── setup.go       # Setup command
│   └── generate.go    # Generate command
├── internal/
│   ├── config/        # Configuration management
│   ├── git/           # Git operations
│   ├── ai/            # Ollama API client
│   └── ui/            # Bubble Tea UI components
└── main.go            # Application entrypoint
Building from Source
# Clone and build
git clone https://github.com/nicoaudy/cogmit.git
cd cogmit
go build -o cogmit .
# Run tests
go test ./...
# Run with debug logging
DEBUG=1 ./cogmit generate
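The release binaries named in the install section can be produced with Go's standard GOOS/GOARCH cross-compilation. This loop is a sketch: the go build line is commented out, so it only prints the target names unless you uncomment it and run from the repo root with a Go toolchain installed:

```shell
# Cross-compile targets matching the release binary names above (sketch)
for osarch in linux/amd64 darwin/amd64 darwin/arm64 windows/amd64; do
  goos=${osarch%/*} goarch=${osarch#*/}
  out="cogmit-$goos-$goarch"
  if [ "$goos" = windows ]; then out="$out.exe"; fi
  echo "target: $out"
  # GOOS=$goos GOARCH=$goarch go build -o "$out" .   # uncomment inside the repo
done
```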
🐛 Troubleshooting
Common Issues
"Ollama API returned status 404"
- Make sure Ollama is running: ollama serve
- Check if the model exists: ollama list
- Pull the model: ollama pull llama3.2:1b
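Ollama also exposes the installed models over HTTP at /api/tags, so this check can be scripted. The sketch below parses a canned sample response so it runs without a server; swap in the commented curl when Ollama is up (the grep-based matching is illustrative only — a real script should use a JSON parser):

```shell
# With Ollama running you would fetch the real list:
#   tags=$(curl -s http://localhost:11434/api/tags)
# Canned sample response so the check works offline:
tags='{"models":[{"name":"llama3.2:1b","size":1321098329}]}'
if printf '%s' "$tags" | grep -q '"name":"llama3.2:1b"'; then
  status="model installed"
else
  status="model missing: run ollama pull llama3.2:1b"
fi
echo "$status"
```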
"Not in a Git repository"
- Make sure you're in a Git repository
- Initialize one: git init
"No changes found to commit"
- Stage some changes: git add .
- Or make some changes to your files
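If you are unsure whether anything is staged, git can tell you directly. This sketch stages a file in a throwaway repository to show exactly what counts as staged changes for cogmit:

```shell
# Throwaway repo demonstrating what "staged changes" means
tmp=$(mktemp -d) && cd "$tmp"
git init -q .
echo 'console.log(1);' > hello.js
git add hello.js
git diff --staged --name-only   # lists hello.js: staged files cogmit will analyze
```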
"Failed to connect to Ollama"
- Check if Ollama is running on the configured host
- Verify the host URL in your config
Debug Mode
Enable debug logging:
DEBUG=1 cogmit generate
This will create a debug.log file with detailed information.
🤝 Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
- Fork the repository
- Create your feature branch (git checkout -b feature/amazing-feature)
- Commit your changes (git commit -m 'Add some amazing feature')
- Push to the branch (git push origin feature/amazing-feature)
- Open a Pull Request
📄 License
This project is licensed under the MIT License - see the LICENSE file for details.
🙏 Acknowledgments
- Ollama for providing local AI capabilities
- Bubble Tea for the beautiful TUI framework
- Cobra for the CLI framework
- Lip Gloss for terminal styling
🔮 Future Ideas
- [ ] Support for multiple AI providers (OpenAI, Gemini, etc.)
- [ ] --dry-run mode to preview without committing
- [ ] cogmit history to review past generated commits
- [ ] cogmit config show/edit commands
- [ ] Conventional commit mode enforcement
- [ ] Custom prompt templates
- [ ] Integration with popular Git hooks
Made with ❤️ for developers who love clean, meaningful commit messages
