
🎯 faff

Drop the faff, dodge the judgment, get back to coding.

Stop staring at that staged diff like it owes you money. We all know the drill: you've made brilliant changes, git knows exactly what happened, but translating that into a proper Conventional Commits 1.0.0 message feels like explaining your code to your pets 🐾 faff uses local LLMs via Ollama to automatically generate commit messages from your diffs – because your changes already tell the story, they just need a translator that speaks developer 🧑‍💻

faff is a productivity tool for the mundane stuff, not a replacement for thoughtful communication.

✨ Why faff?

We've all been there: you spend longer crafting the commit message than writing the actual code. "Was that a feat: or fix:?" you wonder, as your staged diff sits there perfectly describing everything while you faff about trying to translate it into prose.

You either end up with "Updated stuff" (again!) or some overwrought novel nobody will read. Meanwhile, cloud-based tools want to slurp up your "TODO: delete this abomination" comments and questionable variable names, all while extracting money from your wallet 💸

faff exists because your diffs already know what happened – they just need a local AI translator that follows conventional commits rules without the existential crisis. Drop the faff, dodge the judgment, get back to coding.

So yes, faff is another bloody AI commit generator. The Internet's already drowning in them, so here's another one to add to the deluge of "my first AI projects" 💧 faff started as me having a poke around the Ollama API while thinking "surely we can do this locally without sending the content of our wallets to the vibe-coding dealers?" It's basically a learning project that accidentally became useful – like most of the best tools, really.

  • 🤖 AI-Powered: Uses local Ollama LLMs for "intelligent" commit message generation
  • 📝 Standards-Compliant: Follows the Conventional Commits specification, most of the time, if you're lucky
  • 🕵️ Privacy-First: Runs entirely locally - your code never leaves your machine, until you push it to GitHub
  • 🤝 Simple Setup: Auto-downloads models and handles all dependencies, except it doesn't - that was a marketing lie
  • 🎨 Beautiful UX: Elegant progress indicators and interactive prompts, for a shell script

🚀 Quick Start

Prerequisites

You'll need git and a working Ollama installation before using faff.

Install

Download faff, make it executable and put it somewhere in your $PATH.

curl -o faff.sh https://raw.githubusercontent.com/wimpysworld/faff/refs/heads/main/faff.sh
chmod +x faff.sh
sudo mv faff.sh /usr/local/bin/faff

Basic Usage

The standard workflow is to stage some changes and let faff generate your commit message.

git add .
faff

That's it! faff will analyze your changes and generate a commit message.

<div align="center"><img alt="faff demo" src="assets/faff.gif" width="1024" /></div>

🧠 AI Models

I've mostly tested faff with the qwen2.5-coder family of models, as they've worked best for me. Choose one based on your available VRAM or unified memory:

| Model              | VRAM  | Speed | Quality |
|--------------------|-------|-------|---------|
| qwen2.5-coder:1.5b | ~1GB  | ⚡⚡⚡⚡  | ⭐⭐      |
| qwen2.5-coder:3b   | ~2GB  | ⚡⚡⚡   | ⭐⭐⭐     |
| qwen2.5-coder:7b   | ~5GB  | ⚡⚡⚡   | ⭐⭐⭐⭐    |
| qwen2.5-coder:14b  | ~9GB  | ⚡⚡    | ⭐⭐⭐⭐⭐   |
| qwen2.5-coder:32b  | ~20GB | ⚡     | ⭐⭐⭐⭐⭐   |

Any model supported by Ollama will work, so feel free to experiment 🧪 Share your feedback and observations in the faff discussions 🗨️ so we can all benefit.

Using a Custom Model

To use a specific model, just override the FAFF_MODEL environment variable.

FAFF_MODEL="qwen2.5-coder:3b" faff

Environment Variables

Customize faff's behavior through environment variables:

# Model selection (default: qwen2.5-coder:7b)
export FAFF_MODEL="qwen2.5-coder:14b"

# Ollama connection (defaults to http://localhost:11434)
export OLLAMA_HOST="your-ollama-server.com"
export OLLAMA_PORT="11434"
export OLLAMA_PROTOCOL="http"

# Optional API key for Ollama, if the API is protected
export OLLAMA_TOKEN="sk-ollama-kasdjfhlwekjfhlashjehasjfgsdejsj"

# API timeout in seconds (default: 180)
export FAFF_TIMEOUT=300

๐Ÿ™ Git Integration

Add helpful aliases to your ~/.gitconfig:

[alias]
    faff = "!faff"               # Generate commit with faff
    vibe = "!git add . && faff"  # Stage all and commit with faff

🔧 Commitlint Integration

Got a commitlint config in your project? Lovely. faff will automatically detect it and constrain the AI to only use your allowed scopes - no configuration required, no extra dependencies, just works.

If faff finds a .commitlintrc.json or commitlint.config.json in your repository, it extracts the scopes from rules.scope-enum and tells the LLM to stick to them. Your commits get the proper type(scope): description format without the AI going off-piste with invented scopes.

No commitlint config? No worries - faff carries on exactly as before.

Example Config

Here's a commitlint config that faff will pick up:

{
  "extends": ["@commitlint/config-conventional"],
  "rules": {
    "scope-enum": [2, "always", ["api", "cli", "docs", "tests"]]
  }
}

With this config, faff will only generate commits using api, cli, docs, or tests as scopes. Keeps everything tidy without you having to remember what scopes exist.
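As an illustration of the mechanism (a sketch only, not faff's actual code, and it assumes `jq` is installed), extracting the scope list from that config comes down to a single `jq` filter:

```shell
# Write the example config above to a temporary file.
cat > /tmp/commitlintrc.json <<'EOF'
{
  "extends": ["@commitlint/config-conventional"],
  "rules": {
    "scope-enum": [2, "always", ["api", "cli", "docs", "tests"]]
  }
}
EOF

# rules.scope-enum is [severity, condition, [scopes]]; index 2 holds the scopes.
jq -r '.rules["scope-enum"][2][]' /tmp/commitlintrc.json
# → api
#   cli
#   docs
#   tests
```

The resulting list is what gets handed to the LLM as the set of permitted scopes.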

🛟 Troubleshooting

Common Issues

โŒ "Ollama service is not running"

Start Ollama.

ollama serve
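If you're not sure whether the service is already up, the Ollama API answers on `/api/tags` when it's running (the host and port shown are the defaults; adjust them if you've changed `OLLAMA_HOST`/`OLLAMA_PORT`):

```shell
# curl's exit status tells you whether the Ollama API is reachable.
if curl -sf "http://localhost:11434/api/tags" >/dev/null; then
  echo "Ollama is running"
else
  echo "Ollama is not reachable - try: ollama serve"
fi
```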

โŒ "No changes to commit"

Stage some changes first.

git add .
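Under the hood this condition is just an empty staged diff. You can check it yourself with standard git plumbing (this is a sketch, not necessarily faff's exact check):

```shell
# `git diff --cached --quiet` exits 0 when nothing is staged, non-zero otherwise.
if git diff --cached --quiet; then
  echo "Nothing staged - run 'git add' first"
else
  echo "Staged changes found - faff has something to work with"
fi
```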

๐Ÿค Contributing

We welcome contributions! Whether you're fixing bugs, adding features, or improving documentation, your help makes faff better for everyone.
