ExplainableFL
ExplainableFL is a Python package designed to bring explainability to Federated Learning models using SHAP values. It provides easy-to-use methods to visualize the impact of model features and privacy mechanisms on model performance.
Installation
To install ExplainableFL, run the following command:
pip install -i https://test.pypi.org/simple/ explainablefl
Ensure you have the necessary prerequisites installed, including Python 3.6+ and pip.
Usage
Here's a quick example to get you started:
import torch
from torch.utils.data import DataLoader, TensorDataset
from explainablefl import FederatedXAI
# Example model
model = torch.nn.Linear(10, 2)
# Setup DataLoader
x = torch.randn(100, 10)
y = torch.randint(0, 2, (100,))
dataset = TensorDataset(x, y)
data_loader = DataLoader(dataset, batch_size=10)
# Initialize FederatedXAI
federated_xai = FederatedXAI(device=torch.device('cpu'), global_model=model, data_loader=data_loader)
# Use the library to explain the client model
shap_plot_buf, _ = federated_xai.explain_client_model(model)
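The call above returns the plot as an in-memory buffer rather than displaying it. Assuming the buffer is an `io.BytesIO` (a common pattern for returning rendered figures; not confirmed by the package docs), you could persist it to disk like this:

```python
import io

def save_plot(buf: io.BytesIO, path: str) -> int:
    """Write an in-memory plot buffer to disk; returns bytes written."""
    data = buf.getvalue()
    with open(path, "wb") as f:
        return f.write(data)

# Stand-in buffer for illustration; in practice, pass the shap_plot_buf
# returned by explain_client_model.
demo = io.BytesIO(b"\x89PNG fake bytes")
n = save_plot(demo, "client_shap.png")
```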
Features
- Client Model Explanation: Visualize how individual features influence model predictions.
- Global Model Explanation: Generate SHAP explanations and confusion matrices.
- Aggregation Impact: Assess the effect of model aggregation in federated settings.
- Privacy Impact Analysis: Understand the influence of differential privacy mechanisms.
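To make "feature influence" concrete: for a linear model with independent features, a feature's exact SHAP value reduces to its weight times the deviation of the input from the feature's background mean. A pure-Python sketch with toy numbers (an illustration of the SHAP concept, not the package's implementation):

```python
def linear_shap(weights, x, means):
    """Exact SHAP values for a linear model with independent features:
    phi_i = w_i * (x_i - E[x_i])."""
    return [w * (xi - mu) for w, xi, mu in zip(weights, x, means)]

weights = [0.5, -1.0, 2.0]   # toy model coefficients
x = [1.0, 2.0, 0.0]          # instance to explain
means = [0.0, 1.0, 1.0]      # background feature means

phi = linear_shap(weights, x, means)  # [0.5, -1.0, -2.0]
```

A defining property holds here: the SHAP values sum to the prediction minus the baseline prediction on the background means (-1.5 - 1.0 = -2.5).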
Contributing
Contributions are welcome! Please feel free to submit pull requests or open issues to improve the documentation, code quality, or add new features.
Next updates
- Support for additional frameworks beyond PyTorch.
- More interactive visualizations from the explanation functions.
- Improved error handling.
- CI/CD pipelines for automated testing.
License
This project is licensed under the MIT License - see the LICENSE file for details.