Distributed Privacy-Preserving Empirical Risk Minimization
This work combines differential privacy and multi-party computation to achieve distributed machine learning. It is based on the paper "Distributed Learning without Distress: Privacy-Preserving Empirical Risk Minimization" (http://papers.nips.cc/paper/7871-distributed-learning-without-distress-privacy-preserving-empirical-risk-minimization), accepted at NIPS 2018.
The code contains privacy-preserving implementations of L2-regularized logistic regression and linear regression models.
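The high-level idea can be illustrated, outside of MPC, with output perturbation: each party trains an L2-regularized model locally, the models are averaged, and calibrated noise is added to the aggregate. This is only a minimal sketch using NumPy and scikit-learn (both listed in the requirements); the noise scale sigma below is an illustrative placeholder, not the paper's calibration, and the actual repo performs the aggregation and noise generation inside a secure computation rather than in the clear.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def local_model(X, y, lam=0.1):
    """Train an L2-regularized logistic regression locally (one party)."""
    clf = LogisticRegression(C=1.0 / lam, solver="lbfgs")
    clf.fit(X, y)
    return clf.coef_.ravel()

def private_aggregate(weights, sigma=0.5, seed=0):
    """Average the parties' models and add Gaussian noise (output
    perturbation). sigma is a placeholder; the paper calibrates the
    noise to epsilon, delta, the regularizer, and the dataset sizes."""
    rng = np.random.default_rng(seed)
    avg = np.mean(weights, axis=0)
    return avg + rng.normal(0.0, sigma, size=avg.shape)

# Toy data split across two parties
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))
y = (X @ np.array([1.0, -2.0, 0.5]) > 0).astype(int)
w1 = local_model(X[:100], y[:100])
w2 = local_model(X[100:], y[100:])
w_private = private_aggregate([w1, w2])
```

In the repo itself, the averaging-plus-noise step is what the Obliv-C executables carry out under MPC, so no single party ever sees the unperturbed aggregate.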
Requirements
- Python 2.7 or above
- Numpy
- Scikit Learn
- Obliv-C
- Absentminded Crypto Toolkit
- Cycle Utility
Code Execution
Run make in the model_aggregate_gaussian and model_aggregate_laplace directories to build the respective a.out executables.
Then run python model_wrapper.py
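The build-and-run steps above can be sketched as follows, assuming the directory layout named in this README (a.out is the make output referred to above):

```shell
# Build the secure-aggregation executable in each mechanism's directory
(cd model_aggregate_gaussian && make)   # produces model_aggregate_gaussian/a.out
(cd model_aggregate_laplace && make)    # produces model_aggregate_laplace/a.out

# Train the models and invoke the MPC executables via the Python wrapper
python model_wrapper.py
```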