dilinwang820 / Stein Variational Gradient Descent: Code for the paper "Stein Variational Gradient Descent (SVGD): A General Purpose Bayesian Inference Algorithm".
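The SVGD update this repo implements moves a set of particles along a kernelized gradient of the log posterior. A minimal NumPy sketch of one update (not the repo's own code; the RBF bandwidth `h` and step size `eps` are illustrative choices):

```python
import numpy as np

def rbf_kernel(X, h):
    """RBF kernel matrix K[j, i] = exp(-||x_j - x_i||^2 / h) and the sum over j
    of its gradient with respect to x_j, as needed by the SVGD update."""
    diff = X[:, None, :] - X[None, :, :]          # diff[j, i] = x_j - x_i
    K = np.exp(-np.sum(diff ** 2, axis=-1) / h)
    grad_K = (-2.0 / h) * diff * K[:, :, None]    # d k(x_j, x_i) / d x_j
    return K, grad_K.sum(axis=0)                  # sum over j, one row per x_i

def svgd_step(X, grad_logp, h=1.0, eps=0.1):
    """One SVGD iteration: X is (n, d) particles, grad_logp returns the score
    of the target density at each particle."""
    n = X.shape[0]
    K, sum_grad_K = rbf_kernel(X, h)
    # phi(x_i) = (1/n) sum_j [ k(x_j, x_i) grad log p(x_j) + grad_{x_j} k(x_j, x_i) ]
    phi = (K @ grad_logp(X) + sum_grad_K) / n
    return X + eps * phi
```

The first term pulls particles toward high-density regions; the second (repulsive) term keeps them spread out, so the ensemble approximates the posterior rather than collapsing to its mode.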
acerbilab / Vbmc: Variational Bayesian Monte Carlo (VBMC) algorithm for posterior and model inference in MATLAB.
acerbilab / Pyvbmc: PyVBMC, the Variational Bayesian Monte Carlo algorithm for posterior and model inference in Python.
TuringLang / AdvancedVI.jl: Implementation of variational Bayesian inference algorithms.
Sission / Coupled VAE Improved Robustness And Accuracy Of A Variational Autoencoder: We present a coupled Variational Auto-Encoder (VAE) method that improves the accuracy and robustness of probabilistic inference on represented data. The method models the dependency between input feature vectors (images) and weighs outliers with a higher penalty by generalizing the original loss function to the coupled entropy function, using the principles of nonlinear statistical coupling. We evaluate the coupled VAE on the MNIST dataset: compared with the traditional VAE, the generated output images are clearer and less blurry. Visualizing the input images embedded in the 2D latent space gives deeper insight into the structure induced by the coupled loss function: the latent variables have smaller deviation, and outputs are generated from a more compact latent space. We analyze the histograms of likelihoods for the input images using generalized mean metrics: an increased geometric mean shows that the average likelihood of the input data improves, while an increase in the -2/3 mean, which is sensitive to outliers, indicates improved robustness. The decisiveness, measured by the arithmetic mean of the likelihoods, is unchanged.
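The generalized (power) mean metrics mentioned above unify the three quantities the abstract reports: the arithmetic mean is p = 1, the geometric mean is the p → 0 limit, and the robustness metric is p = -2/3. A minimal sketch of this family of means:

```python
import numpy as np

def generalized_mean(x, p):
    """Power mean M_p(x) = (mean(x**p))**(1/p) for positive values x.
    p = 0 is handled as its limit, the geometric mean exp(mean(log x))."""
    x = np.asarray(x, dtype=float)
    if p == 0:
        return float(np.exp(np.mean(np.log(x))))
    return float(np.mean(x ** p) ** (1.0 / p))
```

By the power mean inequality, M_p is non-decreasing in p, so for any set of likelihoods the -2/3 mean is at most the geometric mean, which is at most the arithmetic mean; a rise in the -2/3 mean signals that the worst-scored (outlier) inputs improved.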
DPBayes / D3p: An implementation of the differentially private variational inference algorithm for NumPyro.
jamesvuc / BBVI: A collection of Black Box Variational Inference algorithms implemented in an object-oriented Python framework using Autograd.
rishabhmisra / Scalable Variational Bayesian Factorization Machine: A scalable variational Bayesian inference algorithm for factorization machines (FMs) that converges faster than the existing state-of-the-art MCMC-based inference algorithm. Additionally, a stochastic variational Bayesian algorithm for FMs, built on SGD, is introduced for large-scale learning.
linkerlin / MLRaptor: Efficient online variational Bayesian inference algorithms for common machine learning tasks, including mixture models (e.g. GMMs) and admixture models (e.g. LDA). Implemented in Python.
ChampiB / Homing Pigeon: Homing Pigeon is an inference framework implementing Variational Message Passing. It can be used to build an Active Inference agent that plans with a tree-search algorithm, which can be seen as a form of Bayesian model expansion.
Thavisha / Dustribution: The "Dustribution" algorithm for mapping the 3D dust density and extinction of the Milky Way using latent-variable Gaussian processes and variational inference.
aaadriano / 672 Statistical Learning: Kernel principal component analysis, spectral clustering, Gaussian processes, RKHS of vector-valued functions, RKHS embeddings of realizations of random variables, tests of independence and conditional independence, Bayesian networks, k-means, mixture models, the expectation-maximization algorithm, Markov random fields, Gibbs distributions, belief propagation algorithms, variational inference, and Markov chain Monte Carlo.
jundsp / Vblds: Variational inference of Bayesian linear dynamical systems. An EM algorithm to infer and learn the dynamics of time-series data.
SourangshuGhosh / Doubly Stochastic Deep Gaussian Process: Gaussian processes (GPs) are a good choice for function approximation, as they are flexible, robust to over-fitting, and provide well-calibrated predictive uncertainty. Deep Gaussian processes (DGPs) are multi-layer generalisations of GPs, but inference in these models has proved challenging. Existing approaches to inference in DGP models assume approximate posteriors that force independence between the layers and do not work well in practice. We present a doubly stochastic variational inference algorithm that does not force independence between layers. With this method we demonstrate that a DGP model can be used effectively on datasets ranging in size from hundreds to a billion points, and we provide strong empirical evidence that the inference scheme works well in practice for both classification and regression.
tsmatz / Gmm: Estimate a GMM (Gaussian mixture model) by applying the EM algorithm and variational inference (variational Bayes) from scratch in Python (Mar 2022).
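A from-scratch EM fit of the kind this repo implements alternates an E-step (compute responsibilities) with an M-step (re-estimate mixture parameters). A minimal one-dimensional sketch (not the repo's code; variable names are illustrative):

```python
import numpy as np

def em_step(x, pi, mu, sigma):
    """One EM iteration for a 1-D Gaussian mixture.
    x: (n,) data; pi, mu, sigma: (K,) weights, means, standard deviations."""
    # E-step: responsibility r[n, k] proportional to pi_k * N(x_n | mu_k, sigma_k)
    pdf = np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    r = pi * pdf
    r /= r.sum(axis=1, keepdims=True)
    # M-step: re-estimate weights, means, and standard deviations
    Nk = r.sum(axis=0)
    pi = Nk / len(x)
    mu = (r * x[:, None]).sum(axis=0) / Nk
    sigma = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / Nk)
    return pi, mu, sigma
```

Iterating `em_step` until the parameters stop changing gives a (local) maximum-likelihood fit; the variational Bayesian variant in the repo additionally places priors on the parameters and updates their posterior factors instead of point estimates.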
fangleai / Variational Inference On MNIST: A list of variational inference algorithms and their performance on MNIST.
vrettasm / VGPA: Variational Gaussian Process Approximation. This project contains a Python 3 implementation of the original VGPA algorithm for approximate inference in SDEs.
RottenFruits / BPMF.jl: A Julia package for Bayesian probabilistic matrix factorization (BPMF).
yjernite / LiftedChainMRF: A fast variational inference algorithm for Markov random field chain models.
CoopLo / SVI: Implementation of the algorithm outlined in http://proceedings.mlr.press/v32/johnson14.pdf, using stochastic variational inference to fit a hidden Markov model to minute-level stock data.