19 skills found
ZIB-IOL / FrankWolfe.jl: Julia implementation of various Frank-Wolfe and Conditional Gradient variants
andi611 / Conditional SeqGAN Tensorflow: Conditional Sequence Generative Adversarial Network trained with policy gradient, implemented in TensorFlow
gcucurull / Cond Wgan Gp: PyTorch implementation of a conditional WGAN with gradient penalty
marzekan / WCGAN GP: TensorFlow 2 implementation of a Wasserstein Conditional GAN with Gradient Penalty (WCGAN-GP) for synthetic data generation
reddyprasade / Machine Learning Interview Preparation: The essential technical skills a Machine Learning Engineer needs, grouped by area; within each group are topics you should be familiar with. Study tip: copy and paste this list into a document and save it to your computer for easy referral.
- Computer Science Fundamentals and Programming: data structures (lists, stacks, queues, strings, hash maps, vectors, matrices, classes & objects, trees, graphs, etc.); algorithms (recursion, searching, sorting, optimization, dynamic programming, etc.); computability and complexity (P vs. NP, NP-complete problems, big-O notation, approximation algorithms, etc.); computer architecture (memory, cache, bandwidth, threads & processes, deadlocks, etc.)
- Probability and Statistics: basic probability (conditional probability, Bayes' rule, likelihood, independence, etc.); probabilistic models (Bayes nets, Markov decision processes, hidden Markov models, etc.); statistical measures (mean, median, mode, variance, population parameters vs. sample statistics, etc.); proximity and error metrics (cosine similarity, mean squared error, Manhattan and Euclidean distance, log-loss, etc.); distributions and random sampling (uniform, normal, binomial, Poisson, etc.); analysis methods (ANOVA, hypothesis testing, factor analysis, etc.)
- Data Modeling and Evaluation: data preprocessing (munging/wrangling, transforming, aggregating, etc.); pattern recognition (correlations, clusters, trends, outliers & anomalies, etc.); dimensionality reduction (eigenvectors, principal component analysis, etc.); prediction (classification, regression, sequence prediction, etc., with suitable error/accuracy metrics); evaluation (training/testing split, sequential vs. randomized cross-validation, etc.)
- Applying Machine Learning Algorithms and Libraries: models (parametric vs. nonparametric, decision trees, nearest neighbors, neural nets, support vector machines, ensembles of multiple models, etc.); learning procedures (linear regression, gradient descent, genetic algorithms, bagging, boosting, and other model-specific methods; regularization, hyperparameter tuning, etc.); tradeoffs and gotchas (relative advantages and disadvantages, bias and variance, overfitting and underfitting, vanishing/exploding gradients, missing data, data leakage, etc.)
- Software Engineering and System Design: software interfaces (library calls, REST APIs, data collection endpoints, database queries, etc.); user interfaces (capturing user inputs & application events, displaying results & visualizations, etc.); scalability (map-reduce, distributed processing, etc.); deployment (cloud hosting, containers & instances, microservices, etc.)
Move on to the final lesson of the course for sample practice questions on each topic.
georgehalal / CWGAN GP: A conditional Wasserstein Generative Adversarial Network with gradient penalty (cWGAN-GP) for stochastic generation of galaxy properties in wide-field surveys
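Several entries in this list (gcucurull, marzekan, georgehalal, liangkunn) center on the same gradient-penalty idea from WGAN-GP. As a minimal sketch only: the PyTorch code below assumes flat feature vectors and a hypothetical critic(inputs, labels) callable; it is not taken from any of the listed repositories.

```python
import torch

def gradient_penalty(critic, real, fake, labels, lam=10.0):
    """WGAN-GP term: penalize the critic when the norm of its gradient
    w.r.t. interpolated inputs deviates from 1 (soft 1-Lipschitz constraint).

    Assumes real and fake have shape (N, D) and that critic(x, labels)
    returns one score per sample.
    """
    # Per-sample interpolation weights, broadcast across features
    eps = torch.rand(real.size(0), 1, device=real.device).expand_as(real)
    interp = (eps * real + (1.0 - eps) * fake).requires_grad_(True)
    score = critic(interp, labels)  # a conditional critic also sees the labels
    # Gradient of the summed scores w.r.t. the interpolates
    grads, = torch.autograd.grad(score.sum(), interp, create_graph=True)
    penalty = ((grads.norm(2, dim=1) - 1.0) ** 2).mean()
    return lam * penalty
```

For image batches the interpolation weights would typically be drawn with shape (N, 1, 1, 1) and the gradient norm taken over all non-batch dimensions.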
dalisc / Color Scales Js: A utility mimicking Microsoft Excel's Color Scales conditional formatting, which returns the color of a value in a linear gradient between two color endpoints with defined min and max values
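To illustrate the kind of interpolation such a utility performs (the actual package is JavaScript; the function below and its default endpoints are hypothetical), a minimal Python sketch:

```python
def color_scale(value, vmin, vmax, low=(255, 0, 0), high=(0, 255, 0)):
    """Linearly interpolate between two RGB endpoints, Excel-style.

    The value is clamped to [vmin, vmax], mapped to t in [0, 1],
    and each RGB channel is blended accordingly.
    """
    t = (min(max(value, vmin), vmax) - vmin) / (vmax - vmin)
    return tuple(round(l + t * (h - l)) for l, h in zip(low, high))

# Example: midpoint of a 0-100 scale between red and green
print(color_scale(50, 0, 100))  # (128, 128, 0)
```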
ZIB-IOL / CINDy: Conditional gradient-based Identification of Non-linear Dynamics (CINDy), with noise-robust recovery
ZIB-IOL / BellPolytopes.jl: Bell inequalities and local models via Frank-Wolfe algorithms
HIPS / Autopaint: Gradient-based variational autoencoders to generate class-conditional natural images
zhengqigao / REG: [ICML'25] Rectified Gradient Guidance for Conditional Diffusion Models
anandharaju / DimensionalityReduction T SNE: t-Distributed Stochastic Neighbor Embedding (t-SNE) is a non-linear dimensionality-reduction technique particularly well suited to visualizing high-dimensional datasets; it is widely applied in image processing, NLP, genomics, and speech processing. In brief, the algorithm starts by computing the probability of similarity between points in the high-dimensional space and in the corresponding low-dimensional space. The similarity of points is defined as the conditional probability that a point A would choose point B as its neighbor if neighbors were picked in proportion to their probability density under a Gaussian (normal distribution) centered at A. The algorithm then tries to minimize the difference between these conditional probabilities (similarities) in the higher- and lower-dimensional spaces, so that the low-dimensional points faithfully represent the data. To do so, t-SNE minimizes the sum of Kullback-Leibler divergences over all data points using gradient descent. (Kullback-Leibler, or KL, divergence is a measure of how one probability distribution diverges from a second, expected probability distribution.) Readers interested in the detailed workings of the algorithm can refer to the original research paper. In simpler terms, t-SNE minimizes the divergence between two distributions: one measuring pairwise similarities of the input objects, and one measuring pairwise similarities of the corresponding low-dimensional points in the embedding. In this way, t-SNE maps multi-dimensional data to a lower-dimensional space and finds patterns by identifying clusters of points with similar features. After this process, however, the input features are no longer identifiable, and no inference can be made from the t-SNE output alone; hence it is mainly a data exploration and visualization technique.
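Since t-SNE is almost always used through a library, a minimal usage sketch with scikit-learn's standard TSNE API (not code from this repository) may be helpful:

```python
from sklearn.datasets import load_digits
from sklearn.manifold import TSNE

# 64-dimensional handwritten-digit images as a stand-in dataset
X, y = load_digits(return_X_y=True)

# Perplexity controls the effective neighborhood size used for the
# conditional similarities; the KL divergence between high- and
# low-dimensional similarities is minimized internally by gradient descent.
emb = TSNE(n_components=2, perplexity=30, random_state=0).fit_transform(X)
print(emb.shape)  # (1797, 2); for visualization only, the axes carry no meaning
```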
nboyd / Satspline: Saturating splines using the conditional gradient method
ZIB-IOL / FrankWolfe Book Code: Python implementation of Frank-Wolfe and Conditional Gradient algorithms
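Because the conditional gradient (Frank-Wolfe) step is the common thread of many entries here, a minimal sketch may help; the probability-simplex feasible set and quadratic objective below are illustrative assumptions, not code from any listed package:

```python
import numpy as np

def frank_wolfe_simplex(grad, x0, steps=100):
    """Minimize a smooth convex f over the probability simplex via Frank-Wolfe.

    grad: callable returning the gradient of f at x.
    Over the simplex, the linear minimization oracle (LMO) is simply the
    vertex e_i at the most negative gradient coordinate, so no projection
    is ever needed.
    """
    x = x0.copy()
    for t in range(steps):
        g = grad(x)
        i = np.argmin(g)                 # LMO: argmin over vertices of <g, v>
        v = np.zeros_like(x)
        v[i] = 1.0
        gamma = 2.0 / (t + 2.0)          # standard open-loop step size
        x = (1 - gamma) * x + gamma * v  # convex combination stays feasible
    return x

# Example: nearest simplex point to y, i.e. f(x) = 0.5 * ||x - y||^2
y = np.array([0.2, 0.9, -0.3])
x_star = frank_wolfe_simplex(lambda x: x - y, np.ones(3) / 3)
print(x_star)
```

The listed packages generalize this pattern to arbitrary feasible regions by swapping in different LMOs, step-size rules, and variants such as away steps or blending.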
pokutta / Bcg: Blended Conditional Gradient (BCG) algorithm package in Python
jobstdavid / BoostCopula: Gradient-boosted estimation of generalized linear models for conditional vine copulas
liangkunn / Generative Models In Medical Images: A repository for data augmentation in medical images, using VAEs and GANs (conditional GANs, Wasserstein GAN with/without gradient penalty)
alpyurtsever / SHCGM: MATLAB implementation of the Stochastic Homotopy Conditional Gradient Method
ZIB-IOL / CGAVI: Code for the paper: Wirth, E.S. and Pokutta, S., 2022. Conditional gradients for the approximately vanishing ideal. In International Conference on Artificial Intelligence and Statistics (pp. 2191-2209). PMLR.