20 repositories found
AndreiChertkov / Teneva: A framework based on the tensor train decomposition for working with multivariate functions and multidimensional arrays.
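Several entries in this list build on the tensor train (TT) format, which factors a d-way array into a chain of 3-way cores. A minimal TT-SVD sketch in NumPy (my own illustration of the standard algorithm, not teneva's API; function names are made up):

```python
import numpy as np

def tt_svd(T, tol=1e-12):
    """Decompose a full tensor into tensor-train (TT) cores via sequential SVDs."""
    shape = T.shape
    cores, r = [], 1
    M = T.reshape(shape[0], -1)
    for k in range(len(shape) - 1):
        M = M.reshape(r * shape[k], -1)
        U, s, Vt = np.linalg.svd(M, full_matrices=False)
        rank = max(1, int(np.sum(s > tol * s[0])))  # drop negligible singular values
        cores.append(U[:, :rank].reshape(r, shape[k], rank))
        M = s[:rank, None] * Vt[:rank]              # carry the remainder forward
        r = rank
    cores.append(M.reshape(r, shape[-1], 1))
    return cores

def tt_to_full(cores):
    """Contract TT cores back into a full tensor (for checking the result)."""
    G = cores[0]
    for core in cores[1:]:
        G = np.tensordot(G, core, axes=([G.ndim - 1], [0]))
    return G.reshape([c.shape[1] for c in cores])
```

With no truncation the reconstruction is exact; choosing a larger `tol` trades accuracy for smaller TT ranks.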
yunjhongwu / TensorDecompositions.jl: A Julia implementation of tensor decomposition algorithms.
AndreiChertkov / Ttopt: Gradient-free optimization method for multivariable functions based on the low-rank tensor train (TT) format and the maximal-volume principle.
FurongHuang / ConvDicLearnTensorFactor: Tensor methods have emerged as a powerful paradigm for consistent learning of many latent variable models, such as topic models, independent component analysis, and dictionary learning. Model parameters are estimated via CP decomposition of the observed higher-order input moments. However, in many domains additional invariances exist, such as shift invariance, enforced via models such as convolutional dictionary learning. In this paper, we develop novel tensor decomposition algorithms for parameter estimation of convolutional models. Our algorithm is based on the popular alternating least squares (ALS) method, but with efficient projections onto the space of stacked circulant matrices. Our method is embarrassingly parallel and consists of simple operations such as fast Fourier transforms and matrix multiplications. Our algorithm converges to the dictionary much faster and more accurately than alternating minimization over filters and activation maps.
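The key primitive in the abstract above, projecting onto circulant matrices, has a simple closed form: average the matrix along its wrapped diagonals. A hedged NumPy sketch of the single-block case (my own illustration, not the paper's code):

```python
import numpy as np

def nearest_circulant(M):
    """Frobenius-norm projection of a square matrix onto the circulant matrices.

    A circulant matrix satisfies C[i, j] = c[(i - j) % n], so the nearest
    circulant matrix averages M along each wrapped diagonal.
    """
    n = M.shape[0]
    diag = (np.arange(n)[:, None] - np.arange(n)[None, :]) % n
    c = np.array([M[diag == k].mean() for k in range(n)])
    return c[diag]
```

Because a circulant matrix is diagonalized by the DFT, applying it to a vector reduces to FFTs: for real inputs, `np.fft.ifft(np.fft.fft(c) * np.fft.fft(x)).real` equals `C @ x`. This is what keeps the ALS iterations cheap.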
HuyanHuang / Robust Low Rank Tensor Ring Completion: This project implements robust tensor completion algorithms based on the tensor ring decomposition.
thanhtbt / ATT Miss: [SP 2024] "A Novel Recursive Least-Squares Adaptive Method for Streaming Tensor-Train Decomposition With Incomplete Observations." In Elsevier Signal Processing, 2024.
HuyanHuang / Tensor Completion Via Tensor Ring Decomposition: This project implements tensor completion algorithms based on the tensor ring decomposition.
thanhtbt / ROLCP: [IEEE ICASSP 2021] "A Fast Randomized Adaptive CP Decomposition for Streaming Tensors." In the 46th IEEE International Conference on Acoustics, Speech, and Signal Processing, 2021.
RikVoorhaar / Tt Sketch: Sketching algorithms for tensor train decompositions.
thanhtbt / Tensor Tracking: [Patterns 2023] "Tracking Online Low-Rank Approximations of Higher-Order Incomplete Streaming Tensors." In Patterns (Cell Press), 2023.
krzysiekfonal / TTD SR: Tensor Train Decomposition with a Simple Randomness algorithm for very efficient and effective feature extraction.
Soumya1612-Rasha / TravelTimePrediction TensorFactorization UberData: Preprocesses Uber GPS data from black-car pickups in San Francisco and applies a tensor decomposition algorithm (Tucker decomposition) to impute missing values. Any given query path is then passed to this algorithm, which estimates its travel time.
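Tucker decomposition, which the entry above uses for imputation, factors a tensor into a small core multiplied by a factor matrix along each mode. A minimal sketch via the truncated higher-order SVD (HOSVD), which is my own illustration and not necessarily the repo's pipeline:

```python
import numpy as np

def unfold(T, mode):
    """Mode-n matricization: move `mode` to the front and flatten the rest."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def mode_product(T, U, mode):
    """Multiply tensor T along `mode` by the matrix U (contracting U's columns)."""
    return np.moveaxis(np.tensordot(T, U, axes=([mode], [1])), -1, mode)

def hosvd(T, ranks):
    """Truncated HOSVD: returns (core, factors) of a Tucker model."""
    factors = [np.linalg.svd(unfold(T, m), full_matrices=False)[0][:, :r]
               for m, r in enumerate(ranks)]
    core = T
    for m, U in enumerate(factors):
        core = mode_product(core, U.T, m)  # project onto each mode subspace
    return core, factors
```

For a tensor whose multilinear rank matches `ranks`, reconstructing with `core` and `factors` recovers it exactly; in the completion setting one would instead alternate such factorization steps with refilling the missing entries.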
rmsolgi / TensorLearn: A Python package for advanced tensor learning.
daskol / Protes: PRobability Optimizer with TEnsor Sampling (PROTES) is an optimization algorithm based on the tensor train decomposition.
OsmanMalik / TNS: Code for our preprint paper "Sampling-Based Decomposition Algorithms for Arbitrary Tensor Networks".
irairavivi / Cp Als Qr Svd: Uses QR and SVD to solve for the CP decomposition of a tensor in the CP-ALS algorithm, for improved numerical stability.
esinkarahan / Tensor Cmtf Eeg Fmri: Coupled matrix-tensor factorization for integrating EEG and fMRI on the brain cortical surface with source reconstruction.
LinjianMa / Pairwise Perturbation: Pairwise perturbation, an efficient numerical algorithm for alternating least squares in tensor decompositions.
karimhalaseh / Tensor Network Decompositions: Implementation of the algorithms in "Orthogonal Decomposition of Tensor Trains" (Halaseh, Muller, Robeva).
umd-huang-lab / Private Topic Model Tensor Methods: We provide an end-to-end differentially private spectral algorithm for learning LDA, based on matrix/tensor decompositions, and establish theoretical guarantees on the utility/consistency of the estimated model parameters. The spectral algorithm consists of multiple algorithmic steps, named "edges", to which noise can be injected to obtain differential privacy. We identify subsets of edges, named "configurations", such that adding noise to all edges in such a subset guarantees differential privacy of the end-to-end spectral algorithm. We characterize the sensitivity of the edges with respect to the input and thus estimate the amount of noise to be added to each edge for any required privacy level. We then characterize the utility loss for each configuration as a function of the injected noise. By combining the sensitivity and utility characterizations, we obtain an end-to-end differentially private spectral algorithm for LDA and identify the configuration that outperforms the others in any specific regime. We are the first to achieve utility guarantees under the required level of differential privacy for learning in LDA. Overall, our method systematically outperforms differentially private variational inference.
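The per-edge noise injection the abstract describes is, at each step, an instance of a standard DP primitive. A hedged sketch of the classical Gaussian mechanism (generic textbook calibration, not the paper's per-edge sensitivity analysis):

```python
import numpy as np

def gaussian_mechanism(stat, l2_sensitivity, eps, delta, rng):
    """Release `stat` with (eps, delta)-differential privacy by adding
    Gaussian noise scaled to its L2 sensitivity (classical calibration,
    valid for eps <= 1)."""
    sigma = l2_sensitivity * np.sqrt(2.0 * np.log(1.25 / delta)) / eps
    return stat + rng.normal(0.0, sigma, size=np.shape(stat))
```

In the paper's setting, `stat` would be one of the intermediate moment matrices or tensors flowing along an edge, and the characterized per-edge sensitivities would determine `l2_sensitivity`.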