92 skills found · Page 3 of 4
bernard24 / ConvexTensor: Matlab code for the paper "A New Convex Relaxation for Tensor Completion"
xinychen / Tensor Completion: Low-rank tensor completion algorithm - HaLRTC.
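Several of the repositories above and below implement nuclear-norm-based low-rank tensor completion in the style of HaLRTC/SiLRTC. As a rough illustration of the core idea, the sketch below applies singular value thresholding to each mode unfolding, averages the results, and re-imposes the observed entries. This is a minimal NumPy sketch under simplifying assumptions (a fixed threshold `tau`, no ADMM dual variables); the function names are illustrative and do not reflect any repository's actual API.

```python
import numpy as np

def unfold(T, mode):
    """Mode-m matricization: move the given mode to the front, flatten the rest."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def fold(M, mode, shape):
    """Inverse of unfold for a tensor of the given shape."""
    rest = [s for i, s in enumerate(shape) if i != mode]
    return np.moveaxis(M.reshape([shape[mode]] + rest), 0, mode)

def svt(M, tau):
    """Singular value thresholding: proximal operator of the nuclear norm."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt

def complete(T, mask, tau=1.0, n_iter=200):
    """Fill missing entries (mask == False) by averaging the singular-value-
    thresholded mode unfoldings, then re-imposing the observed entries."""
    X = np.where(mask, T, 0.0)
    for _ in range(n_iter):
        X = np.mean([fold(svt(unfold(X, m), tau), m, X.shape)
                     for m in range(X.ndim)], axis=0)
        X = np.where(mask, T, X)  # keep observed entries fixed
    return X
```

With a synthetic rank-2 tensor and most entries observed, this simple iteration typically recovers the missing entries to within a modest relative error; the full HaLRTC algorithm adds per-mode auxiliary variables and dual updates for faster, more accurate convergence.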
dongwookim-ml / Almc: Active learning algorithms for matrix and tensor completion
nrpowell / Grakn Kbc: A neural tensor network for KB completion, built on Grakn
claws-lab / DAIN: Code for the ACM CIKM 2021 paper "Influence-guided Data Augmentation for Neural Tensor Completion"
mtntruong / UTeRM: Python code for "Deep Unfolding Tensor Rank Minimization with Generalized Detail Injection for Pansharpening", IEEE TGRS 2024
polwork / LRTC DM: Mixed Norm Regularized Models for Low-Rank Tensor Completion
longzhen520 / STTC Code: Code for the paper "Image Completion Using Low Tensor Tree Rank and Total Variation Minimization"
xumaomao94 / PTTC: Code for "Tensor Train Factorization under Noisy and Incomplete Data with Automatic Rank Estimation" and "Overfitting Avoidance in Tensor Train Factorization and Completion: Prior Analysis and Inference"
xjzhang008 / LRTCPoisson: Code for the paper "X. Zhang and M. K. Ng. Low-Rank Tensor Completion with Poisson Observations, IEEE TPAMI, 2021."
dair-iitd / OxKBC: State-of-the-art models for Knowledge Base Completion (KBC) for large KBs (such as FB15k and YAGO) are based on tensor factorization (TF), e.g., DistMult, ComplEx. While they produce good results, they cannot expose any rationale behind their predictions, potentially reducing the trust of a user in the outcome of the model. Previous works have explored creating an inherently explainable model, e.g., Neural Theorem Proving (NTP), DeepPath, MINERVA, but explainability in them comes at the cost of performance. Others have tried to create an auxiliary explainable model having high fidelity with the underlying TF model, but unfortunately, they do not scale well to large KBs. In this work, we propose OxKBC, an Outcome eXplanation engine for KBC, which provides a post-hoc explanation for every triple inferred by an (uninterpretable) factorization-based model. It first augments the underlying Knowledge Graph by introducing weighted edges between entities based on their similarity given by the underlying model. It then defines a notion of human-understandable explanation paths, along with a language to generate them. Depending on the edges, the paths are aggregated into second-order templates for further selection. The best template with its grounding is then selected by a neural selection module that is trained with minimal supervision by a novel loss function. Experiments over Mechanical Turk demonstrate that users overwhelmingly find our explanations more trustworthy compared to rule mining.
yqx7150 / FoE STDC: Field-of-Experts Filters Guided Tensor Completion
zhaoxile / Tensor Completion Using Total Variation And Low Rank Matrix Factorization: Code for "Tensor Completion Using Total Variation and Low-Rank Matrix Factorization"
Qinwenjinswu / TIP Code: Low-Rank High-Order Tensor Completion With Applications in Visual Data
donalee / Ftcom: Fast Tucker Factorization for Large-Scale Tensor Completion
zhaoxile / Tensor Factorization With Total Variation And Tikhonov Regularization For Low Rank Tensor Completion: Code for "Tensor Factorization with Total Variation and Tikhonov Regularization for Low-Rank Tensor Completion in Imaging Data"
TiantianUpup / Tensor Completion: Implementations of tensor completion algorithms
benzhengli / NTTNN Code: Matlab code for "Nonlinear Transform Induced Tensor Nuclear Norm for Tensor Completion"
ynqiu / BalancedTNN: Source code for "Balanced Unfolding Induced Tensor Nuclear Norms for High-Order Tensor Completion"
Li-X-P / Code Of Robust Tensor Completion: No description available