29 skills found
TrustAGI-Lab / ARGA
This is a TensorFlow implementation of the Adversarially Regularized Graph Autoencoder (ARGA) model as described in our paper: Pan, S., Hu, R., Long, G., Jiang, J., Yao, L., & Zhang, C. (2018). Adversarially Regularized Graph Autoencoder for Graph Embedding. https://www.ijcai.org/proceedings/2018/0362.pdf
Zhongdao / VehicleReIDKeyPointData
Annotations of key point location and vehicle orientation for the VeRi-776 dataset. ICCV'17 paper: Orientation Invariant Feature Embedding and Spatial Temporal Regularization for Vehicle Re-identification.
williamgilpin / Fnn
Embed strange attractors using a regularizer for autoencoders
wuliwei9278 / SSE-PT
Codes and datasets for the RecSys'20 paper "SSE-PT: Sequential Recommendation Via Personalized Transformer" and the NeurIPS'19 paper "Stochastic Shared Embeddings: Data-driven Regularization of Embedding Layers"
pierre-jacob / ICCV2019 Horde
Code repository for our paper entitled "Metric Learning with HORDE: High-Order Regularizer for Deep Embeddings", accepted at ICCV 2019.
jingjin25 / LFSSR ATO
Repository for "Light Field Spatial Super-resolution via Deep Combinatorial Geometry Embedding and Structural Consistency Regularization", CVPR 2020
thanhdtran / RME
Regularizing Matrix Factorization with User and Item Embeddings for Recommendation -- CIKM 2018
bobxwu / ECRTM
Code for "Effective Neural Topic Modeling with Embedding Clustering Regularization" (ICML 2023)
zhongyy / Adversarial MTER
Code for the ICCV 2019 paper "Adversarial Learning with Margin-based Triplet Embedding Regularization"
kaist-dmlab / TRAP
TRAP: Two-level Regularized Autoencoder-based Embedding for Power-law Distributed Data
l7170 / PTA HAD
A prior-based tensor approximation (PTA) is proposed for hyperspectral anomaly detection, in which the HSI is decomposed into a background tensor and an anomaly tensor. In the background tensor, a low-rank prior along the spectral dimension is incorporated via truncated nuclear-norm regularization, and a piecewise-smooth prior along the spatial dimension is embedded via linear total-variation-norm regularization. The anomaly tensor is unfolded along the spectral dimension and coupled with a spatial group-sparse prior, represented by l2,1-norm regularization.
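The decomposition in the description above can be summarized as an optimization problem. The following is a hedged reconstruction under my own notation (the HSI tensor X, background B, anomaly A, and the trade-off weights alpha and beta are assumptions, not taken from the repository):

```latex
\min_{\mathcal{B},\,\mathcal{A}}\;
\underbrace{\|\mathcal{B}\|_{\mathrm{TNN}}}_{\text{low-rank spectral prior}}
\;+\; \alpha\,\underbrace{\mathrm{TV}(\mathcal{B})}_{\text{piecewise-smooth spatial prior}}
\;+\; \beta\,\underbrace{\|\mathcal{A}_{(3)}\|_{2,1}}_{\text{group-sparse anomaly prior}}
\quad \text{s.t.}\quad \mathcal{X} = \mathcal{B} + \mathcal{A},
```

where the truncated nuclear norm (TNN) acts along the spectral dimension of B, TV is a spatial total-variation term, and A_(3) denotes the unfolding of the anomaly tensor along its spectral mode.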
wuliwei9278 / SSE
Partial codes and datasets for the NeurIPS'19 paper "Stochastic Shared Embeddings: Data-driven Regularization of Embedding Layers"
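As a rough sketch of the idea behind stochastic shared embeddings: during training, embedding indices are stochastically swapped before the lookup, which regularizes the embedding layer. The variant below (uniform replacement with probability `p`) and the function name `sse_se` are my own illustrative assumptions, not the repository's API:

```python
import numpy as np

def sse_se(indices, vocab_size, p=0.1, rng=None):
    """Sketch of a stochastic-shared-embeddings step (assumed mechanics):
    with probability p, each embedding index is replaced by one drawn
    uniformly from the vocabulary before the embedding lookup."""
    rng = rng or np.random.default_rng()
    indices = np.asarray(indices)
    # Decide independently per position whether to swap this index.
    swap = rng.random(indices.shape) < p
    # Candidate replacement indices, sampled uniformly over the vocabulary.
    random_ids = rng.integers(0, vocab_size, size=indices.shape)
    return np.where(swap, random_ids, indices)
```

At `p=0` this is a no-op, so it can be disabled at evaluation time the same way dropout is.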
jlgleason / Hts Constrained Embeddings
Replication material for "Forecasting Hierarchical Time Series with a Regularized Embedding Space," KDD MileTS 2020
wtyhub / DWDR
PyTorch implementation of "Learning Cross-view Geo-localization Embeddings via Dynamic Weighted Decorrelation Regularization" (https://arxiv.org/abs/2211.05296)
sh0416 / Oommix
Official implementation for the ACL 2021 oral paper "OoMMix: Out-of-manifold Regularization in Contextual Embedding Space for Text Classification"
CSLT-THU / IS2019 VAE
TensorFlow and Kaldi implementation of our paper "VAE-based regularization for deep speaker embedding"
leimao / Tensorflow Assignment Solutions
My solutions to all six assignments of the TensorFlow tutorial on Udacity, covering CNNs, RNNs, regularization (L2 and dropout), embeddings (word2vec), and seq2seq LSTMs (bigram prediction and sequence mirroring)
pangy9 / CoRe
Official implementation of "CoRe: Context-Regularized Text Embedding Learning for Text-to-Image Personalization".
HKUST-KnowComp / SRBRW
Source code for the IJCAI 2018 paper "Biased Random Walk based Social Regularization for Word Embeddings"
youqiangao / Wmmd
Official PyTorch implementation of the JASA paper "Word-Level Maximum Mean Discrepancy Regularization for Word Embedding"