DICTOL - A Dictionary Learning Toolbox in Matlab and Python
This repo is no longer maintained!
DICTOL - A Discriminative Dictionary Learning Toolbox for Classification (MATLAB version).
This toolbox is part of our LRSDL project.
Related publications:
- Tiep H. Vu, Vishal Monga. "Fast Low-rank Shared Dictionary Learning for Image Classification." To appear in IEEE Transactions on Image Processing. [paper].
- Tiep H. Vu, Vishal Monga. "Learning a low-rank shared dictionary for object classification." International Conference on Image Processing (ICIP), 2016. [paper].
Author: Tiep Vu
Run `DICTOL_demo.m` to see examples.
If you experience any bugs, please let me know via the Issues tab. I'd really appreciate it and will fix the error ASAP. Thank you.
On this page:
<!-- MarkdownTOC -->
- Notation
- Sparse Representation-based classification (SRC)
- Online Dictionary Learning (ODL)
- LCKSVD
- Dictionary learning with structured incoherence and shared features (DLSI)
- Dictionary learning for separating the particularity and the commonality (COPAR)
- LRSDL
- Fisher discrimination dictionary learning (FDDL)
- Discriminative Feature-Oriented dictionary learning (DFDL)
- D2L2R2
- Fast iterative shrinkage-thresholding algorithm (FISTA)
- References
<a name="notation"></a>
Notation
- `Y`: signals. Each column is one observation.
- `D`: dictionary.
- `X`: sparse coefficient.
- `d`: signal dimension. `d = size(Y, 1)`.
- `C`: number of classes.
- `c`: class index.
- `n_c`: number of training samples in class `c`. Typically, all `n_c` are the same and equal to `n`.
- `N`: total number of training samples.
- `Y_range`: an array storing the range of each class, assuming labels are sorted in ascending order. Example: if `Y_range = [0, 10, 25]`, then:
    - There are two classes: samples from class 1 range from 1 to 10, samples from class 2 range from 11 to 25.
    - In general, samples from class `c` range from `Y_range(c) + 1` to `Y_range(c+1)`.
    - The number of classes is `C = numel(Y_range) - 1`.
- `k_c`: number of bases in the class-specific dictionary `c`. Typically, all `k_c` are the same and equal to `k`.
- `k_0`: number of bases in the shared dictionary.
- `K`: total number of dictionary bases.
- `D_range`: similar to `Y_range`, but used for the dictionary and excluding the shared dictionary.
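The `Y_range` convention above can be illustrated with a small helper. This is a sketch for illustration only; `labels_from_range` and `num_classes` are hypothetical names, not part of the toolbox, and use Python's 0-based indexing internally while keeping the 1-based class labels of the MATLAB code:

```python
import numpy as np

def labels_from_range(Y_range):
    """Expand Y_range = [0, n_1, n_1 + n_2, ...] into a per-sample label vector."""
    labels = []
    for c in range(len(Y_range) - 1):
        # samples of class c+1 occupy positions Y_range[c]..Y_range[c+1]-1
        labels += [c + 1] * (Y_range[c + 1] - Y_range[c])
    return np.array(labels)

def num_classes(Y_range):
    """C = numel(Y_range) - 1, as in the Notation section."""
    return len(Y_range) - 1
```

For `Y_range = [0, 10, 25]`, this yields 10 samples labeled 1 followed by 15 samples labeled 2.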
<a name="sparse-representation-based-classification-src"></a>
Sparse Representation-based classification (SRC)
- Sparse Representation-based classification implementation [1].
- Classification based on SRC.
- Syntax: `[pred, X] = SRC_pred(Y, D, D_range, opts)`
  - INPUT:
    - `Y`: test samples.
    - `D`: the total dictionary, `D = [D_1, D_2, ..., D_C]`, with `D_c` being the c-th class-specific dictionary.
    - `D_range`: range of class-specific dictionaries in `D`. See also Notation.
    - `opts`: options.
      - `opts.lambda`: `lambda` for the Lasso problem. Default: `0.01`.
      - `opts.max_iter`: maximum number of iterations of the FISTA algorithm. Default: `100`.
  - OUTPUT:
    - `pred`: predicted labels of the test samples.
    - `X`: solution of the Lasso problem.
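The SRC decision rule described above (sparse-code each test sample over the total dictionary, then assign the class whose sub-dictionary yields the smallest reconstruction residual) can be sketched in NumPy. This is an illustrative re-implementation, not the toolbox's `SRC_pred`: it uses plain ISTA in place of the toolbox's FISTA solver, `ista_lasso` and `src_pred` are hypothetical names, and `D_range` is 0-based here:

```python
import numpy as np

def ista_lasso(D, Y, lam=0.01, max_iter=100):
    """Minimize 0.5 * ||Y - D X||_F^2 + lam * ||X||_1 column-wise by ISTA."""
    L = np.linalg.norm(D, 2) ** 2                  # Lipschitz constant of the gradient
    X = np.zeros((D.shape[1], Y.shape[1]))
    for _ in range(max_iter):
        G = X - (D.T @ (D @ X - Y)) / L            # gradient step
        X = np.sign(G) * np.maximum(np.abs(G) - lam / L, 0)  # soft thresholding
    return X

def src_pred(Y, D, D_range, lam=0.01):
    """Label each column of Y by the class with the smallest residual ||y - D_c x_c||."""
    X = ista_lasso(D, Y, lam)
    C = len(D_range) - 1
    res = np.empty((C, Y.shape[1]))
    for c in range(C):
        idx = slice(D_range[c], D_range[c + 1])    # columns of class c's sub-dictionary
        res[c] = np.linalg.norm(Y - D[:, idx] @ X[idx], axis=0)
    return res.argmin(axis=0) + 1                  # labels are 1-based, as in the toolbox
```

With `D = np.eye(4)` and `D_range = [0, 2, 4]`, a test column equal to the first basis vector is assigned class 1, and one equal to the last basis vector is assigned class 2.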
<a name="online-dictionary-learning-odl"></a>
Online Dictionary Learning (ODL)
- An implementation of the well-known Online Dictionary Learning method [2].
<a name="cost-function"></a>
Cost function
<img src = "latex/ODL_cost.png" height = "40"/>
<a name="training-odl"></a>
Training ODL
- Syntax: `[D, X] = ODL(Y, k, lambda, opts, sc_method)`
  - INPUT:
    - `Y`: training samples.
    - `k`: number of dictionary bases.
    - `lambda`: `lambda` in the cost function.
    - `opts`: options.
    - `sc_method`: sparse coding method used in the sparse coefficient update.
  - OUTPUT:
    - `D, X`: as in the problem.
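The alternation behind the ODL cost (a lasso step for `X` with `D` fixed, then a per-atom dictionary update with `X` fixed) can be sketched as follows. This is a batch illustration under the usual unit-norm atom constraint, not the toolbox's online implementation; `odl_sketch` is a hypothetical name:

```python
import numpy as np

def odl_sketch(Y, k, lam=0.1, iters=20):
    """Alternate on min_{D,X} 0.5 * ||Y - D X||_F^2 + lam * ||X||_1, atoms unit-norm."""
    rng = np.random.default_rng(0)
    D = rng.standard_normal((Y.shape[0], k))
    D /= np.linalg.norm(D, axis=0)                 # initialize with unit-norm atoms
    X = np.zeros((k, Y.shape[1]))
    for _ in range(iters):
        # sparse coding step: ISTA on X with D fixed
        L = np.linalg.norm(D, 2) ** 2
        for _ in range(50):
            G = X - D.T @ (D @ X - Y) / L
            X = np.sign(G) * np.maximum(np.abs(G) - lam / L, 0)
        # dictionary update: one block-coordinate-descent pass over the atoms
        A, B = X @ X.T, Y @ X.T
        for j in range(k):
            if A[j, j] > 1e-10:                    # skip atoms that are never used
                u = D[:, j] + (B[:, j] - D @ A[:, j]) / A[j, j]
                D[:, j] = u / max(np.linalg.norm(u), 1.0)  # project onto unit ball
    return D, X
```

The per-atom update with the accumulated statistics `A = X X^T` and `B = Y X^T` mirrors the block coordinate descent used in online dictionary learning.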
<a name="lcksvd"></a>
LCKSVD
Check its project page.
<a name="dictionary-learning-with-structured-incoherence-and-shared-features-dlsi"></a>
Dictionary learning with structured incoherence and shared features (DLSI)
- An implementation of the well-known DLSI method [5].
<a name="cost-function-1"></a>
Cost function
<img src = "latex/DLSI_cost.png" height = "50"/>
<a name="training-dlsi"></a>
Training DLSI
- function: `[D, X, rt] = DLSI(Y, Y_range, opts)` - the main DLSI algorithm.
- INPUT:
  - `Y, Y_range`: training samples and their labels.
  - `opts`: options.
    - `opts.lambda, opts.eta`: `lambda` and `eta` in the cost function.
    - `opts.max_iter`: maximum iterations.
- OUTPUT:
  - `D`: the trained dictionary.
  - `X`: the trained sparse coefficients.
  - `rt`: total running time of the training process.
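DLSI's distinguishing ingredient is the `eta`-weighted structured-incoherence term, which penalizes correlation between different class-specific sub-dictionaries. A sketch of that penalty, assuming the usual form as a sum of `||D_i^T D_j||_F^2` over pairs `i != j` (`dlsi_incoherence` is a hypothetical name, and `D_range` is 0-based here):

```python
import numpy as np

def dlsi_incoherence(D, D_range):
    """Sum of ||D_i^T D_j||_F^2 over all pairs of distinct class sub-dictionaries."""
    C = len(D_range) - 1
    total = 0.0
    for i in range(C):
        Di = D[:, D_range[i]:D_range[i + 1]]
        for j in range(C):
            if i != j:
                Dj = D[:, D_range[j]:D_range[j + 1]]
                # Di.T @ Dj collects cross-correlations between the two blocks
                total += np.linalg.norm(Di.T @ Dj, 'fro') ** 2
    return total
```

The penalty is zero exactly when every pair of class sub-dictionaries is mutually orthogonal, which is what drives the learned blocks apart.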
<a name="dlsi-predict-new-samples"></a>
DLSI predict new samples
- function: `pred = DLSI_pred(Y, D, opts)` - predict the labels of new input `Y` given the trained dictionary `D` and the parameters stored in `opts`.
<a name="demo"></a>
Demo
Run `DLSI_top` in the Matlab command window.
<a name="dictionary-learning-for-separating-the-particularity-and-the-commonality-copar"></a>
Dictionary learning for separating the particularity and the commonality (COPAR)
- An implementation of COPAR [7].
<a name="cost-function-2"></a>
Cost function
<img src = "latex/COPAR_cost.png" height = "50"/>
where:
<img src = "latex/COPAR_cost1.png" height = "50"/>
<a name="training-copar"></a>
Training COPAR
- function: `[D, X, rt] = COPAR(Y, Y_range, opts)`
- INPUT:
  - `Y, Y_range`: training samples and their labels.
  - `opts`: a struct.
    - `opts.lambda, opts.eta`: `lambda` and `eta` in the cost function.
    - `opts.max_iter`: maximum iterations.
- OUTPUT:
  - `D`: the trained dictionary.
  - `X`: the trained sparse coefficients.
  - `rt`: total running time of the training process.
<a name="copar-predect-new-samples"></a>
COPAR predict new samples
- function: `pred = COPAR_pred(Y, D, D_range_ext, opts)` - predict the labels of the input `Y`.
- INPUT:
  - `Y`: test samples.
  - `D`: the trained dictionary.
  - `D_range_ext`: range of the class-specific and shared dictionaries in `D`. The shared dictionary is located at the end of `D`.
  - `opts`: a struct of options:
    - `opts.classify_mode`: a string specifying the classification mode, either `'GC'` (global coding) or `'LC'` (local coding).
    - `opts.lambda, opts.eta, opts.max_iter`: as in `COPAR.m`.
- OUTPUT:
  - `pred`: predicted labels of `Y`.
<a name="demo-1"></a>
Demo
Run COPAR_top in the Matlab command window.
<a name="lrsdl"></a>
LRSDL
- An implementation of LRSDL [8].
<a name="motivation"></a>
Motivation

<a name="cost-function-3"></a>
Cost function
Note that unlike COPAR, in LRSDL, we separate the class-specific dictionaries (D) and the shared dictionary (D_0). The sparse coefficients (X, X^0) are also separated.
<a name="training-lrsdl"></a>
Training LRSDL
- function: `[D, D0, X, X0, CoefM, coefM0, opts, rt] = LRSDL(Y, train_label, opts)`
- INPUT:
  - `Y, train_label`: training samples and their labels.
  - `opts`: a struct.
    - `opts.lambda1, opts.lambda2`: `lambda1` and `lambda2` in the cost function.
    - `opts.lambda3`: `eta` in the cost function (fix later).
    - `opts.max_iter`: maximum iterations.
    - `opts.D_range`: range of the trained dictionary.
    - `opts.k0`: size of the shared dictionary.
- OUTPUT:
  - `D, D0, X, X0`: trained matrices as in the cost function.
  - `CoefM`: the mean matrix; `CoefM(:, c)` is the mean vector of `X_c` (mean of its columns).
  - `CoefM0`: the mean vector of `X0`.
  - `rt`: total running time (in seconds).
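The `CoefM` output described above (column `c` is the mean of the columns of `X_c`) can be illustrated with a short sketch; `class_means` is a hypothetical name, not the toolbox API, and `Y_range` is 0-based here:

```python
import numpy as np

def class_means(X, Y_range):
    """Column c of the result is the mean of the columns of X belonging to class c."""
    C = len(Y_range) - 1
    M = np.empty((X.shape[0], C))
    for c in range(C):
        # average the coefficient columns of class c's training samples
        M[:, c] = X[:, Y_range[c]:Y_range[c + 1]].mean(axis=1)
    return M
```

These per-class mean vectors are what the Fisher-style discrimination terms in the LRSDL cost operate on.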
<a name="lrsdl-predict-new-samples"></a>
LRSDL predict new samples
See the `LRSDL_pred_GC.m` function.
<a name="demo-2"></a>
Demo
Run LRSDL_top in the Matlab command window.
<a name="fisher-discrimination-dictionary-learning-fddl"></a>
Fisher discrimination dictionary learning (FDDL)