DICTOL - A Dictionary Learning Toolbox in Matlab and Python


This repo is no longer maintained!

DICTOL - A Discriminative Dictionary Learning Toolbox for Classification (MATLAB version).

This Toolbox is a part of our LRSDL project.

See Python version.

Related publications:

  1. Tiep H. Vu, Vishal Monga. "Fast Low-rank Shared Dictionary Learning for Image Classification." to appear in IEEE Transactions on Image Processing. [paper].

  2. Tiep H. Vu, Vishal Monga. "Learning a low-rank shared dictionary for object classification." International Conference on Image Processing (ICIP) 2016. [paper].

Author: Tiep Vu

Run DICTOL_demo.m to see examples.

If you experience any bugs, please let me know via the Issues tab. I'd really appreciate it and will fix the error ASAP. Thank you.


<a name="notation"></a>

Notation

  • Y: signals. Each column is one observation.
  • D: dictionary.
  • X: sparse coefficient.
  • d: signal dimension. d = size(Y, 1).
  • C: number of classes.
  • c: class index.
  • n_c: number of training samples in class c. Typically, all n_c are the same and equal to n.
  • N: total number of training samples.
  • Y_range: an array storing the range of each class, assuming labels are sorted in ascending order. Example: if Y_range = [0, 10, 25], then:
    • There are two classes, samples from class 1 range from 1 to 10, from class 2 range from 11 to 25.
    • In general, samples from class c range from Y_range(c) + 1 to Y_range(c+1)
    • We can observe that number of classes C = numel(Y_range) - 1.
  • k_c: number of bases in class-specific dictionary c. Typically, all k_c are the same and equal to k.
  • k_0: number of bases in the shared-dictionary
  • K: total number of dictionary bases.
  • D_range: similar to Y_range but used for dictionary without the shared dictionary.
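For instance, the Y_range convention can be used with standard MATLAB colon indexing to pull out the samples of one class. A minimal sketch with random toy data (the sizes here are illustrative, not from the toolbox):

```matlab
% Toy data: d = 5, two classes with 10 and 15 samples (N = 25)
Y = randn(5, 25);
Y_range = [0, 10, 25];

C = numel(Y_range) - 1;                    % number of classes, C = 2
c = 2;                                     % pick class 2
Yc = Y(:, Y_range(c) + 1 : Y_range(c+1));  % columns 11..25, i.e. class 2
size(Yc)                                   % 5 x 15
```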

<a name="sparse-representation-based-classification-src"></a>

Sparse Representation-based classification (SRC)

  • Sparse Representation-based classification implementation [1].
  • Classification based on SRC.
  • Syntax: [pred, X] = SRC_pred(Y, D, D_range, opts)
    • INPUT:
      • Y: test samples.
      • D: the total dictionary. D = [D_1, D_2, ..., D_C] with D_c being the c-th class-specific dictionary.
      • D_range: range of class-specific dictionaries in D. See also Notation.
      • opts: options.
    • OUTPUT:
      • pred: predicted labels of test samples.
      • X: solution of the lasso problem.
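A hypothetical call might look like the following. The dictionaries here are random stand-ins for learned class-specific dictionaries, and the opts field names and values are assumptions for illustration, not documented defaults:

```matlab
d = 20; k = 10;                       % toy sizes
D_1 = randn(d, k); D_2 = randn(d, k); % stand-ins for learned class dictionaries
D = [D_1, D_2];                       % total dictionary, d x 2k
D_range = [0, 10, 20];                % columns 1..10 -> class 1, 11..20 -> class 2
Y_test = randn(d, 5);                 % 5 test samples

opts.lambda   = 0.01;                 % lasso regularization (illustrative value)
opts.max_iter = 100;                  % field names are assumptions
[pred, X] = SRC_pred(Y_test, D, D_range, opts);
% pred(i) is the predicted class of test sample Y_test(:, i)
```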

<a name="online-dictionary-learning-odl"></a>

Online Dictionary Learning (ODL)

  • An implementation of the well-known Online Dictionary Learning method [2].

<a name="cost-function"></a>

Cost function

<img src = "latex/ODL_cost.png" height = "40"/>

<a name="training-odl"></a>

Training ODL

  • Syntax: [D, X] = ODL(Y, k, lambda, opts, sc_method)
    • INPUT:
      • Y: collection of samples.
      • k: number of bases in the desired dictionary.
      • lambda: the l1-norm regularization parameter.
      • opts: option.
      • sc_method: sparse coding method used in the sparse coefficient update. Possible values:
        • 'fista': using FISTA algorithm. See also fista.
        • 'spams': using SPAMS toolbox [12].
    • OUTPUT:
      • D, X: as in the problem.
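A sketch of training a dictionary with ODL on random data (sizes, the lambda value, and the opts field name are illustrative assumptions):

```matlab
Y = randn(100, 500);              % 500 training samples of dimension 100
Y = Y ./ sqrt(sum(Y.^2, 1));      % normalize columns to unit l2 norm (R2016b+)

k      = 64;                      % number of dictionary bases
lambda = 0.1;                     % l1 regularization (illustrative value)
opts.max_iter = 100;              % field name is an assumption

[D, X] = ODL(Y, k, lambda, opts, 'fista');
% D is 100 x 64; X is 64 x 500 and sparse, with Y approximately D*X
```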

<a name="lcksvd"></a>

LCKSVD

Check its project page.

<a name="dictionary-learning-with-structured-incoherence-and-shared-features-dlsi"></a>

Dictionary learning with structured incoherence and shared features (DLSI)

  • An implementation of the well-known DLSI method [5].

<a name="cost-function-1"></a>

Cost function

<img src = "latex/DLSI_cost.png" height = "50"/>

<a name="training-dlsi"></a>

Training DLSI

  • function [D, X, rt] = DLSI(Y, Y_range, opts)
  • The main DLSI algorithm
  • INPUT:
    • Y, Y_range: training samples and their labels
    • opts:
      • opts.lambda, opts.eta: lambda and eta in the cost function
      • opts.max_iter: maximum iterations.
  • OUTPUT:
    • D: the trained dictionary,
    • X: the trained sparse coefficient,
    • rt: total running time of the training process.

<a name="dlsi-predict-new-samples"></a>

DLSI predict new samples

  • function pred = DLSI_pred(Y, D, opts)
  • predict the label of new input Y given the trained dictionary D and parameters stored in opts
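Putting training and prediction together, a minimal end-to-end sketch might look like this. The data is random, the regularization values are illustrative, and the opts.k field controlling the per-class dictionary size is an assumption (the README does not list it):

```matlab
% Toy data: 3 classes, 20 samples each, dimension 50
Y       = randn(50, 60);
Y_range = [0, 20, 40, 60];

opts.lambda   = 0.001;            % illustrative values
opts.eta      = 0.01;
opts.max_iter = 100;
opts.k        = 10;               % bases per class (field name is an assumption)

[D, X, rt] = DLSI(Y, Y_range, opts);

Y_test = randn(50, 5);            % 5 new samples
pred = DLSI_pred(Y_test, D, opts);
```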

<a name="demo"></a>

Demo

Run DLSI_top in Matlab command window.

<a name="dictionary-learning-for-separating-the-particularity-and-the-commonality-copar"></a>

Dictionary learning for separating the particularity and the commonality (COPAR)

  • An implementation of COPAR [7].

<a name="cost-function-2"></a>

Cost function

<img src = "latex/COPAR_cost.png" height = "50"/>

where:

<img src = "latex/COPAR_cost1.png" height = "50"/>

<a name="training-copar"></a>

Training COPAR

  • function [D, X, rt] = COPAR(Y, Y_range, opts)

  • INPUT:

    • Y, Y_range: training samples and their labels
    • opts: a struct
      • opts.lambda, opts.eta: lambda and eta in the cost function
      • opts.max_iter: maximum iterations.
  • OUTPUT:

    • D: the trained dictionary,
    • X: the trained sparse coefficient,
    • rt: total running time of the training process.

<a name="copar-predect-new-samples"></a>

COPAR predict new samples

  • function pred = COPAR_pred(Y, D, D_range_ext, opts)

  • predict the labels of the input Y

  • INPUT:

    • Y: test samples
    • D: the trained dictionary
    • D_range_ext: range of class-specific and shared dictionaries in D. The shared dictionary is located at the end of D.
    • opts: a struct of options:
      • opts.classify_mode: a string specifying the classification mode, either 'GC' (global coding) or 'LC' (local coding)
      • opts.lambda, opts.eta, opts.max_iter: as in COPAR.m.
  • OUTPUT:

    • pred: predicted labels of Y.
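A sketch of training COPAR and then classifying new samples. The data is random; the README does not list the opts fields that set the dictionary sizes, so the D_range_ext values below (10 bases per class plus a 5-atom shared dictionary) are assumptions for illustration:

```matlab
% Toy data: 3 classes, 20 samples each, dimension 50
Y       = randn(50, 60);
Y_range = [0, 20, 40, 60];

opts.lambda   = 0.001;            % illustrative values
opts.eta      = 0.1;
opts.max_iter = 100;
[D, X, rt] = COPAR(Y, Y_range, opts);

% Class-specific ranges plus the shared dictionary at the end of D
D_range_ext = [0, 10, 20, 30, 35];  % assumption: 10 bases/class + 5 shared
opts.classify_mode = 'GC';          % global coding

Y_test = randn(50, 5);
pred = COPAR_pred(Y_test, D, D_range_ext, opts);
```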

<a name="demo-1"></a>

Demo

Run COPAR_top in the Matlab command window.

<a name="lrsdl"></a>

LRSDL

  • An implementation of LRSDL [8].

<a name="motivation"></a>

Motivation

<a name="cost-function-3"></a>

Cost function

Note that unlike COPAR, in LRSDL, we separate the class-specific dictionaries (D) and the shared dictionary (D_0). The sparse coefficients (X, X^0) are also separated.

<a name="training-lrsdl"></a>

Training LRSDL

  • function `[D, D0, X, X0, CoefM, CoefM0, opts, rt] = LRSDL(Y, train_label, opts)`

  • INPUT:

    • Y, train_label: training samples and their labels
    • opts: a struct
      • opts.lambda1, opts.lambda2: lambda1 and lambda2 in the cost function,
      • opts.lambda3: eta in the cost function (fix later),
      • opts.max_iter: maximum iterations,
      • opts.D_range: range of the trained dictionary,
      • opts.k0: size of the shared dictionary
  • OUTPUT:

    • D, D0, X, X0: trained matrices as in the cost function,
    • CoefM: the mean matrix. CoefM(:, c) is the mean vector of X_c (mean of columns).
    • CoefM0: the mean vector of X0,
    • rt: total running time (in seconds).
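A sketch of the training call on random toy data. Field names follow the list above; the values, sizes, and the opts.lambda2 name are illustrative assumptions:

```matlab
% Toy data: 3 classes, 20 training samples each, dimension 50
Y           = randn(50, 60);
train_label = [ones(1,20), 2*ones(1,20), 3*ones(1,20)];

opts.lambda1  = 0.01;              % illustrative values
opts.lambda2  = 0.01;
opts.lambda3  = 0.1;               % eta in the cost function
opts.max_iter = 100;
opts.D_range  = [0, 10, 20, 30];   % 10 bases per class
opts.k0       = 5;                 % shared dictionary size

[D, D0, X, X0, CoefM, CoefM0, opts, rt] = LRSDL(Y, train_label, opts);
% CoefM(:, c) is the mean of the columns of X_c; CoefM0 is the mean of X0
```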

<a name="lrsdl-predict-new-samples"></a>

LRSDL predict new samples

See the LRSDL_pred_GC.m function.

<a name="demo-2"></a>

Demo

Run LRSDL_top in the Matlab command window.

<a name="fisher-discrimination-dictionary-learning-fddl"></a>

Fisher discrimination dictionary learning (FDDL)
