
Blended Conditional Gradient (BCG) Algorithm Package in Python

Install / Use

/learn @pokutta/Bcg

README

Blended Conditional Gradient -- Frank-Wolfe algorithm enhanced with gradient descent

This package implements an algorithm for minimizing a smooth convex function over a polytope P that combines the Frank-Wolfe algorithm (also called conditional gradient) with gradient-based steps. Unlike the pairwise-step and away-step variants of the Frank-Wolfe algorithm, it makes better use of the active vertices via gradient steps that come with a guarantee of significant progress.

For more information on this algorithm, please refer to the paper Blended Conditional Gradients: the unconditioning of conditional gradients.

Prerequisites

  • Optional: an LP solver (for example, Gurobi)
    • LP oracles can also be defined directly, without relying on Gurobi or similar solvers.
  • Optional: the autograd package, which simplifies specifying gradients.

Installation

Standard method:

git clone https://github.com/pokutta/bcg.git
cd bcg
pip install .

Source code is available at https://github.com/pokutta/BCG.

Changelog

Version 0.1.0: initial release

Known Limitations

Many... but we are working on them. Most importantly:

  1. Currently the code is not thread safe
    (considered critical and will be addressed in one of the next versions)
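
Until thread safety lands, one workaround is to run independent solves in separate processes instead of threads, since each process gets its own interpreter state. A minimal sketch, where `solve_one` is a hypothetical stand-in for a call such as `BCG(f, f_grad, model)`:

```python
# Workaround sketch for the thread-safety limitation: run independent BCG
# solves in separate processes rather than threads.  `solve_one` is a
# hypothetical stand-in; replace its body with your actual BCG call,
# e.g. `return BCG(f, f_grad, model)` built from the given instance id.
from multiprocessing import Pool

def solve_one(instance_id):
    # placeholder computation standing in for one BCG solve
    return instance_id * instance_id

if __name__ == "__main__":
    with Pool(processes=2) as pool:
        results = pool.map(solve_one, [1, 2, 3])
    print(results)  # one result per independent solve: [1, 4, 9]
```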

Get Started

BCG is the only function you typically need to call to use this package. The following is a detailed explanation of its input parameters and return value.

Syntax: BCG(f, f_grad, model, run_config=None)

  • Parameters
    • f: callable func(x).
      The objective function. It should be smooth and convex.
    • f_grad: callable f_grad(x), optional.
      Gradient of the objective function. If set to None, the gradient is determined automatically, using the autograd package if available; otherwise a possibly slow numerical approximation is used.
    • model: {filename, Class}, used to construct the feasible region.
      The BCG function accepts either of the following as the source of the feasible region:
      1. an LP file (Gurobi is used as the LP solver by default; make changes in LPsolver.py if you want to use another LP solver)
      2. a user-defined model class instance (the LP oracle at the gradient vector must be defined inside)
    • run_config: a dictionary, optional.
      If not provided, the following default settings are used:
      run_config = {
          'solution_only': True,
          'verbosity': 'normal',
          'dual_gap_acc': 1e-06,
          'runningTimeLimit': None,
          'use_LPSep_oracle': True,
          'max_lsFW': 30,
          'strict_dropSteps': True,
          'max_stepsSub': 200,
          'max_lsSub': 30,
          'LPsolver_timelimit': 100,
          'K': 1
      }
      
      Explanation of the above configuration keys can be found in the Configuration dictionary section below.
  • Returns: a list of three elements.
    The first element is the optimal point (an n-dimensional vector), the second is the dual bound when the algorithm terminates, and the third is the primal function value at the returned point.
    A logging file and a dictionary containing algorithm performance information are also saved if run_config['solution_only'] is set to False.
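
The "possibly slow approximation" used when f_grad is None and autograd is unavailable can be pictured as a finite-difference scheme. A sketch of such a central-difference gradient (a hypothetical helper, not the package's actual implementation):

```python
import numpy as np

def approx_grad(f, x, eps=1e-6):
    # Central-difference gradient estimate: 2n function evaluations per
    # call, which is why an analytic or autograd gradient is preferable.
    g = np.zeros_like(x, dtype=float)
    for i in range(x.size):
        e = np.zeros_like(x, dtype=float)
        e[i] = eps
        g[i] = (f(x + e) - f(x - e)) / (2 * eps)
    return g

f = lambda v: np.linalg.norm(v, ord=2) ** 2  # analytic gradient is 2v
x = np.array([1.0, -2.0, 3.0])
print(np.allclose(approx_grad(f, x), 2 * x, atol=1e-5))  # True
```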

Example Usage 1: with an LP file as the feasible region

import numpy as np
from bcg.run_BCG import BCG

# define function evaluation oracle and its gradient oracle
def f(x):
    return np.linalg.norm(x, ord=2)**2

def f_grad(x):
    return 2*x

res = BCG(f, f_grad, 'spt10.lp')
print('optimal solution {}'.format(res[0]))
print('dual_bound {}'.format(res[1]))
print('primal value {}'.format(res[2]))

You can also construct your own configuration dictionary. For example, let's set 'solution_only' to False here.

config_dictionary = {
    'solution_only': False,
    'verbosity': 'verbose',
    'dual_gap_acc': 1e-06,
    'runningTimeLimit': 2,
    'use_LPSep_oracle': True,
    'max_lsFW': 30,
    'strict_dropSteps': True,
    'max_stepsSub': 200,
    'max_lsSub': 30,
    'LPsolver_timelimit': 100,
    'K': 1.1
}
         
res = BCG(f, f_grad, 'spt10.lp', run_config=config_dictionary)
 
print('optimal solution {}'.format(res[0]))
print('dual_bound {}'.format(res[1]))
print('primal value {}'.format(res[2]))

Example Usage 2: with a user-defined model class as the feasible region

import numpy as np
from bcg.run_BCG import BCG

# define a model class to construct the feasible region
# the following example is the L1 ball; other examples can be found in example_model_class.py
from bcg.model_init import Model

class Model_l1_ball(Model):
    def minimize(self, gradient_at_x=None):
        result = np.zeros(self.dimension)
        if gradient_at_x is None:
            result[0] = 1
        else:
            i = np.argmax(np.abs(gradient_at_x))
            result[i] = -1 if gradient_at_x[i] > 0 else 1
        return result

l1Ball = Model_l1_ball(100)  # initialize the feasible region as an L1 ball of dimension 100

# define function evaluation oracle and its gradient oracle
# the example function below is ||x - shift||^2, where x is an n-dimensional vector
shift = np.random.randn(100)
def f(x):
    return np.linalg.norm(x - shift, ord=2)**2

def f_grad(x):
    return 2*(x - shift)


res = BCG(f, f_grad, l1Ball)
print('optimal solution {}'.format(res[0]))
print('dual_bound {}'.format(res[1]))
print('primal value {}'.format(res[2]))

Output Example

Using provided gradient.
dimension of this feasible_region is 100
checking validity of function gradient...
╭────────────────────────┬────────────────────────┬────────────────────────┬────────────────────────┬────────────────────────┬────────────────────────╮
│       Iteration        │          Type          │     Function Value     │       Dual Bound       │         #Atoms         │         WTime          │
├────────────────────────┼────────────────────────┼────────────────────────┼────────────────────────┼────────────────────────┼────────────────────────┤
│                      1 │                     FI │      91.52256902109498 │      2.945672332562157 │                      1 │                 0.0030 │
│                      2 │                     FN │      91.21266631260148 │      2.945672332562157 │                      2 │                 0.0041 │
│                      3 │                     FI │      91.12222718958402 │     0.7872772175737643 │                      3 │                 0.0099 │
│                      4 │                     FN │      91.09363112732508 │     0.7872772175737643 │                      4 │                 0.0115 │
│                      5 │                     FI │      91.07786291882721 │    0.19708572371482436 │                      5 │                 0.0125 │
│                      6 │                      P │      91.07469121674026 │    0.19708572371482436 │                      5 │                 0.0133 │
│                      7 │                     FN │      91.07380013379127 │    0.19708572371482436 │                      5 │                 0.0145 │
│                      8 │                      P │       91.0733404209301 │   0.028801617894984144 │                      5 │                 0.0151 │
│                      9 │                      P │      91.07321459940893 │   0.028801617894984144 │                      5 │                 0.0158 │
│                     10 │                     FN │      91.07319272605285 │   0.028801617894984144 │                      5 │                 0.0169 │
│                     11 │                      P │      91.07318511979354 │   0.004507642594342887 │                      5 │                 0.0175 │
│                     12 │                      P │       91.0731824553222 │   0.004507642594342887 │                      5 │                 0.0184 │
│                     13 │                     FN │      91.07318194664359 │   0.004507642594342887 │                      5 │                 0.0195 │
│                     14 │                      P │      91.07318186214543 │  0.0007185411138372899 │                      5 │                 0.0201 │
│                     15 │                     FN │      91.07318186214543 │  0.0007185411138372899 │                      5 │                 0.0213 │
│                     16 │                      P │      91.07318186214513 │  6.007732255008946e-07 │                      5 │                 0.0220 │
Achieved required dual gap accuracy, save results and exit BCG algorithm
╰────────────────────────┴────────────────────────┴────────────────────────┴────────────────────────┴────────────────────────┴────────────────────────╯
optimal solution [ 0.          0.          0.40293517  0.          0.          0.          0.
  0.          0.          0.          0.          0.          0.          0.
  0.          0.          0.          0.          0.          0.          0.
  0.          0.          0.          0.          0.          0.          0.
  0.          0.          0.          0.          0.          0.
 -0.11816443  0.          0.          0.          0.          0.          0.
  0.          0.          0.          0.          0.          0.
 -0.12102891  0.          0.          0.          0.          0.          0.
  0.          0.          0.          0.          0.          0.          0.
  0.          0.          0.16765871  0.          0.          0.          0.
  0.          0.          0.          0.          0.          0.          0.
  0.          0.          0.          0.          0.          0.          0.
  0.          0.          0.          0.          0.          0.          0.
  0.          0.          0.          0.          0.          0.          0.
  0.          0.          0.          0.19021278]
dual_bound 6.007732255008946e-07
primal value 91.07318186214513
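
The L1-ball oracle from Example 2 can also be sanity-checked in isolation: over the unit L1 ball, a linear function <g, v> is minimized at the vertex -sign(g_i) e_i with i = argmax_j |g_j|. A standalone check of that property (mirroring the minimize method above):

```python
import numpy as np

def l1_ball_lp_oracle(gradient):
    # vertex of the unit L1 ball minimizing <gradient, v>,
    # mirroring Model_l1_ball.minimize above
    v = np.zeros_like(gradient, dtype=float)
    i = np.argmax(np.abs(gradient))
    v[i] = -1.0 if gradient[i] > 0 else 1.0
    return v

rng = np.random.default_rng(0)
g = rng.standard_normal(10)
v = l1_ball_lp_oracle(g)

# brute-force minimum of <g, .> over all 2n vertices +/- e_j
vertices = np.concatenate([np.eye(10), -np.eye(10)])
print(np.isclose(g @ v, (vertices @ g).min()))  # True
```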

Configuration dictionary

  • solution_only: binary
    If set to True, only the solution (optimal point, dual bound, and primal value) is returned; if set to False, a logging file and a dictionary of algorithm performance information are also saved.
