
SolvOR

Optimization solvers in pure Python: LP, MILP, SAT, constraint programming, graph algorithms, and metaheuristics. No dependencies.


Build Status Docs codecov License: Apache 2.0 PyPI Downloads Python 3.12 | 3.13 | 3.14

<picture> <source media="(prefers-color-scheme: dark)" srcset="docs/assets/logo-dark.svg"> <source media="(prefers-color-scheme: light)" srcset="docs/assets/logo-light.svg"> <img alt="solvOR" src="docs/assets/logo-light.svg" width="200"> </picture>

Solvor all your optimization needs.

What's in the box?

| Category | Solvors | Use Case |
|----------|---------|----------|
| Linear/Integer | solve_lp, solve_lp_interior, solve_milp, solve_cg, solve_bp | Resource allocation, cutting stock |
| Constraint | solve_sat, Model | Sudoku, puzzles, and that one config problem that's been bugging you |
| Combinatorial | solve_knapsack, solve_bin_pack, solve_job_shop, solve_vrptw | Packing, scheduling, routing |
| Local Search | anneal, tabu_search, lns, alns | TSP, combinatorial optimization |
| Population | evolve, differential_evolution, particle_swarm | Global search, nature-inspired |
| Gradient | gradient_descent, momentum, rmsprop, adam | ML, curve fitting |
| Quasi-Newton | bfgs, lbfgs | Fast convergence, smooth functions |
| Derivative-Free | nelder_mead, powell, bayesian_opt | Black-box, expensive functions |
| Pathfinding | bfs, dfs, dijkstra, astar, astar_grid, bellman_ford, floyd_warshall | Shortest paths, graph traversal |
| Graph | max_flow, min_cost_flow, kruskal, prim, topological_sort, strongly_connected_components, pagerank, louvain, articulation_points, bridges, kcore_decomposition | Flow, MST, dependency, centrality, community detection |
| Assignment | solve_assignment, solve_hungarian, network_simplex | Matching, min-cost flow |
| Exact Cover | solve_exact_cover | N-Queens, tiling puzzles |
| Utilities | FenwickTree, UnionFind | Data structures for algorithms |


Quickstart

uv add solvor

from solvor import solve_lp, solve_tsp, dijkstra, solve_hungarian

# Linear Programming
result = solve_lp(c=[1, 2], A=[[1, 1], [2, 1]], b=[4, 5])
print(result.solution)  # optimal x

# TSP with tabu search
distances = [[0, 10, 15], [10, 0, 20], [15, 20, 0]]
result = solve_tsp(distances)
print(result.solution)  # best tour found

# Shortest path
graph = {'A': [('B', 1), ('C', 4)], 'B': [('C', 2)], 'C': []}
result = dijkstra('A', 'C', lambda n: graph.get(n, []))
print(result.solution)  # ['A', 'B', 'C']

# Assignment
costs = [[10, 5], [3, 9]]
result = solve_hungarian(costs)
print(result.solution)  # [1, 0] - row 0 gets col 1, row 1 gets col 0

Solvors

<details> <summary><strong>Linear & Integer Programming</strong></summary>

solve_lp

For resource allocation, blending, production planning. Finds the exact optimum for linear objectives with linear constraints.

# minimize 2x + 3y subject to x + y >= 4, x <= 3
result = solve_lp(
    c=[2, 3],
    A=[[-1, -1], [1, 0]],  # constraints as Ax <= b
    b=[-4, 3]
)

solve_milp

When some variables must be integers. Diet problems, scheduling with discrete slots, set covering.

# same as above, but x must be integer
result = solve_milp(c=[2, 3], A=[[-1, -1], [1, 0]], b=[-4, 3], integers=[0])

solve_cg

Column generation for problems with exponentially many variables. Cutting stock, bin packing, vehicle routing, crew scheduling.

# Cutting stock: minimize rolls to cut required pieces
result = solve_cg(
    demands=[97, 610, 395, 211],
    roll_width=100,
    piece_sizes=[45, 36, 31, 14],
)
print(result.objective)  # 454 rolls

solve_bp

Branch-and-price for optimal integer solutions. Combines column generation with branch-and-bound for proven optimality.

# Cutting stock with guaranteed integer optimality
result = solve_bp(
    demands=[97, 610, 395, 211],
    roll_width=100,
    piece_sizes=[45, 36, 31, 14],
)
print(result.objective)  # Integer optimal
</details> <details> <summary><strong>Constraint Programming</strong></summary>

solve_sat

For "is this configuration valid?" problems. Dependencies, exclusions, implications - anything that boils down to boolean constraints.

# (x1 OR x2) AND (NOT x1 OR x3) AND (NOT x2 OR NOT x3)
result = solve_sat([[1, 2], [-1, 3], [-2, -3]])
print(result.solution)  # {1: True, 2: False, 3: True}
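
Under the hood this is just clause satisfaction over signed integers. A brute-force sketch of the same convention (illustrative only, not solvOR's solver, which does real SAT search) makes the encoding concrete:

```python
from itertools import product

def brute_force_sat(clauses):
    """Enumerate every assignment; positive literal = variable True,
    negative literal = variable False (same convention as solve_sat)."""
    vars_ = sorted({abs(lit) for clause in clauses for lit in clause})
    for values in product([False, True], repeat=len(vars_)):
        assign = dict(zip(vars_, values))
        # a clause is satisfied if any of its literals is satisfied
        if all(any(assign[abs(lit)] == (lit > 0) for lit in clause)
               for clause in clauses):
            return assign
    return None

model = brute_force_sat([[1, 2], [-1, 3], [-2, -3]])
```

This may find a different satisfying assignment than a SAT solver does; satisfiable instances often have several.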

Model

Constraint programming for puzzles and scheduling with "all different", arithmetic, and logical constraints. Sudoku, N-Queens, timetabling.

m = Model()
cells = [[m.int_var(1, 9, f'c{i}{j}') for j in range(9)] for i in range(9)]

# All different in each row
for row in cells:
    m.add(m.all_different(row))

result = m.solve()

</details> <details> <summary><strong>Metaheuristics</strong></summary>

anneal

Simulated annealing, sometimes you have to go downhill to find a higher peak.

result = anneal(
    initial=initial_solution,
    objective_fn=cost_function,
    neighbors=random_neighbor,
    temperature=1000,
    cooling=0.9995
)
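
The acceptance rule is the whole trick: take a worse move with probability exp(-delta / T) and shrink T each step. A minimal self-contained sketch of the idea (not solvOR's implementation; the toy objective, step size, and seed are made up for the demo):

```python
import math
import random

def anneal_sketch(initial, objective_fn, neighbors,
                  temperature=1000.0, cooling=0.9995, min_temp=1e-3):
    """Minimal simulated annealing (minimizing): accept worse moves with
    probability exp(-delta / T) so the search can escape local optima."""
    current = best = initial
    cur_cost = best_cost = objective_fn(initial)
    t = temperature
    while t > min_temp:
        cand = neighbors(current)           # one random neighbor
        delta = objective_fn(cand) - cur_cost
        if delta <= 0 or random.random() < math.exp(-delta / t):
            current, cur_cost = cand, cur_cost + delta
            if cur_cost < best_cost:
                best, best_cost = current, cur_cost
        t *= cooling                        # geometric cooling schedule
    return best, best_cost

# toy demo: minimize (x - 3)^2 with a random-step neighbor
random.seed(1)
best, cost = anneal_sketch(0.0, lambda x: (x - 3) ** 2,
                           lambda x: x + random.uniform(-1, 1))
```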

tabu_search

Greedy local search with a "don't go back there" list. Simple but surprisingly effective.

result = tabu_search(
    initial=initial_solution,
    objective_fn=cost_function,
    neighbors=get_neighbors,  # returns [(move, solution), ...]
    cooldown=10
)
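
The mechanics fit on one page: always take the best non-tabu neighbor, even if it's worse, and remember recent solutions for `cooldown` iterations. A hedged sketch of the idea (the toy move encoding is invented, not solvOR's API):

```python
from collections import deque

def tabu_sketch(initial, objective_fn, neighbors, cooldown=10, max_iter=200):
    """Minimal tabu search: take the best non-tabu neighbor each step,
    keeping recently visited solutions on a fixed-length tabu list."""
    current = best = initial
    best_cost = objective_fn(initial)
    tabu = deque(maxlen=cooldown)           # the "don't go back there" list
    for _ in range(max_iter):
        candidates = [(objective_fn(sol), move, sol)
                      for move, sol in neighbors(current)
                      if sol not in tabu]
        if not candidates:
            break
        cost, move, current = min(candidates, key=lambda c: c[0])
        tabu.append(current)
        if cost < best_cost:
            best, best_cost = current, cost
    return best, best_cost

# toy: minimize |x - 7| over integers with +1/-1 moves
best, cost = tabu_sketch(0, lambda x: abs(x - 7),
                         lambda x: [("+1", x + 1), ("-1", x - 1)])
```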

lns / alns

Large Neighborhood Search, destroy part of your solution and rebuild it better. ALNS learns which operators work best.

result = lns(initial, objective_fn, destroy_ops, repair_ops, max_iter=1000)
result = alns(initial, objective_fn, destroy_ops, repair_ops, max_iter=1000)  # adaptive
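
The loop itself is tiny; all the craft lives in the destroy and repair operators. A minimal sketch on a toy problem (the target-vector example and helper names are invented for illustration):

```python
import random

def lns_sketch(initial, objective_fn, destroy, repair, max_iter=100, seed=0):
    """Minimal LNS: destroy part of the incumbent, repair it,
    and keep the candidate only if it improves."""
    rng = random.Random(seed)
    best = initial
    best_cost = objective_fn(best)
    for _ in range(max_iter):
        cand = repair(destroy(best, rng), rng)
        cost = objective_fn(cand)
        if cost < best_cost:
            best, best_cost = cand, cost
    return best, best_cost

# toy: match a target vector, cost = squared error
target = [3, 1, 4, 1, 5]

def destroy(sol, rng):
    i = rng.randrange(len(sol))
    return sol[:i] + [None] + sol[i + 1:]   # knock out one position

def repair(sol, rng):
    # greedy repair: best digit (0-9) for each destroyed slot
    return [min(range(10), key=lambda v: (v - t) ** 2) if x is None else x
            for x, t in zip(sol, target)]

best, cost = lns_sketch([0] * 5,
                        lambda s: sum((x - t) ** 2 for x, t in zip(s, target)),
                        destroy, repair)
```

ALNS adds one more ingredient: track how often each destroy/repair pair improves the incumbent and sample the successful ones more often.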

evolve / differential_evolution / particle_swarm

Population-based global search. Let a swarm of candidates explore for you. DE and PSO work on continuous spaces.

result = evolve(objective_fn=fitness, population=pop, crossover=cx, mutate=mut)
result = differential_evolution(objective_fn, bounds=[(0, 10)] * n, population_size=50)
result = particle_swarm(objective_fn, bounds=[(0, 10)] * n, n_particles=30)
</details> <details> <summary><strong>Continuous Optimization</strong></summary>

Gradient-Based

gradient_descent, momentum, rmsprop, adam - follow the slope. Adam adapts learning rates per parameter.

def grad_fn(x):
    return [2 * x[0], 2 * x[1]]  # gradient of x^2 + y^2

result = adam(grad_fn, x0=[5.0, 5.0])
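
For intuition, the Adam update rule is short enough to write out by hand. A hedged sketch (hyperparameters are the usual textbook defaults, not necessarily solvOR's):

```python
import math

def adam_sketch(grad_fn, x0, lr=0.1, beta1=0.9, beta2=0.999,
                eps=1e-8, steps=500):
    """Minimal Adam: exponential moving averages of the gradient (m)
    and squared gradient (v), with bias correction."""
    x = list(x0)
    m = [0.0] * len(x)
    v = [0.0] * len(x)
    for t in range(1, steps + 1):
        g = grad_fn(x)
        for i in range(len(x)):
            m[i] = beta1 * m[i] + (1 - beta1) * g[i]
            v[i] = beta2 * v[i] + (1 - beta2) * g[i] ** 2
            m_hat = m[i] / (1 - beta1 ** t)     # bias-corrected moments
            v_hat = v[i] / (1 - beta2 ** t)
            x[i] -= lr * m_hat / (math.sqrt(v_hat) + eps)
    return x

# minimize x^2 + y^2 starting from (5, 5)
x = adam_sketch(lambda x: [2 * x[0], 2 * x[1]], [5.0, 5.0])
```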

Quasi-Newton

bfgs, lbfgs - approximate Hessian for faster convergence. L-BFGS uses limited memory.

result = bfgs(objective_fn, grad_fn, x0=[5.0, 5.0])
result = lbfgs(objective_fn, grad_fn, x0=[5.0, 5.0], memory=10)

Derivative-Free

nelder_mead, powell - no gradients needed. Good for noisy, non-smooth, or "I have no idea what this function looks like" situations.

result = nelder_mead(objective_fn, x0=[5.0, 5.0])
result = powell(objective_fn, x0=[5.0, 5.0])

bayesian_opt

When each evaluation is expensive. Builds a surrogate model to sample intelligently. Perfect for hyperparameter tuning or when your simulation takes 10 minutes per run.

result = bayesian_opt(expensive_fn, bounds=[(0, 1), (0, 1)], max_iter=30)
</details> <details> <summary><strong>Pathfinding</strong></summary>

bfs / dfs

Unweighted graph traversal. BFS finds shortest paths (fewest edges), DFS explores depth-first. Both work with any state type and neighbor function.

# Find shortest path in a grid
def neighbors(pos):
    x, y = pos
    return [(x+1, y), (x-1, y), (x, y+1), (x, y-1)]

result = bfs(start=(0, 0), goal=(5, 5), neighbors=neighbors)
print(result.solution)  # path from start to goal
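
BFS's shortest-path guarantee comes from expanding nodes in order of depth; the core is a queue plus a parent map. A self-contained sketch of the idea (not solvOR's code):

```python
from collections import deque

def bfs_sketch(start, goal, neighbors):
    """Minimal BFS: the first time the goal is reached,
    the path uses the fewest edges."""
    parent = {start: None}
    queue = deque([start])
    while queue:
        node = queue.popleft()
        if node == goal:
            path = []
            while node is not None:        # walk parents back to start
                path.append(node)
                node = parent[node]
            return path[::-1]
        for nxt in neighbors(node):
            if nxt not in parent:          # parent map doubles as visited set
                parent[nxt] = node
                queue.append(nxt)
    return None

# right/down moves on a grid
path = bfs_sketch((0, 0), (2, 2),
                  lambda p: [(p[0] + 1, p[1]), (p[0], p[1] + 1)])
```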

dijkstra

Weighted shortest paths. Classic algorithm for when edges have costs. Stops early when goal is found.

def neighbors(node):
    return graph[node]  # returns [(neighbor, cost), ...]

result = dijkstra(start='A', goal='Z', neighbors=neighbors)
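
The invariant that makes it work: with non-negative weights, the first time a node is popped from the priority queue, its distance is final. A minimal heapq sketch (illustrative, not solvOR's implementation):

```python
import heapq

def dijkstra_sketch(start, goal, neighbors):
    """Minimal Dijkstra: pop the closest unsettled node; the first time
    the goal comes off the heap, its distance is optimal."""
    dist = {start: 0}
    parent = {start: None}
    heap = [(0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == goal:
            path = []
            while node is not None:
                path.append(node)
                node = parent[node]
            return d, path[::-1]
        if d > dist[node]:
            continue                        # stale heap entry, skip
        for nxt, w in neighbors(node):
            nd = d + w
            if nd < dist.get(nxt, float("inf")):
                dist[nxt] = nd
                parent[nxt] = node
                heapq.heappush(heap, (nd, nxt))
    return float("inf"), None

graph = {'A': [('B', 1), ('C', 4)], 'B': [('C', 2)], 'C': []}
d, path = dijkstra_sketch('A', 'C', lambda n: graph.get(n, []))
```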

astar / astar_grid

A* with heuristics. Faster than Dijkstra when you have a good distance estimate. astar_grid handles 2D grids with built-in heuristics.

# Grid pathfinding with obstacles
grid = [[0, 0, 1, 0], [0, 0, 1, 0], [0, 0, 0, 0]]
result = astar_grid(grid, start=(0, 0), goal=(2, 3))
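
A* is Dijkstra plus an admissible estimate of the remaining distance; on 4-connected grids, Manhattan distance does the job. A sketch of the grid variant (illustrative only, not the library's code):

```python
import heapq

def astar_grid_sketch(grid, start, goal):
    """Minimal A* on a 0/1 grid (1 = wall), 4-way moves,
    Manhattan distance as the heuristic."""
    rows, cols = len(grid), len(grid[0])

    def h(p):
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])

    g = {start: 0}
    parent = {start: None}
    heap = [(h(start), start)]              # priority = g + h
    while heap:
        _, node = heapq.heappop(heap)
        if node == goal:
            path = []
            while node is not None:
                path.append(node)
                node = parent[node]
            return path[::-1]
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g[node] + 1
                if ng < g.get((nr, nc), float("inf")):
                    g[(nr, nc)] = ng
                    parent[(nr, nc)] = node
                    heapq.heappush(heap, (ng + h((nr, nc)), (nr, nc)))
    return None

grid = [[0, 0, 1, 0], [0, 0, 1, 0], [0, 0, 0, 0]]
path = astar_grid_sketch(grid, (0, 0), (2, 3))
```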

bellman_ford

Handles negative edge weights, which Dijkstra can't. Slower, but it also detects negative cycles.

result = bellman_ford(start=0, edges=[(0, 1, 5), (1, 2, -3), ...], n_nodes=4)
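
The algorithm is just "relax every edge n-1 times"; one extra pass that still finds an improvement proves a negative cycle is reachable. A compact sketch (the edge-list format mirrors the call above):

```python
def bellman_ford_sketch(start, edges, n_nodes):
    """Minimal Bellman-Ford: n-1 rounds of edge relaxation,
    plus one extra round for negative-cycle detection."""
    INF = float("inf")
    dist = [INF] * n_nodes
    dist[start] = 0
    for _ in range(n_nodes - 1):
        for u, v, w in edges:
            if dist[u] + w < dist[v]:
                dist[v] = dist[u] + w
    # if any edge still improves a distance, a negative cycle exists
    has_negative_cycle = any(dist[u] + w < dist[v] for u, v, w in edges)
    return dist, has_negative_cycle

dist, neg = bellman_ford_sketch(0, [(0, 1, 5), (1, 2, -3), (2, 3, 2)], 4)
```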

floyd_warshall

All-pairs shortest paths. O(n³) but gives you every shortest path at once. Worth it when you need the full picture.

result = floyd_warshall(n_nodes=4, edges=[(0, 1, 3), (1, 2, 1), ...])
# result.solution[i][j] = shortest distance from i to j
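
The dynamic program behind it: allow node k as an intermediate stop and check whether i → k → j beats the best known i → j. Three nested loops, nothing else (a sketch of the textbook algorithm, not solvOR's code):

```python
def floyd_warshall_sketch(n_nodes, edges):
    """Minimal Floyd-Warshall: d[i][j] = shortest distance using only
    intermediate nodes 0..k, grown one k at a time."""
    INF = float("inf")
    d = [[0 if i == j else INF for j in range(n_nodes)]
         for i in range(n_nodes)]
    for u, v, w in edges:
        d[u][v] = min(d[u][v], w)           # keep the cheapest parallel edge
    for k in range(n_nodes):
        for i in range(n_nodes):
            for j in range(n_nodes):
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]
    return d

d = floyd_warshall_sketch(4, [(0, 1, 3), (1, 2, 1), (2, 3, 7), (0, 2, 5)])
```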
</details> <details> <summary><strong>Network Flow & MST</strong></summary>

max_flow

"How much can I push through this network?" Assigning workers to tasks, finding bottlenecks.

graph = {
    's': [('a', 10, 0), ('b', 5, 0)],
    'a': [('b', 15, 0), ('t', 10, 0)],
    'b': [('t', 10, 0)],
    't': []
}
result = max_flow(graph, 's', 't')
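
The classic way to compute this is augmenting paths on a residual graph. A minimal Edmonds-Karp sketch on the same network, using a plain nested-dict capacity format rather than the (neighbor, capacity, 0) triples shown above:

```python
from collections import deque

def max_flow_sketch(capacity, source, sink):
    """Minimal Edmonds-Karp: BFS for the shortest augmenting path in the
    residual graph, push the bottleneck, repeat until no path remains."""
    # residual capacities, with reverse edges initialized to 0
    res = {u: dict(nbrs) for u, nbrs in capacity.items()}
    for u, nbrs in capacity.items():
        for v in nbrs:
            res.setdefault(v, {}).setdefault(u, 0)
    flow = 0
    while True:
        parent = {source: None}
        queue = deque([source])
        while queue and sink not in parent:
            u = queue.popleft()
            for v, c in res[u].items():
                if c > 0 and v not in parent:
                    parent[v] = u
                    queue.append(v)
        if sink not in parent:
            return flow                     # no augmenting path left
        path, v = [], sink
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        push = min(res[u][v] for u, v in path)
        for u, v in path:
            res[u][v] -= push
            res[v][u] += push               # allow undoing flow later
        flow += push

caps = {'s': {'a': 10, 'b': 5}, 'a': {'b': 15, 't': 10},
        'b': {'t': 10}, 't': {}}
flow = max_flow_sketch(caps, 's', 't')
```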

min_cost_flow / network_simplex

"What's the cheapest way to route X units?" Use `min_cost_f
