occ
occ::mults::TrustRegion Class Reference

Trust Region Newton optimizer with Steihaug-CG subproblem solver. More...

#include <trust_region.h>

Public Types

using Objective = std::function< std::pair< double, Vec >(const Vec &)>
 Objective function: returns (value, gradient)
 
using HessianFunc = std::function< Mat(const Vec &)>
 Hessian function: returns Hessian matrix.
 
using HessianVectorProduct = std::function< Vec(const Vec &, const Vec &)>
 Hessian-vector product: returns H*v (more efficient for large problems)
 
using Callback = std::function< bool(int, const Vec &, double, const Vec &)>
 Iteration callback: (iteration, x, value, gradient) -> continue?
 
using GradObjective = std::function< double(const Vec &, Vec &)>
 Gradient-only objective: f(x, grad) -> energy (same signature as LBFGS/MSTMIN)
 

Public Member Functions

 TrustRegion (const TrustRegionSettings &settings={})
 
TrustRegionResult minimize (Objective objective, HessianFunc hessian, const Vec &x0, Callback callback=nullptr)
 Minimize using explicit Hessian matrix.
 
TrustRegionResult minimize_hvp (Objective objective, HessianVectorProduct hvp, const Vec &x0)
 Minimize using Hessian-vector products (matrix-free).
 
TrustRegionResult minimize_bfgs (GradObjective f, const Vec &x0, Callback callback=nullptr)
 Minimize using BFGS Hessian approximation (gradient-only).
 

Detailed Description

Trust Region Newton optimizer with Steihaug-CG subproblem solver.

This is a robust second-order optimization method that:

  • Naturally handles indefinite Hessians (negative curvature)
  • Uses analytical Hessians when available
  • Solves the trust region subproblem using Steihaug's truncated CG

The trust region subproblem is: min_p g^T p + 0.5 p^T H p subject to ||p|| <= delta

Usage:

TrustRegion optimizer(settings);
auto result = optimizer.minimize(objective, hessian, x0);

Member Typedef Documentation

◆ Callback

using occ::mults::TrustRegion::Callback = std::function<bool(int, const Vec&, double, const Vec&)>

Iteration callback: (iteration, x, value, gradient) -> continue?

◆ GradObjective

using occ::mults::TrustRegion::GradObjective = std::function<double(const Vec&, Vec&)>

Gradient-only objective: f(x, grad) -> energy (same signature as LBFGS/MSTMIN)

◆ HessianFunc

using occ::mults::TrustRegion::HessianFunc = std::function<Mat(const Vec&)>

Hessian function: returns Hessian matrix.

◆ HessianVectorProduct

using occ::mults::TrustRegion::HessianVectorProduct = std::function<Vec(const Vec&, const Vec&)>

Hessian-vector product: returns H*v (more efficient for large problems)

◆ Objective

using occ::mults::TrustRegion::Objective = std::function<std::pair<double, Vec>(const Vec&)>

Objective function: returns (value, gradient)

Constructor & Destructor Documentation

◆ TrustRegion()

occ::mults::TrustRegion::TrustRegion ( const TrustRegionSettings &  settings = {} )
explicit

Member Function Documentation

◆ minimize()

TrustRegionResult occ::mults::TrustRegion::minimize ( Objective  objective,
HessianFunc  hessian,
const Vec &  x0,
Callback  callback = nullptr 
)

Minimize using explicit Hessian matrix.

Parameters
objective - Function returning (value, gradient)
hessian - Function returning the Hessian matrix
x0 - Initial parameters
callback - Optional iteration callback (return false to stop)
Returns
Optimization result

◆ minimize_bfgs()

TrustRegionResult occ::mults::TrustRegion::minimize_bfgs ( GradObjective  f,
const Vec &  x0,
Callback  callback = nullptr 
)

Minimize using BFGS Hessian approximation (gradient-only).

Starts from the identity Hessian and applies a BFGS rank-2 update after each accepted step. No analytic Hessian is needed; the interface matches LBFGS.

Parameters
f - Gradient-only objective: f(x, grad) -> energy
x0 - Initial parameters
callback - Optional iteration callback
Returns
Optimization result

◆ minimize_hvp()

TrustRegionResult occ::mults::TrustRegion::minimize_hvp ( Objective  objective,
HessianVectorProduct  hvp,
const Vec &  x0 
)

Minimize using Hessian-vector products (matrix-free).

Parameters
objective - Function returning (value, gradient)
hvp - Function returning H*v given x and v
x0 - Initial parameters
Returns
Optimization result

The documentation for this class was generated from the following file:

  • trust_region.h