NonSmoothSolvers
Documentation for NonSmoothSolvers.
- NonSmoothSolvers.GradientSampling
- NonSmoothSolvers.OptimizationState
- NonSmoothSolvers.OptimizerParams
- NonSmoothSolvers.VUbundle
- NonSmoothSolvers.VUbundleState
- NonSmoothSolvers.linesearch_nsbfgs
- NonSmoothSolvers.optimize!
- NonSmoothSolvers.qNewtonupdate!
NonSmoothSolvers.OptimizerParams — Type

Generic parameters for optimization algorithms.
NonSmoothSolvers.OptimizationState — Type

Stores information after one iteration of the optimizer. Generic information is stored explicitly in the struct; custom information may be stored in the field additionalinfo::NamedTuple.
NonSmoothSolvers.optimize! — Function

optimize!(state, pb, optimizer, optparams, tracestrategy, inittrace) -> Tuple{Any, DataStructures.Deque}

Call the optimizer on problem pb, with initial point initial_x. Returns a tuple containing the final iterate vector and a trace.
Features:
- timing of the update_iterate method only;
- saves basic information about each iteration in a vector of OptimizationState, the so-called trace;
- the information saved at each iterate may be enriched by the user by providing a name and callback function via the optimstate_extension argument.
Example

```julia
using DataStructures  # provides OrderedDict

getx(o, os) = os.x
optimstate_extensions = OrderedDict{Symbol, Function}(:x => getx)
optimize!(pb, o, xclose; optparams, optimstate_extensions)
```
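Assuming the value returned by each extension callback is recorded under its key in the additionalinfo field of the corresponding OptimizationState (see the type above), the returned trace can then be post-processed. A minimal sketch, reusing the placeholder names pb, o, xclose and optparams from the example:

```julia
# Hedged sketch: the exact layout of the trace entries should be checked against
# the OptimizationState docstring; `pb`, `o`, `xclose`, `optparams` are the
# placeholders from the example above.
xfinal, trace = optimize!(pb, o, xclose; optparams, optimstate_extensions)
xs = [os.additionalinfo.x for os in trace]   # per-iteration values recorded by the :x extension
```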
Gradient sampling
NonSmoothSolvers.GradientSampling — Type

Gradient sampling algorithm.
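The docstring is terse, so the sketch below illustrates the gradient sampling idea on a toy function. It does not use the package's GradientSampling type or its actual update rules: gradients are sampled in a small box around the current iterate, an approximately minimum-norm element of their convex hull is computed with a few Frank-Wolfe steps, and that element gives a descent direction for a backtracking step.

```julia
# Self-contained, illustrative sketch of gradient sampling (not the package code).
using LinearAlgebra

f(x) = maximum(abs, x)                       # simple nonsmooth test function
function gradf(x)                            # gradient of f, defined almost everywhere
    i = argmax(abs.(x))
    g = zeros(length(x))
    g[i] = sign(x[i])
    return g
end

# Approximate the minimum-norm point of conv{columns of G} with Frank-Wolfe steps.
function minnorm_convexhull(G; iters = 200)
    m = size(G, 2)
    λ = fill(1 / m, m)
    for k in 1:iters
        grad = G' * (G * λ)                  # gradient of λ ↦ ½‖Gλ‖²
        i = argmin(grad)                     # best vertex of the simplex
        γ = 2 / (k + 2)
        λ .*= 1 - γ
        λ[i] += γ
    end
    return G * λ
end

function gradient_sampling_step(x; ε = 0.1, nsamples = 2length(x) + 1)
    n = length(x)
    samples = (x .+ ε .* (2 .* rand(n) .- 1) for _ in 1:nsamples-1)
    G = hcat(gradf(x), (gradf(y) for y in samples)...)
    g = minnorm_convexhull(G)
    norm(g) < 1e-8 && return x               # approximately ε-stationary: stop moving
    d = -g / norm(g)
    t = 1.0                                  # simple Armijo-style backtracking
    while f(x + t * d) > f(x) - 1e-4 * t * norm(g) && t > 1e-12
        t /= 2
    end
    return x + t * d
end

x = [1.0, -0.7]
for _ in 1:50
    global x = gradient_sampling_step(x)     # `global` needed in a top-level script loop
end
@show x f(x)
```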
Nonsmooth BFGS
NonSmoothSolvers.linesearch_nsbfgs — Function

linesearch_nsbfgs

Nonsmooth linesearch from "Nonsmooth optimization via quasi-Newton methods", Lewis & Overton, 2013.
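As a rough illustration of the kind of linesearch used there (not the package's linesearch_nsbfgs itself), the sketch below implements a bisection/expansion linesearch enforcing the Armijo and weak Wolfe conditions, in the spirit of Lewis & Overton (2013); the functions f and grad and the names x0, d are assumptions for the example.

```julia
using LinearAlgebra

# Hedged sketch of a weak-Wolfe bisection linesearch; `f` and `grad` are
# user-supplied, with `grad` defined almost everywhere for nonsmooth `f`.
function weak_wolfe_linesearch(f, grad, x, d; c1 = 1e-4, c2 = 0.9, maxit = 50)
    f0 = f(x)
    slope0 = dot(grad(x), d)                   # should be negative for a descent direction
    a, b, t = 0.0, Inf, 1.0                    # bracket [a, b] and trial step t
    for _ in 1:maxit
        xt = x .+ t .* d
        if f(xt) > f0 + c1 * t * slope0        # sufficient decrease fails: t too large
            b = t
        elseif dot(grad(xt), d) < c2 * slope0  # weak Wolfe fails: t too small
            a = t
        else
            return t                           # both conditions hold
        end
        t = isinf(b) ? 2a : (a + b) / 2        # expand or bisect the bracket
    end
    return t                                   # give up after maxit trials
end

# Toy usage on a nonsmooth function.
f(x) = abs(x[1]) + 0.5 * x[2]^2
grad(x) = [sign(x[1]), x[2]]
x0 = [1.0, 2.0]
d = -grad(x0)
t = weak_wolfe_linesearch(f, grad, x0, d)
@show t f(x0 .+ t .* d)
```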