Comprehensive, rigorously tested collection of nonlinear optimization test functions for Julia.
Over 200 standard benchmark problems with analytical gradients, known global minima, bounds, and detailed mathematical properties.
Perfect for developers of global optimizers, local solvers, derivative-free methods, metaheuristics and gradient-based algorithms.
- Analytical gradients verified against ForwardDiff and Zygote
- Rich metadata: global minimum, recommended starting points, bounds, modality, convexity, separability, etc.
- Domain-safe box-constraint wrapper (`with_box_constraints`)
- Full compatibility with Optim.jl, NLopt.jl, GalacticOptim.jl, BlackBoxOptim.jl
- More than 350 automated tests covering edge cases, high-precision arithmetic and gradient accuracy
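To illustrate the kind of gradient verification the feature list describes, here is a standalone Julia sketch (it does not use the package or ForwardDiff/Zygote): the 2-D Rosenbrock function with its analytical gradient, checked against a central finite-difference approximation.

```julia
# Standalone sketch, not the package API: 2-D Rosenbrock and its
# analytical gradient, compared against central finite differences.
rosenbrock(x) = (1 - x[1])^2 + 100 * (x[2] - x[1]^2)^2

rosenbrock_grad(x) = [
    -2 * (1 - x[1]) - 400 * x[1] * (x[2] - x[1]^2),
    200 * (x[2] - x[1]^2),
]

# Central finite differences for comparison.
function fd_grad(f, x; h = 1e-6)
    map(eachindex(x)) do i
        e = zeros(length(x)); e[i] = h
        (f(x + e) - f(x - e)) / (2h)
    end
end

x = [-1.2, 1.0]  # classic Rosenbrock starting point
maximum(abs.(rosenbrock_grad(x) .- fd_grad(rosenbrock, x)))  # tiny discretization error
```

The package automates this comparison (against automatic differentiation rather than finite differences) across its full test suite.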
```julia
using Pkg
Pkg.add("NonlinearOptimizationTestFunctions")
```
```julia
using NonlinearOptimizationTestFunctions, Optim

tf = ROSENBROCK_FUNCTION
result = optimize(tf.f, tf.grad, start(tf), LBFGS())
println("Rosenbrock minimum: ", minimum(result))
```
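The feature list also mentions a domain-safe box-constraint wrapper, `with_box_constraints`. Its exact signature is not shown here, so the following is only a generic sketch of the underlying idea, with an assumed standalone helper (`boxed`, a hypothetical name): clamp each coordinate into its bounds before evaluating the objective, so solvers cannot step outside the domain.

```julia
# Illustrative only; the package's with_box_constraints may differ
# in signature and behavior. `boxed` is a hypothetical helper that
# clamps each coordinate into [lb, ub] before evaluating f.
boxed(f, lb, ub) = x -> f(clamp.(x, lb, ub))

sphere(x) = sum(abs2, x)
g = boxed(sphere, [-1.0, -1.0], [1.0, 1.0])
g([2.0, 0.5])  # evaluates sphere at [1.0, 0.5]
```

See the package documentation below for the wrapper's actual interface.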
Complete manual, full function reference, detailed examples, properties, testing strategy and roadmap:
https://uwealex.github.io/NonlinearOptimizationTestFunctions.jl
Legacy documentation (pre-2025) is archived here: Legacy docs